Sample records for high statistics study

  1. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  2. Towards sound epistemological foundations of statistical methods for high-dimensional biology.

    PubMed

    Mehta, Tapan; Tanik, Murat; Allison, David B

    2004-09-01

    A sound epistemological foundation for biological inquiry comes, in part, from application of valid statistical procedures. This tenet is widely appreciated by scientists studying the new realm of high-dimensional biology, or 'omic' research, which involves multiplicity at unprecedented scales. Many papers aimed at the high-dimensional biology community describe the development or application of statistical techniques. The validity of many of these is questionable, and a shared understanding about the epistemological foundations of the statistical methods themselves seems to be lacking. Here we offer a framework in which the epistemological foundation of proposed statistical methods can be evaluated.

  3. A Framework for Assessing High School Students' Statistical Reasoning.

    PubMed

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  4. A Framework for Assessing High School Students' Statistical Reasoning

    PubMed Central

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students’ statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students’ statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework’s cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments. PMID:27812091

  5. Selected Streamflow Statistics and Regression Equations for Predicting Statistics at Stream Locations in Monroe County, Pennsylvania

    USGS Publications Warehouse

    Thompson, Ronald E.; Hoffman, Scott A.

    2006-01-01

A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations, indexed to concurrent daily mean flows at continuous-record stations, during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology used to develop the regression equations was originally devised for estimating low-flow frequencies. This study and a companion study found that the methodology also has application potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination (R2) than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and of the prediction equations. Caution is indicated in using the predicted statistics for small drainage areas. Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
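The log-space regression approach the abstract describes reduces, in the single-predictor case, to fitting a power law between a flow statistic and a basin characteristic. A sketch with hypothetical drainage areas and low-flow values (not the report's stations or equations):

```python
import numpy as np

# Hypothetical gaged basins: drainage area (sq mi) and 7-day, 10-year low flow (cfs).
# Regressing log-transformed statistics on log-transformed basin characteristics
# makes the fitted model a power law: Q = a * A^b.
area = np.array([12.0, 25.0, 48.0, 90.0, 150.0, 310.0])
q7_10 = np.array([1.1, 2.6, 5.2, 10.5, 18.0, 39.0])

# np.polyfit returns [slope, intercept] for degree 1
b, log_a = np.polyfit(np.log10(area), np.log10(q7_10), 1)

def predict_q7_10(drainage_area_sq_mi):
    """Predict the low-flow statistic at an ungaged site from drainage area."""
    return 10.0 ** (log_a + b * np.log10(drainage_area_sq_mi))

pred = predict_q7_10(60.0)   # an ungaged location between the gaged basins
```

The actual USGS equations use several GIS-derived basin characteristics, not drainage area alone; this only illustrates the log-log regression form.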

  6. Effectiveness of "Essentials for College Math" as a High School Transitional Course

    ERIC Educational Resources Information Center

    Riggleman, Jennifer S.

    2017-01-01

Statistics on the number of students who leave high school underprepared for postsecondary education and have to take remedial coursework upon entrance to college vary, but, unfortunately, for at least the last 10 years, these statistics have remained high. This study examined the effectiveness of one transitional high school math curriculum…

  7. A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume

    NASA Astrophysics Data System (ADS)

    Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration

    2017-11-01

An accurate representation of sub-grid scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly-resolved instantaneous field measurements in unconfined turbulent plumes useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated by employing a virtual interrogation window (of varying size) applied to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e. statistics at the sub-grid scale) and on interrogation windows (i.e. statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined to achieve converged statistics on the filtered measurements. This criterion was then used to establish the relative importance of large- and small-scale turbulence phenomena while investigating specific scales of the turbulent flow. First-order data sets start to collapse at a resolution of 0.3D*, while for second- and higher-order statistical moments the required interrogation window size drops to 0.2D*.

  8. Response properties of ON-OFF retinal ganglion cells to high-order stimulus statistics.

    PubMed

    Xiao, Lei; Gong, Han-Yan; Gong, Hai-Qing; Liang, Pei-Ji; Zhang, Pu-Ming

    2014-10-17

    The visual stimulus statistics are the fundamental parameters to provide the reference for studying visual coding rules. In this study, the multi-electrode extracellular recording experiments were designed and implemented on bullfrog retinal ganglion cells to explore the neural response properties to the changes in stimulus statistics. The changes in low-order stimulus statistics, such as intensity and contrast, were clearly reflected in the neuronal firing rate. However, it was difficult to distinguish the changes in high-order statistics, such as skewness and kurtosis, only based on the neuronal firing rate. The neuronal temporal filtering and sensitivity characteristics were further analyzed. We observed that the peak-to-peak amplitude of the temporal filter and the neuronal sensitivity, which were obtained from either neuronal ON spikes or OFF spikes, could exhibit significant changes when the high-order stimulus statistics were changed. These results indicate that in the retina, the neuronal response properties may be reliable and powerful in carrying some complex and subtle visual information. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  9. Pattern statistics on Markov chains and sensitivity to parameter estimation

    PubMed Central

    Nuel, Grégory

    2006-01-01

Background: In order to compute pattern statistics in computational biology, a Markov model is commonly used to take into account the sequence composition. Usually its parameters must be estimated. The aim of this paper is to determine how sensitive these statistics are to parameter estimation, and what the consequences of this variability are for pattern studies (finding the most over-represented words in a genome, the most significant words common to a set of sequences,...). Results: In the particular case where pattern statistics (overlap counting only) are computed through binomial approximations, we use the delta-method to give an explicit expression of σ, the standard deviation of a pattern statistic. This result is validated using simulations, and a simple pattern study is also considered. Conclusion: We establish that the use of a high-order Markov model could easily lead to major mistakes due to the high sensitivity of pattern statistics to parameter estimation. PMID:17044916

  10. Pattern statistics on Markov chains and sensitivity to parameter estimation.

    PubMed

    Nuel, Grégory

    2006-10-17

In order to compute pattern statistics in computational biology, a Markov model is commonly used to take into account the sequence composition. Usually its parameters must be estimated. The aim of this paper is to determine how sensitive these statistics are to parameter estimation, and what the consequences of this variability are for pattern studies (finding the most over-represented words in a genome, the most significant words common to a set of sequences,...). In the particular case where pattern statistics (overlap counting only) are computed through binomial approximations, we use the delta-method to give an explicit expression of sigma, the standard deviation of a pattern statistic. This result is validated using simulations, and a simple pattern study is also considered. We establish that the use of a high-order Markov model could easily lead to major mistakes due to the high sensitivity of pattern statistics to parameter estimation.
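A back-of-the-envelope sketch of the binomial approximation the abstract refers to, with entirely hypothetical Markov parameters: the expected count and standard deviation of a word under an order-1 model, and how a small error in one estimated transition probability shifts the expected count by many standard deviations.

```python
import math

# Assumed (hypothetical) order-1 Markov parameters for a DNA alphabet.
mu = {'A': 0.3, 'C': 0.2, 'G': 0.2, 'T': 0.3}            # stationary probabilities
a = {('A', 'T'): 0.4, ('T', 'A'): 0.35, ('A', 'A'): 0.25}  # transition probabilities

def word_prob(word):
    """Occurrence probability of a word at one position: mu(w1) * prod a(w_i -> w_{i+1})."""
    p = mu[word[0]]
    for x, y in zip(word, word[1:]):
        p *= a[(x, y)]
    return p

n = 1_000_000          # number of scanned positions
p = word_prob('ATA')   # 0.3 * 0.4 * 0.35 = 0.042
expected = n * p
sigma = math.sqrt(n * p * (1 - p))   # binomial approximation of the count's std dev

# Sensitivity to parameter estimation: a 5% relative error in one transition
# probability propagates multiplicatively into the expected count.
a[('A', 'T')] = 0.4 * 1.05
expected_perturbed = n * word_prob('ATA')
```

Here the perturbed expectation moves by far more than sigma, which is the paper's point: the pattern statistic is much more sensitive to the estimated parameters than to sampling noise.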

  11. Reporting quality of statistical methods in surgical observational studies: protocol for systematic review.

    PubMed

    Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume

    2014-06-28

Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.

  12. Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38

    ERIC Educational Resources Information Center

    Patterson, Brian F.

    2009-01-01

    Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…

  13. Recommendations for describing statistical studies and results in general readership science and engineering journals.

    PubMed

    Gardenier, John S

    2012-12-01

    This paper recommends how authors of statistical studies can communicate to general audiences fully, clearly, and comfortably. The studies may use statistical methods to explore issues in science, engineering, and society or they may address issues in statistics specifically. In either case, readers without explicit statistical training should have no problem understanding the issues, the methods, or the results at a non-technical level. The arguments for those results should be clear, logical, and persuasive. This paper also provides advice for editors of general journals on selecting high quality statistical articles without the need for exceptional work or expense. Finally, readers are also advised to watch out for some common errors or misuses of statistics that can be detected without a technical statistical background.

  14. Relative risk estimates from spatial and space-time scan statistics: Are they biased?

    PubMed Central

    Prates, Marcos O.; Kulldorff, Martin; Assunção, Renato M.

    2014-01-01

The purely spatial and space-time scan statistics have been successfully used by many scientists to detect and evaluate geographical disease clusters. Although the scan statistic has high power in correctly identifying a cluster, no study has considered the estimates of the cluster relative risk in the detected cluster. In this paper we evaluate whether there is any bias in these estimated relative risks. Intuitively, one may expect the estimated relative risks to have an upward bias, since the scan statistic cherry-picks high-rate areas to include in the cluster. We show that this intuition is correct for clusters with low statistical power, but with medium to high power the bias becomes negligible. The same behaviour is not observed for the prospective space-time scan statistic, where there is an increasingly conservative downward bias of the relative risk as the power to detect the cluster increases. PMID:24639031
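The selection effect behind the suspected upward bias can be illustrated with a toy Monte-Carlo simulation. This is a simplified max-ratio scan over independent regions, not Kulldorff's likelihood-ratio scan statistic, and all numbers are hypothetical:

```python
import math
import random

random.seed(7)
n_regions, expected, true_rr = 20, 10.0, 1.5   # low-power scenario (assumed)

def poisson(lam):
    """Knuth's Poisson sampler, to keep the sketch dependency-free."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

rr_estimates = []
for _ in range(2000):
    # Region 0 carries the true cluster; all others have baseline risk.
    counts = [poisson(expected * (true_rr if i == 0 else 1.0))
              for i in range(n_regions)]
    best = max(range(n_regions), key=lambda i: counts[i])
    if best == 0:                       # the "scan" detected the true cluster
        rr_estimates.append(counts[0] / expected)

mean_rr = sum(rr_estimates) / len(rr_estimates)
# Conditioning on detection inflates the estimate: mean_rr exceeds true_rr.
```

The cluster is only reported when its count happens to beat every other region, so the reported relative risk is conditioned on an upward fluctuation, exactly the low-power bias the paper quantifies.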

  15. Local image statistics: maximum-entropy constructions and perceptual salience

    PubMed Central

    Victor, Jonathan D.; Conte, Mary M.

    2012-01-01

    The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics, but also, how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics—including luminance distributions, pair-wise correlations, and higher-order correlations—are explicitly specified and all other statistics are determined implicitly by maximum-entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions. PMID:22751397
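One of the simplest instances of the construction described above is a binary texture with a single specified pairwise statistic: for ±1 pixels with zero mean, a first-order Markov chain along each row with P(repeat) = (1 + ρ)/2 has horizontal nearest-neighbour correlation ρ, and with rows kept independent this is the maximum-entropy texture given that one constraint. A sketch with an assumed target ρ = 0.4 (not the authors' algorithm, which handles many joint constraints):

```python
import random

random.seed(0)
rho = 0.4        # target horizontal pair correlation (assumption)
H, W = 64, 64

p_same = (1 + rho) / 2
img = []
for _ in range(H):
    row = [random.choice((-1, 1))]            # unbiased first pixel
    for _ in range(W - 1):
        # repeat the previous pixel with probability (1 + rho) / 2
        row.append(row[-1] if random.random() < p_same else -row[-1])
    img.append(row)

# Empirical check: average product of horizontal neighbours estimates rho.
pairs = [(img[r][c], img[r][c + 1]) for r in range(H) for c in range(W - 1)]
est_rho = sum(x * y for x, y in pairs) / len(pairs)
```

All statistics not pinned by the constraint (vertical and longer-range correlations) stay at their entropy-maximizing values of zero, which is the defining property of the maximum-entropy construction.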

  16. A spatial scan statistic for nonisotropic two-level risk cluster.

    PubMed

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2012-01-30

    Spatial scan statistic methods are commonly used for geographical disease surveillance and cluster detection. The standard spatial scan statistic does not model any variability in the underlying risks of subregions belonging to a detected cluster. For a multilevel risk cluster, the isotonic spatial scan statistic could model a centralized high-risk kernel in the cluster. Because variations in disease risks are anisotropic owing to different social, economical, or transport factors, the real high-risk kernel will not necessarily take the central place in a whole cluster area. We propose a spatial scan statistic for a nonisotropic two-level risk cluster, which could be used to detect a whole cluster and a noncentralized high-risk kernel within the cluster simultaneously. The performance of the three methods was evaluated through an intensive simulation study. Our proposed nonisotropic two-level method showed better power and geographical precision with two-level risk cluster scenarios, especially for a noncentralized high-risk kernel. Our proposed method is illustrated using the hand-foot-mouth disease data in Pingdu City, Shandong, China in May 2009, compared with two other methods. In this practical study, the nonisotropic two-level method is the only way to precisely detect a high-risk area in a detected whole cluster. Copyright © 2011 John Wiley & Sons, Ltd.

  17. [Notes on vital statistics for the study of perinatal health].

    PubMed

    Juárez, Sol Pía

    2014-01-01

    Vital statistics, published by the National Statistics Institute in Spain, are a highly important source for the study of perinatal health nationwide. However, the process of data collection is not well-known and has implications both for the quality and interpretation of the epidemiological results derived from this source. The aim of this study was to present how the information is collected and some of the associated problems. This study is the result of an analysis of the methodological notes from the National Statistics Institute and first-hand information obtained from hospitals, the Central Civil Registry of Madrid, and the Madrid Institute for Statistics. Greater integration between these institutions is required to improve the quality of birth and stillbirth statistics. Copyright © 2014 SESPAS. Published by Elsevier Espana. All rights reserved.

  18. Monitoring Statistics Which Have Increased Power over a Reduced Time Range.

    ERIC Educational Resources Information Center

    Tang, S. M.; MacNeill, I. B.

    1992-01-01

    The problem of monitoring trends for changes at unknown times is considered. Statistics that permit one to focus high power on a segment of the monitored period are studied. Numerical procedures are developed to compute the null distribution of these statistics. (Author)

  19. Problem Representation and Academic Performance in Statistics

    ERIC Educational Resources Information Center

    Li, Jun

    2014-01-01

    The purpose of this study was to examine the relationship between problem representation and academic performance in statistics. A specially-designed triad judgment task was administered through SurveyMonkey, an online survey service. Participants were 162 high school graduates who took the AP Statistics Examination in the spring of 2013. Results…

  20. Self-Esteem and Academic Achievement of High School Students

    ERIC Educational Resources Information Center

    Moradi Sheykhjan, Tohid; Jabari, Kamran; Rajeswari, K.

    2014-01-01

The primary purpose of this study was to determine the influence of self-esteem on academic achievement among high school students in Miandoab City, Iran. The methodology of the research is descriptive and correlational, and descriptive and inferential statistics were used to analyze the data. The statistical population includes male and female high…

  1. A Comparison of Methods for Estimating the Determinant of High-Dimensional Covariance Matrix.

    PubMed

    Hu, Zongliang; Dong, Kai; Dai, Wenlin; Tong, Tiejun

    2017-09-21

    The determinant of the covariance matrix for high-dimensional data plays an important role in statistical inference and decision. It has many real applications including statistical tests and information theory. Due to the statistical and computational challenges with high dimensionality, little work has been proposed in the literature for estimating the determinant of high-dimensional covariance matrix. In this paper, we estimate the determinant of the covariance matrix using some recent proposals for estimating high-dimensional covariance matrix. Specifically, we consider a total of eight covariance matrix estimation methods for comparison. Through extensive simulation studies, we explore and summarize some interesting comparison results among all compared methods. We also provide practical guidelines based on the sample size, the dimension, and the correlation of the data set for estimating the determinant of high-dimensional covariance matrix. Finally, from a perspective of the loss function, the comparison study in this paper may also serve as a proxy to assess the performance of the covariance matrix estimation.
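The p > n failure mode behind this comparison is easy to demonstrate: the sample covariance matrix is singular, so its log-determinant is unusable, while even a crude shrinkage estimator gives a finite value. The sketch below uses a Ledoit-Wolf-style linear pull toward the identity with an arbitrary hand-picked weight, not any of the paper's eight methods:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 100                      # fewer samples than dimensions
X = rng.standard_normal((n, p))     # true covariance = identity (log-det = 0)

S = np.cov(X, rowvar=False)         # rank <= n - 1 < p, hence singular
# slogdet reports sign 0 for an exactly singular matrix, or a huge negative
# log-determinant when round-off leaves tiny nonzero pivots; either way the
# value is useless for inference.
sign, logdet_sample = np.linalg.slogdet(S)

# Hypothetical fix: linear shrinkage toward the identity with a fixed weight.
alpha = 0.5
S_shrunk = (1 - alpha) * S + alpha * np.eye(p)
_, logdet_shrunk = np.linalg.slogdet(S_shrunk)   # finite and well-defined
```

A real estimator would choose the shrinkage weight from the data (as Ledoit-Wolf does); the point here is only that regularization restores a usable determinant when p exceeds n.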

  2. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences.

    PubMed

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-02-02

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.
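The triplet design can be sketched as follows, using the occurrence probabilities quoted in the abstract (60%/30%/10%) and hypothetical sound labels; the recovered frequencies are what an ideal statistical learner could estimate from the stream:

```python
import random

random.seed(1)
# Given the first two sounds of a triplet, the third is 'A' with high (60%),
# 'B' with intermediate (30%), or 'C' with low (10%) probability.
third_options, third_probs = ['A', 'B', 'C'], [0.60, 0.30, 0.10]

def draw_third():
    """Sample the triplet ending from the assumed transition distribution."""
    u, cum = random.random(), 0.0
    for item, prob in zip(third_options, third_probs):
        cum += prob
        if u < cum:
            return item
    return third_options[-1]

triplets = [('s1', 's2', draw_third()) for _ in range(5000)]
counts = {x: sum(1 for t in triplets if t[2] == x) for x in third_options}
est = {x: counts[x] / len(triplets) for x in third_options}   # recovered probabilities
```

The sMMN result says the brain's mismatch response is graded by exactly these conditional probabilities: the rarer the ending given its context, the larger the negativity.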

  3. High cumulants of conserved charges and their statistical uncertainties

    NASA Astrophysics Data System (ADS)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
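A bootstrap sketch of the point being made: even for a plain Gaussian observable, whose fourth cumulant is truly zero, a finite event sample yields both a fluctuating measured C4 and an estimated statistical uncertainty. The sample size, bootstrap count, and Gaussian observable are illustrative, not the heavy-ion analysis:

```python
import math
import random

random.seed(3)

def c4(xs):
    """Fourth cumulant from central moments: C4 = m4 - 3 * m2^2."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 - 3 * m2 * m2

# Event-by-event observable: standard Gaussian, so the true C4 is 0.
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
observed = c4(sample)

# Bootstrap the statistical uncertainty of the measured cumulant.
boot = []
for _ in range(200):
    resample = [random.choice(sample) for _ in range(len(sample))]
    boot.append(c4(resample))
mean_b = sum(boot) / len(boot)
err = math.sqrt(sum((b - mean_b) ** 2 for b in boot) / (len(boot) - 1))
```

Because the same fluctuations drive both the measured cumulant and its error estimate, the two are correlated, which is the effect the paper studies for high cumulants of conserved charges.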

  4. Factors That Explain the Attitude towards Statistics in High-School Students: Empirical Evidence at Technological Study Center of the Sea in Veracruz, Mexico

    ERIC Educational Resources Information Center

    Rojas-Kramer, Carlos; Limón-Suárez, Enrique; Moreno-García, Elena; García-Santillán, Arturo

    2018-01-01

    The aim of this paper was to analyze attitude towards statistics in high-school students using the SATS scale designed by Auzmendi (1992). The sample was 200 students from the sixth semester of the afternoon shift, who were enrolled in technical careers from the Technological Study Center of the Sea (Centro de Estudios Tecnológicos del Mar 07…

  5. Low statistical power in biomedical science: a review of three human research domains.

    PubMed

    Dumas-Mallet, Estelle; Button, Katherine S; Boraud, Thomas; Gonon, Francois; Munafò, Marcus R

    2017-02-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0-10% or 11-20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation.

  6. Low statistical power in biomedical science: a review of three human research domains

    PubMed Central

    Dumas-Mallet, Estelle; Button, Katherine S.; Boraud, Thomas; Gonon, Francois

    2017-01-01

    Studies with low statistical power increase the likelihood that a statistically significant finding represents a false positive result. We conducted a review of meta-analyses of studies investigating the association of biological, environmental or cognitive parameters with neurological, psychiatric and somatic diseases, excluding treatment studies, in order to estimate the average statistical power across these domains. Taking the effect size indicated by a meta-analysis as the best estimate of the likely true effect size, and assuming a threshold for declaring statistical significance of 5%, we found that approximately 50% of studies have statistical power in the 0–10% or 11–20% range, well below the minimum of 80% that is often considered conventional. Studies with low statistical power appear to be common in the biomedical sciences, at least in the specific subject areas captured by our search strategy. However, we also observe evidence that this depends in part on research methodology, with candidate gene studies showing very low average power and studies using cognitive/behavioural measures showing high average power. This warrants further investigation. PMID:28386409
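The power figures discussed above follow from the standard normal approximation; a small sketch (not tied to the paper's data) that reproduces the conventional 80%-power benchmark and shows how little power small samples give for small effects:

```python
from statistics import NormalDist

_N = NormalDist()

def two_sample_power(d, n_per_group, alpha=0.05):
    """Normal-approximation power of a two-sided two-sample test for
    standardized effect size d (Cohen's d)."""
    z_crit = _N.inv_cdf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5          # noncentrality parameter
    return (1 - _N.cdf(z_crit - ncp)) + _N.cdf(-z_crit - ncp)

def n_for_power(d, target=0.8, alpha=0.05):
    """Smallest per-group n whose approximate power reaches the target."""
    n = 2
    while two_sample_power(d, n, alpha) < target:
        n += 1
    return n
```

For a medium effect (d = 0.5) this recovers the textbook figure of roughly 63-64 subjects per group for 80% power, while a small study of a small effect (d = 0.2, n = 30 per group) sits near the 10-20% power band the review reports as typical.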

  7. The Ironic Effect of Significant Results on the Credibility of Multiple-Study Articles

    ERIC Educational Resources Information Center

    Schimmack, Ulrich

    2012-01-01

    Cohen (1962) pointed out the importance of statistical power for psychology as a science, but statistical power of studies has not increased, while the number of studies in a single article has increased. It has been overlooked that multiple studies with modest power have a high probability of producing nonsignificant results because power…

  8. Factors contributing to academic achievement: a Bayesian structure equation modelling study

    NASA Astrophysics Data System (ADS)

    Payandeh Najafabadi, Amir T.; Omidi Najafabadi, Maryam; Farid-Rohani, Mohammad Reza

    2013-06-01

In Iran, high school graduates enter university after taking a very difficult entrance exam called the Konkoor. Therefore, only the top-performing students are admitted by universities to continue their bachelor's education in statistics. Surprisingly, most of these students statistically fall into the following categories: (1) they do not succeed in their education despite their excellent performance on the Konkoor and in high school; (2) they graduate with a grade point average (GPA) that is considerably lower than their high school GPA; (3) they continue their master's education in majors other than statistics; and (4) they try to find jobs unrelated to statistics. This article employs the well-known and powerful statistical technique of Bayesian structural equation modelling (SEM) to study the academic success of recent graduates who have studied statistics at Shahid Beheshti University in Iran. This research: (i) considered academic success as a latent variable, measured by GPA and other indicators of academic success (see below) in the target population; (ii) employed Bayesian SEM, which works properly for small sample sizes and ordinal variables; (iii) developed, drawing on the literature, five main factors that affect academic success; and (iv) considered several standard psychological tests and measured characteristics such as 'self-esteem' and 'anxiety'. We then studied the impact of such factors on the academic success of the target population. Six factors that positively impact student academic success were identified, in the following order of relative impact (from greatest to least): 'Teaching-Evaluation', 'Learner', 'Environment', 'Family', 'Curriculum' and 'Teaching Knowledge'. Particularly influential variables within each factor have also been noted.

  9. Student and Professor Gender Effects in Introductory Business Statistics

    ERIC Educational Resources Information Center

    Haley, M. Ryan; Johnson, Marianne F.; Kuennen, Eric W.

    2007-01-01

    Studies have yielded highly mixed results as to differences in male and female student performance in statistics courses; the role that professors play in these differences is even less clear. In this paper, we consider the impact of professor and student gender on student performance in an introductory business statistics course taught by…

  10. Black Females in High School: A Statistical Educational Profile

    ERIC Educational Resources Information Center

    Muhammad, Crystal Gafford; Dixson, Adrienne D.

    2008-01-01

    In life as in literature, both the mainstream public and the Black community writ large overlook Black female experiences, both adolescent and adult. In order to contribute to the knowledge base regarding this population, we present through our study a statistical portrait of Black females in high school. To do so, we present an analysis of…

  11. Assessing the Temporal Relationship between Race and Ecstasy Use among High School Seniors.

    ERIC Educational Resources Information Center

    Yacoubian, George S., Jr.

    2002-01-01

    Analyzes data from 10,088 high school seniors surveyed through the Monitoring the Future study between 1996 and 1999. Chi-square statistics are used to explore the temporal relationship between race and the use of ecstasy during this time frame. Statistically significant relationships between race and ecstasy use are discerned. (Contains 46…

  12. Statistical Analysis on the Mechanical Properties of Magnesium Alloys

    PubMed Central

    Liu, Ruoyu; Jiang, Xianquan; Zhang, Hongju; Zhang, Dingfei; Wang, Jingfeng; Pan, Fusheng

    2017-01-01

    Knowledge of the statistical characteristics of mechanical properties is very important for the practical application of structural materials. Unfortunately, the scatter characteristics of the mechanical performance of magnesium alloys have remained poorly understood. In this study, the mechanical reliability of magnesium alloys is systematically estimated using Weibull statistical analysis. Interestingly, the Weibull modulus, m, of strength for magnesium alloys is as high as that for aluminum alloys and steels, confirming the very high reliability of magnesium alloys. The high predictability of the tensile strength of magnesium alloys represents the capability of preventing catastrophic premature failure during service, which is essential for safety and reliability assessment. PMID:29113116
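    The Weibull reliability analysis described in this record has a standard computational form. As an illustrative sketch (not the authors' code), the Weibull modulus m and scale parameter can be estimated by linearizing the two-parameter Weibull CDF, ln(-ln(1-F)) = m·ln(σ) - m·ln(σ0), and fitting a least-squares line to ranked strength data; all data below are hypothetical:

```python
import math

def weibull_fit(strengths):
    """Estimate Weibull modulus m and scale sigma0 from strength data
    via the linearization ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0),
    using median-rank-style plotting positions F_i = (i - 0.5)/n."""
    xs = sorted(strengths)
    n = len(xs)
    pts = []
    for i, s in enumerate(xs, start=1):
        F = (i - 0.5) / n  # plotting position for the i-th ranked strength
        pts.append((math.log(s), math.log(-math.log(1.0 - F))))
    # ordinary least squares for y = m*x + b
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    m = sxy / sxx
    sigma0 = math.exp(mx - my / m)  # from intercept b = -m*ln(sigma0)
    return m, sigma0
```

    A high fitted m indicates low scatter in strength, which is the sense in which the abstract calls magnesium alloys "highly reliable".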

  13. Dietary fat intake and risk of epithelial ovarian cancer: a meta-analysis of 6,689 subjects from 8 observational studies.

    PubMed

    Huncharek, M; Kupelnick, B

    2001-01-01

    The etiology of epithelial ovarian cancer is unknown. Prior work suggests that high dietary fat intake is associated with an increased risk of this tumor, although this association remains speculative. A meta-analysis was performed to evaluate this suspected relationship. Using previously described methods, a protocol was developed for a meta-analysis examining the association between high vs. low dietary fat intake and the risk of epithelial ovarian cancer. Literature search techniques, study inclusion criteria, and statistical procedures were prospectively defined. Data from observational studies were pooled using a general variance-based meta-analytic method employing confidence intervals (CI) previously described by Greenland. The outcome of interest was a summary relative risk (RRs) reflecting the risk of ovarian cancer associated with high vs. low dietary fat intake. Sensitivity analyses were performed when necessary to evaluate any observed statistical heterogeneity. The literature search yielded 8 observational studies enrolling 6,689 subjects. Data were stratified into three dietary fat intake categories: total fat, animal fat, and saturated fat. Initial tests for statistical homogeneity demonstrated that hospital-based studies accounted for observed heterogeneity possibly because of selection bias. Accounting for this, an RRs was calculated for high vs. low total fat intake, yielding a value of 1.24 (95% CI = 1.07-1.43), a statistically significant result. That is, high total fat intake is associated with a 24% increased risk of ovarian cancer development. The RRs for high saturated fat intake was 1.20 (95% CI = 1.04-1.39), suggesting a 20% increased risk of ovarian cancer among subjects with these dietary habits. High vs. low animal fat diet gave an RRs of 1.70 (95% CI = 1.43-2.03), consistent with a statistically significant 70% increased ovarian cancer risk. 
High dietary fat intake appears to represent a significant risk factor for the development of ovarian cancer. The magnitude of this risk associated with total fat and saturated fat is rather modest. Ovarian cancer risk associated with high animal fat intake appears significantly greater than that associated with the other types of fat intake studied, although this requires confirmation via larger analyses. Further work is needed to clarify factors that may modify the effects of dietary fat in vivo.
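    The pooling step this meta-analysis describes (the general variance-based method employing confidence intervals) amounts to an inverse-variance weighted average of log relative risks, with each study's variance recovered from its reported 95% CI. A minimal sketch, using the abstract's total-fat summary figures only as a usage illustration:

```python
import math

def pool_relative_risks(studies, z=1.96):
    """Pool study-level relative risks into a summary RR by the
    inverse-variance (general variance-based) method: each ln(RR) is
    weighted by the reciprocal of its variance, which is recovered
    from the reported 95% confidence interval (lo, hi)."""
    num = den = 0.0
    for rr, lo, hi in studies:
        ln_rr = math.log(rr)
        var = ((math.log(hi) - math.log(lo)) / (2 * z)) ** 2
        w = 1.0 / var
        num += w * ln_rr
        den += w
    pooled = num / den
    se = math.sqrt(1.0 / den)  # standard error of the pooled ln(RR)
    return (math.exp(pooled),
            math.exp(pooled - z * se),
            math.exp(pooled + z * se))
```

    Pooling two studies with identical estimates leaves the summary RR unchanged but narrows its confidence interval, which is the basic payoff of meta-analytic pooling.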

  14. Statistical Design in Isothermal Aging of Polyimide Resins

    NASA Technical Reports Server (NTRS)

    Sutter, James K.; Jobe, Marcus; Crane, Elizabeth A.

    1995-01-01

    Recent developments in research on polyimides for high-temperature applications have led to the synthesis of many new polymers. Among the criteria that determine their thermal oxidative stability, isothermal aging is one of the most important. Isothermal aging studies require that many experimental factors be controlled to provide accurate results. In this article, we describe a statistical plan that compares the isothermal stability of several polyimide resins while minimizing the variations inherent in high-temperature aging studies.

  15. A Psychophysiological Study of Processing HIV/AIDS Public Service Announcements: The Effects of Novelty Appeals, Sexual Appeals, Narrative Versus Statistical Evidence, and Viewer's Sex.

    PubMed

    Zhang, Jueman Mandy; Chen, Gina Masullo; Chock, T Makana; Wang, Yi; Ni, Liqiang; Schweisberger, Valarie

    2016-07-01

    This study used self-reports and physiological measures-heart rate (HR) and skin conductance level (SCL)-to examine the effects of novelty appeals, sexual appeals, narrative versus statistical evidence, and viewer's sex on cognitive and emotional processing of HIV/AIDS public service announcements (PSAs) among heterosexually active single college students. Novelty or sexual appeals differently affected self-reported attention and cognitive effort as measured by HR. High- rather than low-novelty HIV/AIDS PSAs, perceived as more attention-eliciting, did not lead to more cognitive effort. High- rather than low-sex HIV/AIDS PSAs, not perceived as more attention-eliciting, led to more cognitive effort as reflected by greater HR deceleration. Novelty or sexual appeals also affected self-reported emotional arousal and SCL differently. HIV/AIDS PSAs with high rather than low levels of novelty or sexual appeals led to greater self-reported arousal, but not greater SCL. Message evidence interacted with message appeals to affect cognitive effort. Participants exerted greater cognitive effort during high- rather than low-novelty narrative HIV/AIDS PSAs, and during low- rather than high-novelty statistical ones. The advantage of high over low sexual appeals was more obvious in statistical than in narrative HIV/AIDS PSAs. Males reported greater emotional arousal than females during high- rather than low-sex HIV/AIDS PSAs.

  16. High-temperature behavior of a deformed Fermi gas obeying interpolating statistics.

    PubMed

    Algin, Abdullah; Senay, Mustafa

    2012-04-01

    An outstanding idea originally introduced by Greenberg is to investigate whether there is equivalence between intermediate statistics, which may be different from anyonic statistics, and q-deformed particle algebra. Also, a model to be studied for addressing such an idea could possibly provide us some new consequences about the interactions of particles as well as their internal structures. Motivated mainly by this idea, in this work, we consider a q-deformed Fermi gas model whose statistical properties enable us to effectively study interpolating statistics. Starting with a generalized Fermi-Dirac distribution function, we derive several thermostatistical functions of a gas of these deformed fermions in the thermodynamical limit. We study the high-temperature behavior of the system by analyzing the effects of q deformation on the most important thermostatistical characteristics of the system such as the entropy, specific heat, and equation of state. It is shown that such a deformed fermion model in two and three spatial dimensions exhibits the interpolating statistics in a specific interval of the model deformation parameter 0 < q < 1. In particular, for two and three spatial dimensions, it is found from the behavior of the third virial coefficient of the model that the deformation parameter q interpolates completely between attractive and repulsive systems, including the free boson and fermion cases. From the results obtained in this work, we conclude that such a model could provide much physical insight into some interacting theories of fermions, and could be useful to further study the particle systems with intermediate statistics.

  17. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    PubMed

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

    We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of the parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.
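    The proposed statistic itself is not given in the abstract, but one of its comparators, the Wilcoxon signed rank test, can be sketched from scratch for paired expression measurements (normal approximation with tie-averaged ranks; an illustration of the baseline, not the authors' implementation):

```python
import math

def wilcoxon_signed_rank(before, after):
    """Wilcoxon signed rank test (normal approximation), one of the
    baselines the proposed rank statistic is compared against.
    Returns (W+, two-sided p) for paired measurements."""
    diffs = [b - a for b, a in zip(before, after) if b != a]
    n = len(diffs)
    ranked = sorted(diffs, key=abs)
    # assign 1-based ranks by |diff|, averaging ties
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(ranked[j + 1]) == abs(ranked[i]):
            j += 1
        ranks[abs(ranked[i])] = (i + j) / 2 + 1  # average rank of the tie run
        i = j + 1
    w_plus = sum(ranks[abs(d)] for d in diffs if d > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, p
```

    For real gene-expression work one would use a vetted implementation with exact small-sample tables, but the sketch shows the mechanics being compared.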

  18. Network problem threshold

    NASA Technical Reports Server (NTRS)

    Gejji, Raghvendra R.

    1992-01-01

    Network transmission errors such as collisions, CRC errors, misalignment, etc. are statistical in nature. Although errors can vary randomly, a high level of errors does indicate specific network problems, e.g. equipment failure. In this project, we have studied the random nature of collisions theoretically as well as by gathering statistics, and established a numerical threshold above which a network problem is indicated with high probability.
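    One simple way to formalize such a threshold, assuming per-interval collision counts are roughly Poisson (an assumption for illustration, not necessarily the model used in the project), is to flag any interval whose count would occur with probability below a small alpha under a Poisson fit to past data:

```python
import math

def poisson_threshold(historical_counts, alpha=0.001):
    """Return the smallest count k such that P(X >= k) < alpha under a
    Poisson model fitted to past per-interval collision counts.
    Counts at or above k indicate a likely network problem rather
    than ordinary statistical fluctuation."""
    lam = sum(historical_counts) / len(historical_counts)
    pmf = math.exp(-lam)  # P(X = 0)
    cdf = pmf             # P(X <= 0)
    k = 0
    while 1.0 - cdf >= alpha:  # 1 - P(X <= k) = P(X >= k + 1)
        k += 1
        pmf *= lam / k         # Poisson recurrence P(X=k) = P(X=k-1)*lam/k
        cdf += pmf
    return k + 1
```

    The threshold rises with the historical error rate, so a noisy but healthy segment is not flagged while a genuine equipment failure, which pushes counts far above the fitted rate, is.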

  19. Which Type of Risk Information to Use for Whom? Moderating Role of Outcome-Relevant Involvement in the Effects of Statistical and Exemplified Risk Information on Risk Perceptions.

    PubMed

    So, Jiyeon; Jeong, Se-Hoon; Hwang, Yoori

    2017-04-01

    The extant empirical research examining the effectiveness of statistical and exemplar-based health information is largely inconsistent. Under the premise that the inconsistency may be due to an unacknowledged moderator (O'Keefe, 2002), this study examined a moderating role of outcome-relevant involvement (Johnson & Eagly, 1989) in the effects of statistical and exemplified risk information on risk perception. Consistent with predictions based on elaboration likelihood model (Petty & Cacioppo, 1984), findings from an experiment (N = 237) concerning alcohol consumption risks showed that statistical risk information predicted risk perceptions of individuals with high, rather than low, involvement, while exemplified risk information predicted risk perceptions of those with low, rather than high, involvement. Moreover, statistical risk information contributed to negative attitude toward drinking via increased risk perception only for highly involved individuals, while exemplified risk information influenced the attitude through the same mechanism only for individuals with low involvement. Theoretical and practical implications for health risk communication are discussed.

  20. Two-Dimensional Hermite Filters Simplify the Description of High-Order Statistics of Natural Images.

    PubMed

    Hu, Qin; Victor, Jonathan D

    2016-09-01

    Natural image statistics play a crucial role in shaping biological visual systems, understanding their function and design principles, and designing effective computer-vision algorithms. High-order statistics are critical for conveying local features, but they are challenging to study, largely because of their great number and variety. Here, via the use of two-dimensional Hermite (TDH) functions, we identify a covert symmetry in the high-order statistics of natural images that simplifies this task. This emerges from the structure of TDH functions, which are an orthogonal set of functions organized into a hierarchy of ranks. Specifically, we find that the shape (skewness and kurtosis) of the distribution of filter coefficients depends only on the projection of the function onto a 1-dimensional subspace specific to each rank. The characterization of natural image statistics provided by TDH filter coefficients reflects both their phase and amplitude structure, and we suggest an intuitive interpretation for the special subspace within each rank.

  1. The (mis)reporting of statistical results in psychology journals.

    PubMed

    Bakker, Marjan; Wicherts, Jelte M

    2011-09-01

    In order to study the prevalence, nature (direction), and causes of reporting errors in psychology, we checked the consistency of reported test statistics, degrees of freedom, and p values in a random sample of high- and low-impact psychology journals. In a second study, we established the generality of reporting errors in a random sample of recent psychological articles. Our results, on the basis of 281 articles, indicate that around 18% of statistical results in the psychological literature are incorrectly reported. Inconsistencies were more common in low-impact journals than in high-impact journals. Moreover, around 15% of the articles contained at least one statistical conclusion that proved, upon recalculation, to be incorrect; that is, recalculation rendered the previously significant result insignificant, or vice versa. These errors were often in line with researchers' expectations. We classified the most common errors and contacted authors to shed light on the origins of the errors.
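    The consistency check described here, recomputing the p value implied by a reported test statistic and its degrees of freedom, can be sketched for the simplest case of a z statistic (statcheck-style tools also handle t, F, r, and chi-square reports; this is an illustration of the idea, not the authors' procedure):

```python
import math

def check_z_report(z, reported_p, tol=0.01, two_tailed=True):
    """Recompute the p value implied by a reported z statistic and flag
    an inconsistency when it differs from the reported p by more than
    tol. Real consistency checkers extend this to t, F, r, and
    chi-square statistics via their respective distributions."""
    p = 1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))  # one-tailed
    if two_tailed:
        p *= 2
    return {"computed_p": p, "consistent": abs(p - reported_p) <= tol}
```

    Applied across a corpus of articles, the fraction of reports for which `consistent` is False gives exactly the kind of error-rate estimate the study reports.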

  2. Inter-comparison of multiple statistically downscaled climate datasets for the Pacific Northwest, USA

    PubMed Central

    Jiang, Yueyang; Kim, John B.; Still, Christopher J.; Kerns, Becky K.; Kline, Jeffrey D.; Cunningham, Patrick G.

    2018-01-01

    Statistically downscaled climate data have been widely used to explore possible impacts of climate change in various fields of study. Although many studies have focused on characterizing differences in the downscaling methods, few studies have evaluated actual downscaled datasets being distributed publicly. Spatially focusing on the Pacific Northwest, we compare five statistically downscaled climate datasets distributed publicly in the US: ClimateNA, NASA NEX-DCP30, MACAv2-METDATA, MACAv2-LIVNEH and WorldClim. We compare the downscaled projections of climate change, and the associated observational data used as training data for downscaling. We map and quantify the variability among the datasets and characterize the spatio-temporal patterns of agreement and disagreement among the datasets. Pair-wise comparisons of datasets identify the coast and high-elevation areas as areas of disagreement for temperature. For precipitation, high-elevation areas, rainshadows and the dry, eastern portion of the study area have high dissimilarity among the datasets. By spatially aggregating the variability measures into watersheds, we develop guidance for selecting datasets within the Pacific Northwest climate change impact studies. PMID:29461513

  3. Inter-comparison of multiple statistically downscaled climate datasets for the Pacific Northwest, USA.

    PubMed

    Jiang, Yueyang; Kim, John B; Still, Christopher J; Kerns, Becky K; Kline, Jeffrey D; Cunningham, Patrick G

    2018-02-20

    Statistically downscaled climate data have been widely used to explore possible impacts of climate change in various fields of study. Although many studies have focused on characterizing differences in the downscaling methods, few studies have evaluated actual downscaled datasets being distributed publicly. Spatially focusing on the Pacific Northwest, we compare five statistically downscaled climate datasets distributed publicly in the US: ClimateNA, NASA NEX-DCP30, MACAv2-METDATA, MACAv2-LIVNEH and WorldClim. We compare the downscaled projections of climate change, and the associated observational data used as training data for downscaling. We map and quantify the variability among the datasets and characterize the spatio-temporal patterns of agreement and disagreement among the datasets. Pair-wise comparisons of datasets identify the coast and high-elevation areas as areas of disagreement for temperature. For precipitation, high-elevation areas, rainshadows and the dry, eastern portion of the study area have high dissimilarity among the datasets. By spatially aggregating the variability measures into watersheds, we develop guidance for selecting datasets within the Pacific Northwest climate change impact studies.

  4. Teaching Challenged-Based Curriculum in a Statistics Classroom: The Effect on Motivation Orientation for Regular and Special Education Students

    ERIC Educational Resources Information Center

    Wimpey, Amanda Dickard

    2010-01-01

    Because the high school statistics curriculum is often teacher centered and lacking in innovativeness, students tend to struggle academically in statistics courses, particularly those students who are served by special education. This problem has been linked to a student's motivation to learn. The purpose of this study was to investigate the…

  5. Electrophysiological evidence of heterogeneity in visual statistical learning in young children with ASD.

    PubMed

    Jeste, Shafali S; Kirkham, Natasha; Senturk, Damla; Hasenstab, Kyle; Sugar, Catherine; Kupelian, Chloe; Baker, Elizabeth; Sanders, Andrew J; Shimizu, Christina; Norona, Amanda; Paparella, Tanya; Freeman, Stephanny F N; Johnson, Scott P

    2015-01-01

    Statistical learning is characterized by detection of regularities in one's environment without an awareness or intention to learn, and it may play a critical role in language and social behavior. Accordingly, in this study we investigated the electrophysiological correlates of visual statistical learning in young children with autism spectrum disorder (ASD) using an event-related potential shape learning paradigm, and we examined the relation between visual statistical learning and cognitive function. Compared to typically developing (TD) controls, the ASD group as a whole showed reduced evidence of learning as defined by N1 (early visual discrimination) and P300 (attention to novelty) components. Upon further analysis, in the ASD group there was a positive correlation between N1 amplitude difference and non-verbal IQ, and a positive correlation between P300 amplitude difference and adaptive social function. Children with ASD and a high non-verbal IQ and high adaptive social function demonstrated a distinctive pattern of learning. This is the first study to identify electrophysiological markers of visual statistical learning in children with ASD. Through this work we have demonstrated heterogeneity in statistical learning in ASD that maps onto non-verbal cognition and adaptive social function. © 2014 John Wiley & Sons Ltd.

  6. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    PubMed Central

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background: Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives: This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods: We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. Results: There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion: The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent. PMID:26053876

  7. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis.

    PubMed

    Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.

  8. A scan statistic to extract causal gene clusters from case-control genome-wide rare CNV data.

    PubMed

    Nishiyama, Takeshi; Takahashi, Kunihiko; Tango, Toshiro; Pinto, Dalila; Scherer, Stephen W; Takami, Satoshi; Kishino, Hirohisa

    2011-05-26

    Several statistical tests have been developed for analyzing genome-wide association data by incorporating gene pathway information in terms of gene sets. Using these methods, hundreds of gene sets are typically tested, and the tested gene sets often overlap. This overlapping greatly increases the probability of generating false positives, and the results obtained are difficult to interpret, particularly when many gene sets show statistical significance. We propose a flexible statistical framework to circumvent these problems. Inspired by spatial scan statistics for detecting clustering of disease occurrence in the field of epidemiology, we developed a scan statistic to extract disease-associated gene clusters from a whole gene pathway. Extracting one or a few significant gene clusters from a global pathway limits the overall false positive probability, which results in increased statistical power, and facilitates the interpretation of test results. In the present study, we applied our method to genome-wide association data for rare copy-number variations, which have been strongly implicated in common diseases. Application of our method to a simulated dataset demonstrated the high accuracy of this method in detecting disease-associated gene clusters in a whole gene pathway. The scan statistic approach proposed here shows a high level of accuracy in detecting gene clusters in a whole gene pathway. This study has provided a sound statistical framework for analyzing genome-wide rare CNV data by incorporating topological information on the gene pathway.

  9. Laser diagnostics of native cervix dabs with human papilloma virus in high carcinogenic risk

    NASA Astrophysics Data System (ADS)

    Peresunko, O. P.; Karpenko, Ju. G.; Burkovets, D. N.; Ivashko, P. V.; Nikorych, A. V.; Yermolenko, S. B.; Gruia, Ion; Gruia, M. J.

    2015-11-01

    The results of experimental studies of the coordinate distributions of Mueller matrix elements for the following types of cervical scraping tissue are presented: normal tissue, low-grade to highly differentiated dysplasia (CIN1-CIN3), and adenocarcinoma of high, medium, and low levels of differentiation (G1-G3). The rationale is given for using the statistical moments of the 1st-4th orders of the polarized coherent radiation field, transformed as a result of interaction with oncologically modified 'epithelium-stroma' biological layers, as a quantitative criterion for the polarimetric optical differentiation of the state of human biological tissues. The obtained Mueller matrix elements are analyzed by statistical and correlation methods and systematized by the types of tissue studied. Images of the Mueller matrix element m34 for low-grade dysplasia (CIN2), together with the results of their statistical and correlation analysis, are presented.

  10. A Guerilla Guide to Common Problems in ‘Neurostatistics’: Essential Statistical Topics in Neuroscience

    PubMed Central

    Smith, Paul F.

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins. PMID:29371855

  11. A Guerilla Guide to Common Problems in 'Neurostatistics': Essential Statistical Topics in Neuroscience.

    PubMed

    Smith, Paul F

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins.

  12. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

    Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices for achieving high product quality, a small frequency of failures, and cost reduction in a production process. However, there are some points that have not been explored in depth regarding their joint application. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many studies of the design of control charts consider just the economic aspect, whereas statistical restrictions must also be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, together with reductions in the sampling frequency of units for testing under SPC. PMID:23527082
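    For reference, the purely statistical side of an X-bar-S design uses the classical control-limit constants c4, A3, B3, and B4 derived from the subgroup size; an economic-statistical design as described in this record would then choose sampling parameters by minimizing cost subject to constraints on limits like these. A sketch of the classical limits only (illustrative; the paper's cost model is not reproduced here), with hypothetical subgroup data:

```python
import math

def xbar_s_limits(subgroups):
    """Classical X-bar and S control chart limits for rational subgroups
    of equal size n, using the unbiasing constant c4 and the derived
    chart constants A3, B3, B4."""
    n = len(subgroups[0])
    means = [sum(g) / n for g in subgroups]
    sds = [math.sqrt(sum((x - m) ** 2 for x in g) / (n - 1))
           for g, m in zip(subgroups, means)]
    xbb = sum(means) / len(means)   # grand mean (center of X-bar chart)
    sbar = sum(sds) / len(sds)      # average subgroup standard deviation
    c4 = math.sqrt(2 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)
    a3 = 3 / (c4 * math.sqrt(n))
    b3 = max(0.0, 1 - 3 * math.sqrt(1 - c4 * c4) / c4)
    b4 = 1 + 3 * math.sqrt(1 - c4 * c4) / c4
    return {"xbar": (xbb - a3 * sbar, xbb, xbb + a3 * sbar),
            "s": (b3 * sbar, sbar, b4 * sbar)}
```

    For subgroups of size 5 this reproduces the tabulated constants A3 = 1.427, B3 = 0, B4 = 2.089 found in standard SPC references.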

  13. Non-Markovian full counting statistics in quantum dot molecules

    PubMed Central

    Xue, Hai-Bin; Jiao, Hu-Jun; Liang, Jiu-Qing; Liu, Wu-Ming

    2015-01-01

    Full counting statistics of electron transport is a powerful diagnostic tool for probing the nature of quantum transport beyond what is obtainable from the average current or conductance measurement alone. In particular, the non-Markovian dynamics of quantum dot molecule plays an important role in the nonequilibrium electron tunneling processes. It is thus necessary to understand the non-Markovian full counting statistics in a quantum dot molecule. Here we study the non-Markovian full counting statistics in two typical quantum dot molecules, namely, serially coupled and side-coupled double quantum dots with high quantum coherence in a certain parameter regime. We demonstrate that the non-Markovian effect manifests itself through the quantum coherence of the quantum dot molecule system, and has a significant impact on the full counting statistics in the high quantum-coherent quantum dot molecule system, which depends on the coupling of the quantum dot molecule system with the source and drain electrodes. The results indicated that the influence of the non-Markovian effect on the full counting statistics of electron transport, which should be considered in a high quantum-coherent quantum dot molecule system, can provide a better understanding of electron transport through quantum dot molecules. PMID:25752245

  14. Chemical quality of bottom sediments in selected streams, Jefferson County, Kentucky, April-July 1992

    USGS Publications Warehouse

    Moore, B.L.; Evaldi, R.D.

    1995-01-01

    Bottom sediments from 25 stream sites in Jefferson County, Ky., were analyzed for percent volatile solids and concentrations of nutrients, major metals, trace elements, miscellaneous inorganic compounds, and selected organic compounds. Statistical high outliers among the constituent concentrations measured in the bottom sediments were defined as an indicator of possibly elevated concentrations. Statistical high outliers were determined for at least 1 constituent at each of 12 sampling sites in Jefferson County. Of the 10 stream basins sampled in Jefferson County, the Middle Fork Beargrass Basin, Cedar Creek Basin, and Harrods Creek Basin were the only three basins where a statistical high outlier was not found for any of the measured constituents. In the Pennsylvania Run Basin, total volatile solids, nitrate plus nitrite, and endrin were statistical high outliers. Pond Creek was the only basin where five constituents were statistical high outliers: barium, beryllium, cadmium, chromium, and silver. Nitrate plus nitrite and copper were the only statistical high outliers found in the Mill Creek Basin. In the Floyds Fork Basin, nitrate plus nitrite, phosphorus, mercury, and silver were the only statistical high outliers. Ammonia was the only statistical high outlier found in the South Fork Beargrass Basin. In the Goose Creek Basin, mercury and silver were the only statistical high outliers. Cyanide was the only statistical high outlier in the Muddy Fork Basin.

  15. Community College Low-Income and Minority Student Completion Study: Descriptive Statistics from the 1992 High School Cohort

    ERIC Educational Resources Information Center

    Bailey, Thomas; Jenkins, Davis; Leinbach, Timothy

    2005-01-01

    This report summarizes statistics on access and attainment in higher education, focusing particularly on community college students, using data from the National Education Longitudinal Study of 1988 (NELS:88), which follows a nationally representative sample of individuals who were eighth graders in the spring of 1988. A sample of these…

  16. High Impact = High Statistical Standards? Not Necessarily So

    PubMed Central

    Tressoldi, Patrizio E.; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors. PMID:23418533

  17. High impact  =  high statistical standards? Not necessarily so.

    PubMed

    Tressoldi, Patrizio E; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors.

  18. Adaptive interference cancel filter for evoked potential using high-order cumulants.

    PubMed

    Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei

    2004-01-01

    This paper presents evoked potential (EP) processing using an adaptive interference cancellation (AIC) filter with second- and higher-order cumulants. With the conventional ensemble-averaging method, experiments must be repeated many times to record the required data. Recently, the AIC structure with second-order statistics has proved more efficient for EP processing than traditional averaging, but it is sensitive both to the statistics of the reference signal and to the choice of step size. We therefore propose a higher-order-statistics-based AIC method to overcome these disadvantages. The study used somatosensory EPs corrupted with EEG, with a gradient-type algorithm in the AIC method. Comparisons of AIC filters based on second-, third-, and fourth-order statistics are also presented. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the selection of step size or reference input.

  19. Comparative analysis of positive and negative attitudes toward statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

    Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during a statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students with negative attitudes toward statistics, by contrast, may feel depressed, especially in group assignments, are at risk of failure, are often highly emotional, and struggle to move forward. This study therefore investigates students' attitudes towards learning statistics. Six latent constructs were used to measure attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Towards Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP); the respondents were students taking the applied statistics course in different faculties. The analysis found the questionnaire acceptable, and the proposed relationships among the constructs were investigated. Students showed full effort to master the statistics course, found it enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes towards learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest, and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.

  20. High-Speed, Low-Cost Workstation for Computation-Intensive Statistics. Phase 1

    DTIC Science & Technology

    1990-06-20

    High-level statistics and linear algebra routines (BSAS and BLAS) were prototyped in this study. For each routine, both a C-coded (Turbo C) version and an optimized compiled version were evaluated for implementation and performance. Distribution statement: unlimited distribution. Abstract (truncated): high-performance and low-cost…

  1. EFFICIENTLY ESTABLISHING CONCEPTS OF INFERENTIAL STATISTICS AND HYPOTHESIS DECISION MAKING THROUGH CONTEXTUALLY CONTROLLED EQUIVALENCE CLASSES

    PubMed Central

    Fienup, Daniel M; Critchfield, Thomas S

    2010-01-01

    Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904

  2. Statistical Machine Learning for Structured and High Dimensional Data

    DTIC Science & Technology

    2014-09-17

    AFRL-OSR-VA-TR-2014-0234. Statistical Machine Learning for Structured and High Dimensional Data. Larry Wasserman, Carnegie Mellon University. Final report, Dec 2009 - Aug 2014. Research in the area of resource-constrained statistical estimation. Keywords: machine learning, high-dimensional statistics. Contact: John Lafferty, 773-702-3813.

  3. A weighted U-statistic for genetic association analyses of sequencing data.

    PubMed

    Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing

    2014-12-01

    With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. The association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption of the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very low density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
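WU-SEQ is built on a nonparametric U-statistic, which averages a symmetric kernel over all unordered pairs of observations. The paper's weighted sequencing kernel is not reproduced here; as a minimal sketch of the general construction only, the following illustrates the classical fact that the kernel h(x, y) = (x - y)²/2 yields the unbiased sample variance:

```python
from itertools import combinations
from statistics import variance

def u_statistic(data, kernel):
    """Average a symmetric kernel over all unordered pairs of observations."""
    pairs = list(combinations(data, 2))
    return sum(kernel(x, y) for x, y in pairs) / len(pairs)

# With kernel h(x, y) = (x - y)^2 / 2 the U-statistic equals the
# unbiased sample variance (a textbook example of a U-statistic).
data = [2.0, 4.0, 7.0, 1.0, 5.0]
u = u_statistic(data, lambda x, y: 0.5 * (x - y) ** 2)
print(abs(u - variance(data)) < 1e-12)  # True
```

The WU-SEQ test replaces this toy kernel with a phenotype-weighted kernel over genotype similarities; the pairwise-averaging skeleton is the same.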

  4. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data.

    PubMed

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-05-15

    We develop statistical methodology for the popular brain imaging technique HARDI, based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense, as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of deterministic tractography methods and delivers the same information as probabilistic tractography methods. Our method is computationally cheap, and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way.

  5. Estimation of integral curves from high angular resolution diffusion imaging (HARDI) data

    PubMed Central

    Carmichael, Owen; Sakhanenko, Lyudmila

    2015-01-01

    We develop statistical methodology for the popular brain imaging technique HARDI, based on the high order tensor model by Özarslan and Mareci [10]. We investigate how uncertainty in the imaging procedure propagates through all levels of the model: signals, tensor fields, vector fields, and fibers. We construct asymptotically normal estimators of the integral curves, or fibers, which allow us to trace the fibers together with confidence ellipsoids. The procedure is computationally intense, as it blends linear algebra concepts from high order tensors with asymptotic statistical analysis. The theoretical results are illustrated on simulated and real datasets. This work generalizes the statistical methodology proposed for low angular resolution diffusion tensor imaging by Carmichael and Sakhanenko [3] to several fibers per voxel. It is also pioneering statistical work on tractography from HARDI data. It avoids all the typical limitations of deterministic tractography methods and delivers the same information as probabilistic tractography methods. Our method is computationally cheap, and it provides a well-founded mathematical and statistical framework in which diverse functionals on fibers, directions and tensors can be studied in a systematic and rigorous way. PMID:25937674

  6. A systematic review of the quality of statistical methods employed for analysing quality of life data in cancer randomised controlled trials.

    PubMed

    Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew

    2017-09-01

    Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.

  7. Exploring the Link Between Streamflow Trends and Climate Change in Indiana, USA

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Kam, J.; Thurner, K.; Merwade, V.

    2007-12-01

    Streamflow trends in Indiana are evaluated for 85 USGS streamflow gaging stations that have continuous unregulated streamflow records varying from 10 to 80 years. The trends are analyzed using the non-parametric Mann-Kendall test with trend-free pre-whitening applied first to remove serial correlation in the data. A bootstrap method is used to establish the field significance of the results. Trends are computed for 12 streamflow statistics covering low-, medium- (median and mean flow), and high-flow conditions on annual and seasonal time steps. The analysis is done for six study periods, ranging from 10 years to more than 65 years, all ending in 2003. The trends in annual average streamflow for the 50-year study period are compared with annual average precipitation trends from 14 National Climatic Data Center (NCDC) stations in Indiana that have 50 years of continuous daily record. The results show field-significant positive trends in annual low and medium streamflow statistics at the majority of gaging stations for study periods that include 40 or more years of records. In the seasonal analysis, all flow statistics in summer and fall (low-flow seasons), and only low-flow statistics in winter and spring (high-flow seasons), show positive trends. No field-significant trends in annual and seasonal flow statistics are observed for study periods that include 25 or fewer years of records, except in northern Indiana, where localized negative trends are observed in the 10- and 15-year study periods. Further, streamflow trends are found to be highly correlated with precipitation trends on the annual time step. No apparent climate change signal is observed in Indiana streamflow records.
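The Mann-Kendall test used in the study above is built on the S statistic, a signed count of concordant versus discordant pairs over time. A minimal sketch of S and its normal approximation, omitting the tie correction and the trend-free pre-whitening step the study applies beforehand:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction).
    Returns the S statistic and the normal-approximation Z score."""
    n = len(x)
    # S counts pairs (i < j) where the later value is larger, minus
    # pairs where it is smaller.
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S, no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)  # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing series of length 6 gives the maximum S = 6*5/2.
s, z = mann_kendall([1, 2, 3, 5, 8, 13])
print(s)  # 15
```

A positive Z indicates an upward trend; significance is judged against the standard normal distribution.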

  8. An investigation of new toxicity test method performance in validation studies: 1. Toxicity test methods that have predictive capacity no greater than chance.

    PubMed

    Bruner, L H; Carr, G J; Harbell, J W; Curren, R D

    2002-06-01

    An approach commonly used to measure new toxicity test method (NTM) performance in validation studies is to divide toxicity results into positive and negative classifications, and then identify true positive (TP), true negative (TN), false positive (FP) and false negative (FN) results. After this step is completed, the contingent probability statistics (CPS), sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV), are calculated. Although these statistics are widely used and often the only statistics used to assess the performance of toxicity test methods, there is little specific guidance in the validation literature on what values for these statistics indicate adequate performance. The purpose of this study was to begin developing data-based answers to this question by characterizing the CPS obtained from an NTM whose data have a completely random association with a reference test method (RTM). Determining the CPS of this worst-case scenario is useful because it provides a lower baseline from which the performance of an NTM can be judged in future validation studies. It also provides an indication of relationships in the CPS that help identify random or near-random relationships in the data. The results from this study of randomly associated tests show that the values obtained for the statistics vary significantly depending on the cut-offs chosen, that high values can be obtained for individual statistics, and that the different measures cannot be considered independently when evaluating the performance of an NTM. When the association between results of an NTM and RTM is random, the sum of the complementary pairs of statistics (sensitivity + specificity, NPV + PPV) is approximately 1, and the prevalence (i.e., the proportion of toxic chemicals in the population of chemicals) and PPV are equal. Given that combinations of high sensitivity-low specificity or low sensitivity-high specificity (i.e., the sum of sensitivity and specificity approximately equal to 1) indicate a lack of predictive capacity, an NTM with these performance characteristics should be considered no better at predicting toxicity than chance alone.
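The complementary-pair relationships reported for a chance-level NTM can be checked arithmetically from expected confusion-table fractions under independence; the prevalence and positive-call rate below are hypothetical illustration values, not from the study:

```python
# Expected confusion-table fractions when the NTM's classifications are
# statistically independent of the RTM's (predictive capacity = chance).
prevalence = 0.3  # hypothetical proportion of truly toxic chemicals
p_pos = 0.6       # hypothetical proportion the NTM calls positive

tp = prevalence * p_pos
fn = prevalence * (1 - p_pos)
fp = (1 - prevalence) * p_pos
tn = (1 - prevalence) * (1 - p_pos)

sensitivity = tp / (tp + fn)  # = p_pos under independence
specificity = tn / (tn + fp)  # = 1 - p_pos under independence
ppv = tp / (tp + fp)          # = prevalence under independence
npv = tn / (tn + fn)          # = 1 - prevalence under independence

print(round(sensitivity + specificity, 10))   # 1.0
print(round(npv + ppv, 10))                   # 1.0
print(round(abs(ppv - prevalence), 10))       # 0.0
```

Whatever cut-off drives p_pos, the two complementary pairs always sum to 1 and PPV collapses to the prevalence, exactly the baseline signature the study describes.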

  9. Manipulating measurement scales in medical statistical analysis and data mining: A review of methodologies

    PubMed Central

    Marateb, Hamid Reza; Mansourian, Marjan; Adibi, Peyman; Farina, Dario

    2014-01-01

    Background: selecting the correct statistical test and data mining method depends strongly on the measurement scale of the data, the type of variables, and the purpose of the analysis. Different measurement scales are studied in detail, and statistical comparison, modeling, and data mining methods are examined using several medical examples. We present two clustering examples on ordinal variables, a more challenging variable type to analyse, using the Wisconsin Breast Cancer Data (WBCD). Ordinal-to-interval scale conversion example: a breast cancer database of nine 10-level ordinal variables for 683 patients was analyzed with two ordinal-scale clustering methods. The performance of the clustering methods was assessed by comparison with the gold-standard groups of malignant and benign cases that had been identified by clinical tests. Results: the sensitivity and accuracy of the two clustering methods were 98% and 96%, respectively; their specificity was comparable. Conclusion: using a clustering algorithm appropriate to the measurement scale of the variables in the study yields high performance. Moreover, descriptive and inferential statistics, as well as the modeling approach, must be selected based on the scale of the variables. PMID:24672565

  10. A perceptual space of local image statistics.

    PubMed

    Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M

    2015-12-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice - a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.
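An elliptical isodiscrimination contour corresponds to a quadratic combination rule: given per-axis thresholds a and b for two kinds of statistics, the predicted threshold along an intermediate direction follows from the ellipse equation. A minimal sketch of that geometric relation, with hypothetical axis thresholds (not values from the study):

```python
import math

def mixed_threshold(a, b, theta):
    """Predicted threshold along direction theta (radians) for an
    axis-aligned elliptical isodiscrimination contour whose per-axis
    thresholds are a and b (quadratic combination rule)."""
    c, s = math.cos(theta), math.sin(theta)
    return ((c / a) ** 2 + (s / b) ** 2) ** -0.5

# Along a pure axis, the mixed threshold reduces to that axis' threshold.
print(round(mixed_threshold(0.2, 0.4, 0.0), 3))          # 0.2
print(round(mixed_threshold(0.2, 0.4, math.pi / 2), 3))  # 0.4
```

In the study's full 10-parameter space the same idea generalizes from ellipses to ellipsoids, which is what allows thresholds for complex combinations of statistics to be predicted from pairwise measurements.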

  11. A perceptual space of local image statistics

    PubMed Central

    Victor, Jonathan D.; Thengone, Daniel J.; Rizvi, Syed M.; Conte, Mary M.

    2015-01-01

    Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice – a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space, and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. PMID:26130606

  12. Statistical testing of baseline differences in sports medicine RCTs: a systematic evaluation.

    PubMed

    Peterson, Ross L; Tran, Matthew; Koffel, Jonathan; Stovitz, Steven D

    2017-01-01

    The CONSORT (Consolidated Standards of Reporting Trials) statement discourages reporting statistical tests of baseline differences between groups in randomised controlled trials (RCTs). However, this practice is still common in many medical fields. Our aim was to determine the prevalence of this practice in leading sports medicine journals. We conducted a comprehensive search in Medline through PubMed to identify RCTs published in the years 2005 and 2015 from 10 high-impact sports medicine journals. Two reviewers independently confirmed the trial design and reached consensus on which articles contained statistical tests of baseline differences. Our search strategy identified a total of 324 RCTs, with 85 from the year 2005 and 239 from the year 2015. Overall, 64.8% of studies (95% CI (59.6, 70.0)) reported statistical tests of baseline differences; broken down by year, this percentage was 67.1% in 2005 (95% CI (57.1, 77.1)) and 64.0% in 2015 (95% CI (57.9, 70.1)). Although discouraged by the CONSORT statement, statistical testing of baseline differences remains highly prevalent in sports medicine RCTs. Statistical testing of baseline differences can mislead authors; for example, by failing to identify meaningful baseline differences in small studies. Journals that ask authors to follow the CONSORT statement guidelines should recognise that many manuscripts are ignoring the recommendation against statistical testing of baseline differences.

  13. [Oral health status of women with normal and high-risk pregnancies].

    PubMed

    Chaloupka, P; Korečko, V; Turek, J; Merglová, V

    2014-01-01

    The aim of this study was to compare the oral health status of women with normal pregnancies and those with high-risk pregnancies. A total of 142 women in the third trimester of pregnancy were randomly selected for this study. The pregnant women were divided into two groups: a normal pregnancy group (group F, n = 61) and a high-risk pregnancy group (group R, n = 81). The following variables were recorded for each woman: age, general health status, DMF index, CPITN index, PBI index, amounts of Streptococcus mutans in the saliva and dental treatment needs. The data obtained were analysed statistically. The Mann-Whitney test, Kruskal-Wallis test and chi-square test were used, and p-values less than 0.05 were considered statistically significant. The two-sided t-test was used to compare the two cohorts. Women with high-risk pregnancies showed increased values in all measured indices and tests, but there were no statistically significant differences between the two groups in the DMF index, CPITN index and amounts of Streptococcus mutans present in the saliva. Statistically significant differences were detected between the two groups for the PBI index and dental treatment needs. The maximum PBI index value was 2.9 in group F and 3.8 in group R. Significant differences were also found in mean PBI values. Of the entire study cohort, 94 women (66.2%) required dental treatment, including 52% (n = 32) of the women with normal pregnancies and 77% (n = 62) of the women with high-risk pregnancies. This study found that women with complications during pregnancy had severe gingivitis and needed dental treatment more frequently than women with normal pregnancies.

  14. Self-efficacy, stress, and acculturation as predictors of first year science success among Latinos at a South Texas university

    NASA Astrophysics Data System (ADS)

    McNamara, Mark W.

    The study tested the hypothesis that self-efficacy, stress, and acculturation are useful predictors of academic achievement in first-year university science, independent of high school GPA and SAT scores, in a sample of Latino students at a South Texas Hispanic-serving institution of higher education. The correlational study employed a mixed-methods explanatory sequential model. The non-probability sample consisted of 98 university science and engineering students. The study participants had high science self-efficacy, a low number of stressors, and ranged from slightly Anglo-oriented bicultural to strongly Anglo-oriented. As expected, the control variables of SAT score and high school GPA were statistically significant predictors of the outcome measures. Together, they accounted for 19.80% of the variation in first-year GPA, 13.80% of the variation in earned credit hours, and 11.30% of the variation in intent to remain in the science major. After controlling for SAT scores and high school GPAs, self-efficacy was a statistically significant predictor of credit hours earned, accounting for 5.60% of the variation; its unique contribution to explaining the variation in first-year GPA and intent to remain in the science major was not statistically significant. Stress and acculturation were not statistically significant predictors of any of the outcome measures. Analysis of the qualitative data yielded six themes: (a) high science self-efficacy, (b) stressors, (c) the positive role of stress, (d) Anglo-oriented, (e) bicultural, and (f) family. The quantitative and qualitative results were synthesized and practical implications were discussed.

  15. Measures of health sciences journal use: a comparison of vendor, link-resolver, and local citation statistics.

    PubMed

    De Groote, Sandra L; Blecic, Deborah D; Martin, Kristin

    2013-04-01

    Libraries require efficient and reliable methods to assess journal use. Vendors provide complete counts of articles retrieved from their platforms. However, if a journal is available on multiple platforms, several sets of statistics must be merged. Link-resolver reports merge data from all platforms into one report but only record partial use because users can access library subscriptions from other paths. Citation data are limited to publication use. Vendor, link-resolver, and local citation data were examined to determine correlation. Because link-resolver statistics are easy to obtain, the study library especially wanted to know whether they correlate highly with the other measures. Vendor, link-resolver, and local citation statistics for the study institution were gathered for health sciences journals. Spearman rank-order correlation coefficients were calculated. There was a high positive correlation between all three data sets, with vendor data commonly showing the highest use. However, a small percentage of titles showed anomalous results. Link-resolver data correlate well with vendor and citation data, but due to anomalies, low link-resolver data would best be used to suggest titles for further evaluation using vendor data. Citation data may not be needed, as they correlate highly with the other measures.
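    A Spearman rank-order correlation of this kind can be computed as below; the per-journal counts are hypothetical stand-ins, not the study institution's data.

```python
from scipy.stats import spearmanr

# Hypothetical per-journal annual counts for ten journals (illustration only)
vendor_downloads = [5200, 3100, 2900, 1500, 800, 760, 430, 120, 95, 20]
link_resolver_clicks = [900, 640, 500, 310, 170, 150, 80, 30, 25, 4]
local_citations = [210, 160, 90, 85, 40, 45, 12, 8, 3, 1]

# Spearman correlates the ranks, so it tolerates the very different
# absolute scales of the three measures
rho_vl, p_vl = spearmanr(vendor_downloads, link_resolver_clicks)
rho_vc, p_vc = spearmanr(vendor_downloads, local_citations)
print(f"vendor vs link-resolver: rho = {rho_vl:.3f} (p = {p_vl:.4f})")
print(f"vendor vs citations:     rho = {rho_vc:.3f} (p = {p_vc:.4f})")
```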

  16. Rasch fit statistics and sample size considerations for polytomous data.

    PubMed

    Smith, Adam B; Rush, Robert; Fallowfield, Lesley J; Velikova, Galina; Sharpe, Michael

    2008-05-29

    Previous research on educational data has demonstrated that Rasch fit statistics (mean squares and t-statistics) are highly susceptible to sample size variation for dichotomously scored rating data, although little is known about this relationship for polytomous data. These statistics help inform researchers about how well items fit to a unidimensional latent trait, and are an important adjunct to modern psychometrics. Given the increasing use of Rasch models in health research, the purpose of this study was to explore the relationship between fit statistics and sample size for polytomous data. Data were collated from a heterogeneous sample of cancer patients (n = 4072) who had completed both the Patient Health Questionnaire-9 and the Hospital Anxiety and Depression Scale. Ten samples were drawn with replacement for each of eight sample sizes (n = 25 to n = 3200). The Rating and Partial Credit Models were applied and the mean square and t-fit statistics (infit/outfit) derived for each model. The results demonstrated that t-statistics were highly sensitive to sample size, whereas mean square statistics remained relatively stable for polytomous data. It was concluded that mean square statistics were relatively independent of sample size for polytomous data and that misfit to the model could be identified using published recommended ranges.
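    The core finding, that unstandardized mean squares are stable in sample size while standardized t-statistics grow with it, can be illustrated with a deliberately simplified analogue; this is not a Rasch analysis, just the general behaviour of the two kinds of statistic on simulated misfitting residuals.

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_stats(n, misfit_sd=1.2):
    """Mean square of standardized residuals, and a z-standardized version.
    A simplified analogue of Rasch infit/outfit, not a Rasch analysis."""
    z = rng.normal(0, misfit_sd, n)        # residuals with 44% excess variance
    ms = np.mean(z ** 2)                   # expectation misfit_sd**2, independent of n
    se = np.std(z ** 2, ddof=1) / np.sqrt(n)
    t = (ms - 1.0) / se                    # grows roughly like sqrt(n)
    return ms, t

# Same sample-size range as the study (n = 25 to n = 3200)
for n in (25, 200, 3200):
    ms, t = fit_stats(n)
    print(f"n = {n:5d}: mean square = {ms:.2f}, t = {t:.1f}")
```

The mean square hovers near 1.44 at every n, while the standardized statistic balloons as n grows, which is why fixed critical values for t-statistics behave so differently across sample sizes.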

  17. Rasch fit statistics and sample size considerations for polytomous data

    PubMed Central

    Smith, Adam B; Rush, Robert; Fallowfield, Lesley J; Velikova, Galina; Sharpe, Michael

    2008-01-01

    Background Previous research on educational data has demonstrated that Rasch fit statistics (mean squares and t-statistics) are highly susceptible to sample size variation for dichotomously scored rating data, although little is known about this relationship for polytomous data. These statistics help inform researchers about how well items fit to a unidimensional latent trait, and are an important adjunct to modern psychometrics. Given the increasing use of Rasch models in health research, the purpose of this study was to explore the relationship between fit statistics and sample size for polytomous data. Methods Data were collated from a heterogeneous sample of cancer patients (n = 4072) who had completed both the Patient Health Questionnaire-9 and the Hospital Anxiety and Depression Scale. Ten samples were drawn with replacement for each of eight sample sizes (n = 25 to n = 3200). The Rating and Partial Credit Models were applied and the mean square and t-fit statistics (infit/outfit) derived for each model. Results The results demonstrated that t-statistics were highly sensitive to sample size, whereas mean square statistics remained relatively stable for polytomous data. Conclusion It was concluded that mean square statistics were relatively independent of sample size for polytomous data and that misfit to the model could be identified using published recommended ranges. PMID:18510722

  18. Static anthropometric dimensions in a population of Iranian high school students: considering ethnic differences.

    PubMed

    Mehrparvar, Amir Houshang; Mirmohammadi, Seyyed Jalil; Hafezi, Rahmatollah; Mostaghaci, Mehrdad; Davari, Mohammad Hossein

    2015-05-01

    Anthropometric dimensions of end users should be measured to create a basis for the manufacture of different products. This study was designed to measure some static anthropometric dimensions in Iranian high school students, considering ethnic differences. Nineteen static anthropometric dimensions of high school students were measured and compared among different Iranian ethnicities (Fars, Turk, Kurd, Lor, Baluch, and Arab) and between genders. In this study, 9,476 subjects (4,703 boys and 4,773 girls) ages 15 to 18 years in six ethnicities were assessed. The difference among ethnicities was statistically significant for all dimensions (p values < .001 for each dimension). This study showed statistically significant differences in 19 static anthropometric dimensions among high school students regarding gender, age, and ethnicity. © 2014, Human Factors and Ergonomics Society.

  19. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    PubMed

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
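    The guided-PCA idea can be sketched as follows: compare the variance captured along a batch-guided direction with the variance along the ordinary first principal component, and calibrate with a permutation null. This is a rough sketch of the concept, not the gPCA package's exact statistic; the simulated matrix, batch sizes, and effect size are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def gpca_delta(X, batch):
    """Variance along the batch-guided loading relative to the variance
    along the first ordinary principal component (a value near 1 means
    batch structure dominates the data)."""
    X = X - X.mean(axis=0)
    batches = np.unique(batch)
    Y = (batch[:, None] == batches[None, :]).astype(float)  # batch indicators
    _, _, Vg = np.linalg.svd(Y.T @ X, full_matrices=False)  # guided loading
    _, _, Vp = np.linalg.svd(X, full_matrices=False)        # ordinary PCA loading
    return np.var(X @ Vg[0]) / np.var(X @ Vp[0])

# Simulated matrix: 40 samples x 200 probes, two batches,
# with a probe-specific shift added to the second batch
n, p = 40, 200
batch = np.repeat([0, 1], n // 2)
X = rng.normal(0, 1, (n, p))
X[batch == 1] += rng.normal(0.8, 0.2, p)   # injected batch effect

delta = gpca_delta(X, batch)

# Permutation null: shuffle batch labels and recompute
null = [gpca_delta(X, rng.permutation(batch)) for _ in range(200)]
p_value = np.mean([d >= delta for d in null])
print(f"delta = {delta:.2f}, permutation p = {p_value:.3f}")
```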

  20. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
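    The general idea of statistically validating a long list from a small random subsample can be illustrated with an exact binomial confidence interval; this is a simplified stand-in for the paper's method, and the counts are hypothetical.

```python
from scipy.stats import beta

def clopper_pearson(successes, trials, alpha=0.05):
    """Exact (Clopper-Pearson) confidence interval for a proportion."""
    lo = beta.ppf(alpha / 2, successes, trials - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, trials - successes) if successes < trials else 1.0
    return lo, hi

# Suppose 18 of 20 randomly sampled significant results were confirmed
# in the wet lab; the interval bounds the list-wide true-positive rate.
lo, hi = clopper_pearson(18, 20)
print(f"95% CI for the list's true-positive rate: ({lo:.2f}, {hi:.2f})")
```

Because the 20 targets were sampled at random, the interval speaks for the whole list; validating only the most significant hits provides no such guarantee, which is the paper's central point.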

  1. A Vehicle for Bivariate Data Analysis

    ERIC Educational Resources Information Center

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  2. Cognitive load, emotion, and performance in high-fidelity simulation among beginning nursing students: a pilot study.

    PubMed

    Schlairet, Maura C; Schlairet, Timothy James; Sauls, Denise H; Bellflowers, Lois

    2015-03-01

    Establishing the impact of the high-fidelity simulation environment on student performance, as well as identifying factors that could predict learning, would refine simulation outcome expectations among educators. The purpose of this quasi-experimental pilot study was to explore the impact of simulation on emotion and cognitive load among beginning nursing students. Forty baccalaureate nursing students participated in teaching simulations, rated their emotional state and cognitive load, and completed evaluation simulations. Two principal components of emotion were identified, representing the pleasant activation and pleasant deactivation components of affect. Mean rating of cognitive load following simulation was high. Linear regression identified slight but statistically nonsignificant positive associations between principal components of emotion and cognitive load. Logistic regression identified a negative but statistically nonsignificant effect of cognitive load on assessment performance. Among lower ability students, a more pronounced effect of cognitive load on assessment performance was observed; this also was statistically nonsignificant. Copyright 2015, SLACK Incorporated.

  3. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs.

    PubMed

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-05-28

    Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and consequently require high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly-used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Analysis of both consecutive and combinational window haplotypes was conducted with the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1,484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs ran about 14.4-15.9 times faster, while Unphased jobs ran 1.1-18.6 times faster, compared with the accumulated serial computation time. Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance.

  4. Application of the Linux cluster for exhaustive window haplotype analysis using the FBAT and Unphased programs

    PubMed Central

    Mishima, Hiroyuki; Lidral, Andrew C; Ni, Jun

    2008-01-01

    Background Genetic association studies have been used to map disease-causing genes. A newly introduced statistical method, called exhaustive haplotype association study, analyzes genetic information consisting of different numbers and combinations of DNA sequence variations along a chromosome. Such studies involve a large number of statistical calculations and consequently require high computing power. It is possible to develop parallel algorithms and codes to perform the calculations on a high performance computing (HPC) system. However, most existing commonly-used statistical packages for genetic studies are non-parallel versions. Alternatively, one may use the cutting-edge technology of grid computing to run non-parallel genetic statistical packages on a centralized HPC system or on distributed computing systems. In this paper, we report the utilization of a queuing scheduler built on the Grid Engine and run on a Rocks Linux cluster for our genetic statistical studies. Results Analysis of both consecutive and combinational window haplotypes was conducted with the FBAT (Laird et al., 2000) and Unphased (Dudbridge, 2003) programs. The dataset consisted of 26 loci from 277 extended families (1,484 persons). Using the Rocks Linux cluster with 22 compute-nodes, FBAT jobs ran about 14.4–15.9 times faster, while Unphased jobs ran 1.1–18.6 times faster, compared with the accumulated serial computation time. Conclusion Execution of exhaustive haplotype analysis using non-parallel software packages on a Linux-based system is an effective and efficient approach in terms of cost and performance. PMID:18541045

  5. Antecedents of students' achievement in statistics

    NASA Astrophysics Data System (ADS)

    Awaludin, Izyan Syazana; Razak, Ruzanna Ab; Harris, Hezlin; Selamat, Zarehan

    2015-02-01

    The applications of statistics in most fields have been vast. Many degree programmes at local universities require students to enroll in at least one statistics course. The standard of these courses varies across degree programmes because of students' diverse academic backgrounds, some of which are far from the field of statistics. The high failure rate in statistics courses among non-science stream students has been a concern every year. The purpose of this research is to investigate the antecedents of students' achievement in statistics. A total of 272 students participated in the survey. Multiple linear regression was applied to examine the relationship between the factors and achievement. We found that statistics anxiety was a significant predictor of students' achievement. We also found that students' age has a significant effect on achievement: older students are more likely to achieve lower scores in statistics. Students' level of study also has a significant impact on their achievement in statistics.

  6. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chukbar, B. K., E-mail: bchukbar@mail.ru

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for the cases of the microfuel concentration up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  7. Statistical innovations in the medical device world sparked by the FDA.

    PubMed

    Campbell, Gregory; Yue, Lilly Q

    2016-01-01

    The world of medical devices, while highly diverse, is extremely innovative, and this facilitates the adoption of innovative statistical techniques. Statisticians in the Center for Devices and Radiological Health (CDRH) at the Food and Drug Administration (FDA) have provided leadership in implementing statistical innovations. The innovations discussed include: the incorporation of Bayesian methods in clinical trials, adaptive designs, the use and development of propensity score methodology in the design and analysis of non-randomized observational studies, the use of tipping-point analysis for missing data, techniques for diagnostic test evaluation, bridging studies for companion diagnostic tests, quantitative benefit-risk decisions, and patient preference studies.

  8. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…

  9. RooStatsCms: A tool for analysis modelling, combination and statistical studies

    NASA Astrophysics Data System (ADS)

    Piparo, D.; Schott, G.; Quast, G.

    2010-04-01

    RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.

  10. Development of Composite Materials with High Passive Damping Properties

    DTIC Science & Technology

    2006-05-15

    Sound transmission through sandwich panels was studied using the statistical energy analysis (SEA) method, together with frequency response function analysis and finite element models. Finite element models are generally only efficient for problems at low and middle frequencies.

  11. Self-Explanation in the Domain of Statistics: An Expertise Reversal Effect

    ERIC Educational Resources Information Center

    Leppink, Jimmie; Broers, Nick J.; Imbos, Tjaart; van der Vleuten, Cees P. M.; Berger, Martijn P. F.

    2012-01-01

    This study investigated the effects of four instructional methods on cognitive load, propositional knowledge, and conceptual understanding of statistics, for low prior knowledge students and for high prior knowledge students. The instructional methods were (1) a reading-only control condition, (2) answering open-ended questions, (3) answering…

  12. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    ERIC Educational Resources Information Center

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…

  13. Forecast in foreign exchange markets

    NASA Astrophysics Data System (ADS)

    Baviera, R.; Pasquini, M.; Serva, M.; Vergni, D.; Vulpiani, A.

    2001-04-01

    We perform a statistical study of weak efficiency in Deutschemark/US dollar exchange rates using high frequency data. The presence of correlations in the returns sequence implies the possibility of a statistical forecast of market behavior. We show the existence of correlations and how information theory can be relevant in this context.
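    The link between serial correlation in returns and forecastability can be illustrated with a toy autocorrelated return series; the AR(1) process and its parameters below are invented stand-ins, not the Deutschemark/US dollar data, and real FX returns are far noisier.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic AR(1) returns with weak persistence
phi, n = 0.2, 20000
eps = rng.normal(0, 1, n)
r = np.empty(n)
r[0] = eps[0]
for t in range(1, n):
    r[t] = phi * r[t - 1] + eps[t]

# Nonzero lag-1 autocorrelation means past returns carry information
# about the next return, i.e. weak-form efficiency fails
rho = np.corrcoef(r[:-1], r[1:])[0, 1]

# Naive forecast: predict the sign of the next return from the current one
hit_rate = np.mean(np.sign(r[:-1]) == np.sign(r[1:]))
print(f"lag-1 autocorrelation = {rho:.3f}, sign hit rate = {hit_rate:.3f}")
```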

  14. Non-parametric model selection for subject-specific topological organization of resting-state functional connectivity.

    PubMed

    Ferrarini, Luca; Veer, Ilya M; van Lew, Baldur; Oei, Nicole Y L; van Buchem, Mark A; Reiber, Johan H C; Rombouts, Serge A R B; Milles, J

    2011-06-01

    In recent years, graph theory has been successfully applied to study functional and anatomical connectivity networks in the human brain. Most of these networks have shown small-world topological characteristics: high efficiency in long distance communication between nodes, combined with highly interconnected local clusters of nodes. Moreover, functional studies performed at high resolutions have presented convincing evidence that resting-state functional connectivity networks exhibit (exponentially truncated) scale-free behavior. Such evidence, however, was mostly presented qualitatively, in terms of linear regressions of the degree distributions on log-log plots. Even when quantitative measures were given, these were usually limited to the r² correlation coefficient. However, the r² statistic is not an optimal estimator of explained variance when dealing with (truncated) power-law models. Recent developments in statistics have introduced new non-parametric approaches, based on the Kolmogorov-Smirnov test, for the problem of model selection. In this work, we have built on this idea to statistically tackle the issue of model selection for the degree distribution of functional connectivity at rest. The analysis, performed at voxel level and in a subject-specific fashion, confirmed the superiority of a truncated power-law model, showing high consistency across subjects. Moreover, the most highly connected voxels were found to be consistently part of the default mode network. Our results provide statistically sound support to the evidence previously presented in literature for a truncated power-law model of resting-state functional connectivity. Copyright © 2010 Elsevier Inc. All rights reserved.
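    The maximum-likelihood-plus-Kolmogorov-Smirnov approach alluded to here (in the style popularized by Clauset, Shalizi, and Newman) can be sketched on synthetic degree-like data; the paper's full non-parametric model-selection procedure is more involved, and the data below are simulated, not brain connectivity measurements.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(7)

# Synthetic "degree" data from a continuous power law with x_min = 1:
# np.random.pareto(a) + 1 has survival function x^(-a) for x >= 1,
# i.e. a power law with exponent alpha = a + 1
alpha_true = 2.5
x = rng.pareto(alpha_true - 1, 5000) + 1.0

# Maximum-likelihood exponent for the continuous power law
x_min = 1.0
alpha_hat = 1.0 + len(x) / np.sum(np.log(x / x_min))

# Kolmogorov-Smirnov distance between the data and the fitted CDF
cdf = lambda v: 1.0 - v ** (1.0 - alpha_hat)
D, p = kstest(x, cdf)
print(f"alpha_hat = {alpha_hat:.2f}, KS distance D = {D:.4f}")
```

A small KS distance says the fitted model is plausible; comparing such distances across candidate models (pure vs truncated power law) is the basis of the model selection the abstract describes.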

  15. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    PubMed

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  16. Assessing the independent contribution of maternal educational expectations to children's educational attainment in early adulthood: a propensity score matching analysis.

    PubMed

    Pingault, Jean Baptiste; Côté, Sylvana M; Petitclerc, Amélie; Vitaro, Frank; Tremblay, Richard E

    2015-01-01

    Parental educational expectations have been associated with children's educational attainment in a number of long-term longitudinal studies, but whether this relationship is causal has long been debated. The aims of this prospective study were twofold: 1) test whether low maternal educational expectations contributed to failure to graduate from high school; and 2) compare the results obtained using different strategies for accounting for confounding variables (i.e. multivariate regression and propensity score matching). The study sample included 1,279 participants from the Quebec Longitudinal Study of Kindergarten Children. Maternal educational expectations were assessed when the participants were aged 12 years. High school graduation—measuring educational attainment—was determined through the Quebec Ministry of Education when the participants were aged 22-23 years. Findings show that when using the most common statistical approach (i.e. multivariate regressions to adjust for a restricted set of potential confounders) the contribution of low maternal educational expectations to failure to graduate from high school was statistically significant. However, when using propensity score matching, the contribution of maternal expectations was reduced and remained statistically significant only for males. The results of this study are consistent with the possibility that the contribution of parental expectations to educational attainment is overestimated in the available literature. This may be explained by the use of a restricted range of potential confounding variables as well as the dearth of studies using appropriate statistical techniques and study designs in order to minimize confounding. 
Each of these techniques and designs, including propensity score matching, has its strengths and limitations: A more comprehensive understanding of the causal role of parental expectations will stem from a convergence of findings from studies using different techniques and designs.
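    The logic of matching, removing the confounded comparison that inflates a naive estimate, can be sketched with nearest-neighbor matching on a single simulated confounder; this is a simplified stand-in for propensity-score matching, and all variables and effect sizes are invented, not the Quebec study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000

# A confounder (think family SES) drives both "treatment" (high maternal
# expectations) and the outcome; the true treatment effect is 1.0
ses = rng.normal(0, 1, n)
treated = rng.random(n) < 1 / (1 + np.exp(-2 * ses))   # high-SES more often treated
outcome = 1.0 * treated + 2.0 * ses + rng.normal(0, 1, n)

naive = outcome[treated].mean() - outcome[~treated].mean()

# 1:1 nearest-neighbor matching on the confounder (with replacement)
t_idx = np.where(treated)[0]
c_idx = np.where(~treated)[0]
matches = c_idx[np.abs(ses[c_idx][None, :] - ses[t_idx][:, None]).argmin(axis=1)]
matched = (outcome[t_idx] - outcome[matches]).mean()

print(f"naive difference:   {naive:.2f}")   # inflated by confounding
print(f"matched difference: {matched:.2f}") # close to the true effect of 1.0
```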

  17. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    PubMed

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed in 7 different scan settings: FBP, and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom, the CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.001). Qualitative analysis of the third study showed that the images reconstructed using ASIR-V had significantly better spatial resolution than those of FBP and ASIR (P < 0.001). Our phantom studies showed that ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.
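    One common definition of CNR, the absolute difference between region and background means divided by the background noise, can be computed as below; the pixel values are synthetic stand-ins, not the study's scanner data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic phantom slice: background around 40 HU, a contrast insert
# around 80 HU, both with additive noise (illustration only)
background = rng.normal(40, 5, (64, 64))
insert_roi = rng.normal(80, 5, (16, 16))

def cnr(roi, bg):
    """Contrast-to-noise ratio: |mean difference| over background noise."""
    return abs(roi.mean() - bg.mean()) / bg.std(ddof=1)

print(f"CNR = {cnr(insert_roi, background):.1f}")
```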

  18. Non-homogeneous updates for the iterative coordinate descent algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Zhou; Thibault, Jean-Baptiste; Bouman, Charles A.; Sauer, Ken D.; Hsieh, Jiang

    2007-02-01

    Statistical reconstruction methods show great promise for improving resolution, and reducing noise and artifacts in helical X-ray CT. In fact, statistical reconstruction seems to be particularly valuable in maintaining reconstructed image quality when the dosage is low and the noise is therefore high. However, high computational cost and long reconstruction times remain as a barrier to the use of statistical reconstruction in practical applications. Among the various iterative methods that have been studied for statistical reconstruction, iterative coordinate descent (ICD) has been found to have relatively low overall computational requirements due to its fast convergence. This paper presents a novel method for further speeding the convergence of the ICD algorithm, and therefore reducing the overall reconstruction time for statistical reconstruction. The method, which we call nonhomogeneous iterative coordinate descent (NH-ICD) uses spatially non-homogeneous updates to speed convergence by focusing computation where it is most needed. Experimental results with real data indicate that the method speeds reconstruction by roughly a factor of two for typical 3D multi-slice geometries.
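    The "focus computation where it is most needed" idea behind NH-ICD can be sketched with greedy coordinate descent on a generic least-squares objective; this is a toy analogue, not the paper's CT reconstruction, and the problem below is a random stand-in.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy least-squares problem standing in for the reconstruction objective
A = rng.normal(0, 1, (60, 8))
b = rng.normal(0, 1, 60)

def icd(A, b, iters, greedy=True):
    """Coordinate descent on ||Ax - b||^2. With greedy=True, each step
    updates the coordinate whose gradient is largest in magnitude, a
    crude analogue of NH-ICD's non-homogeneous update order."""
    x = np.zeros(A.shape[1])
    r = A @ x - b                      # running residual
    col_sq = (A ** 2).sum(axis=0)
    for k in range(iters):
        g = A.T @ r                    # gradient of the objective (up to a factor of 2)
        j = np.argmax(np.abs(g)) if greedy else k % len(x)
        step = g[j] / col_sq[j]        # exact minimizer along coordinate j
        x[j] -= step
        r -= step * A[:, j]            # keep the residual consistent with x
    return x

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
x_greedy = icd(A, b, iters=5000)
print("max |x_greedy - x_lstsq| =", np.abs(x_greedy - x_star).max())
```

Updating the residual incrementally instead of recomputing A @ x each step mirrors why ICD-style updates are cheap per iteration.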

  19. A generalized K statistic for estimating phylogenetic signal from shape and other high-dimensional multivariate data.

    PubMed

    Adams, Dean C

    2014-09-01

    Phylogenetic signal is the tendency for closely related species to display similar trait values due to their common ancestry. Several methods have been developed for quantifying phylogenetic signal in univariate traits and for sets of traits treated simultaneously, and the statistical properties of these approaches have been extensively studied. However, methods for assessing phylogenetic signal in high-dimensional multivariate traits like shape are less well developed, and their statistical performance is not well characterized. In this article, I describe a generalization of the K statistic of Blomberg et al. that is useful for quantifying and evaluating phylogenetic signal in high-dimensional multivariate data. The method (K(mult)) is found from the equivalency between statistical methods based on covariance matrices and those based on distance matrices. Using computer simulations based on Brownian motion, I demonstrate that the expected value of K(mult) remains at 1.0 as trait variation among species is increased or decreased, and as the number of trait dimensions is increased. By contrast, estimates of phylogenetic signal found with a squared-change parsimony procedure for multivariate data change with increasing trait variation among species and with increasing numbers of trait dimensions, confounding biological interpretations. I also evaluate the statistical performance of hypothesis testing procedures based on K(mult) and find that the method displays appropriate Type I error and high statistical power for detecting phylogenetic signal in high-dimensional data. Statistical properties of K(mult) were consistent for simulations using bifurcating and random phylogenies, for simulations using different numbers of species, for simulations that varied the number of trait dimensions, and for different underlying models of trait covariance structure.
Overall, these findings demonstrate that K(mult) provides a useful means of evaluating phylogenetic signal in high-dimensional multivariate traits. Finally, I illustrate the utility of the new approach by evaluating the strength of phylogenetic signal for head shape in a lineage of Plethodon salamanders.
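
The quantity at the heart of the method is a ratio of an ordinary to a phylogenetically corrected mean squared error, scaled by its Brownian-motion expectation. The univariate sketch below computes Blomberg's K, which K(mult) generalizes to multivariate data via the covariance/distance-matrix equivalency; the star-phylogeny check and all names here are illustrative, not taken from the paper:

```python
import numpy as np

def blomberg_k(x, C):
    """Blomberg's K for a univariate trait x given the phylogenetic
    covariance matrix C implied by the tree. K = 1.0 is the expected
    value under Brownian motion."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    Cinv = np.linalg.inv(C)
    ones = np.ones(n)
    # Phylogenetically weighted (GLS) estimate of the root state
    a = (ones @ Cinv @ x) / (ones @ Cinv @ ones)
    r = x - a
    mse0 = (r @ r) / (n - 1)            # ordinary mean squared error
    mse = (r @ Cinv @ r) / (n - 1)      # phylogenetically corrected MSE
    # Expected ratio under Brownian motion on this tree
    expected = (np.trace(C) - n / (ones @ Cinv @ ones)) / (n - 1)
    return (mse0 / mse) / expected

# Sanity check: on a star phylogeny (C = identity) the observed ratio
# equals its expectation for any data, so K is exactly 1.0
K = blomberg_k(np.array([0.3, -1.2, 0.5, 2.0, -0.7]), np.eye(5))
```

For a real tree, C would be built from shared branch lengths; the same ratio then measures whether close relatives resemble each other more (K > 1) or less (K < 1) than Brownian motion predicts.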

  20. Statistical Learning in a Natural Language by 8-Month-Old Infants

    PubMed Central

    Pelucchi, Bruna; Hay, Jessica F.; Saffran, Jenny R.

    2013-01-01

    Numerous studies over the past decade support the claim that infants are equipped with powerful statistical language learning mechanisms. The primary evidence for statistical language learning in word segmentation comes from studies using artificial languages, continuous streams of synthesized syllables that are highly simplified relative to real speech. To what extent can these conclusions be scaled up to natural language learning? In the current experiments, English-learning 8-month-old infants’ ability to track transitional probabilities in fluent infant-directed Italian speech was tested (N = 72). The results suggest that infants are sensitive to transitional probability cues in unfamiliar natural language stimuli, and support the claim that statistical learning is sufficiently robust to support aspects of real-world language acquisition. PMID:19489896
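
The transitional-probability cue tested in these studies is a conditional frequency, TP(B|A) = frequency(AB) / frequency(A): high TP suggests a word-internal syllable pair, low TP a likely word boundary. A minimal sketch over an invented syllable stream (the syllables and counts are made up, not the Italian stimuli):

```python
from collections import Counter

def transitional_probs(syllables):
    """Forward transitional probabilities TP(B|A) = freq(AB) / freq(A)
    over a flat stream of syllables."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

# Toy stream: 'fu' is always followed by 'ga' (TP = 1.0, word-internal),
# while 'ga' is followed by varying syllables (lower TP, likely boundary)
stream = ['fu', 'ga', 'ti', 'fu', 'ga', 'do', 'fu', 'ga', 'ti']
tp = transitional_probs(stream)
```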

  1. Statistical learning in a natural language by 8-month-old infants.

    PubMed

    Pelucchi, Bruna; Hay, Jessica F; Saffran, Jenny R

    2009-01-01

    Numerous studies over the past decade support the claim that infants are equipped with powerful statistical language learning mechanisms. The primary evidence for statistical language learning in word segmentation comes from studies using artificial languages, continuous streams of synthesized syllables that are highly simplified relative to real speech. To what extent can these conclusions be scaled up to natural language learning? In the current experiments, English-learning 8-month-old infants' ability to track transitional probabilities in fluent infant-directed Italian speech was tested (N = 72). The results suggest that infants are sensitive to transitional probability cues in unfamiliar natural language stimuli, and support the claim that statistical learning is sufficiently robust to support aspects of real-world language acquisition.

  2. Use Of Statistical Tools To Evaluate The Reductive Dechlorination Of High Levels Of TCE In Microcosm Studies

    EPA Science Inventory

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study ...

  3. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data

    PubMed Central

    Kim, Sung-Min; Choi, Yosoon

    2017-01-01

    To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required. PMID:28629168

  4. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data.

    PubMed

    Kim, Sung-Min; Choi, Yosoon

    2017-06-18

To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
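
The Getis-Ord Gi* statistic behind the hot spot analysis compares the weighted sum of values around each location (including the location itself) to its expectation under spatial randomness. A sketch using binary distance-band weights, a common choice; the weighting scheme and the toy coordinates below are assumptions, not the paper's exact setup:

```python
import numpy as np

def getis_ord_gi_star(coords, values, radius):
    """Gi* z-scores with binary distance-band weights: w_ij = 1 if
    location j lies within `radius` of location i (self included)."""
    coords = np.asarray(coords, dtype=float)
    x = np.asarray(values, dtype=float)
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)      # global std (population form)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = (d <= radius).astype(float)               # Gi* includes the point itself
    sw = w.sum(axis=1)                            # sum of weights per location
    num = w @ x - xbar * sw
    den = s * np.sqrt((n * (w ** 2).sum(axis=1) - sw ** 2) / (n - 1))
    return num / den

# Toy transect: a cluster of high values near the origin, low values far away
coords = [[0.0], [1.0], [2.0], [10.0], [11.0], [12.0]]
values = [9, 9, 9, 1, 1, 1]
z = getis_ord_gi_star(coords, values, radius=1.5)
```

A large positive z-score marks a statistically significant hot spot (the HH group in the paper's classification); a large negative one marks a cold spot.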

  5. Meta-Analysis of Longitudinal Cohort Studies of Suicide Risk Assessment among Psychiatric Patients: Heterogeneity in Results and Lack of Improvement over Time

    PubMed Central

    Large, Matthew; Kaneson, Muthusamy; Myles, Nicholas; Myles, Hannah; Gunaratne, Pramudie; Ryan, Christopher

    2016-01-01

Objective: It is widely assumed that the clinical care of psychiatric patients can be guided by estimates of suicide risk and by using patient characteristics to define a group of high-risk patients. However, the statistical strength and reliability of suicide risk categorization is unknown. Our objective was to investigate the odds of suicide in high-risk compared to lower-risk categories and the suicide rates in high-risk and lower-risk groups. Method: We located longitudinal cohort studies where psychiatric patients or people who had made suicide attempts were stratified into high-risk and lower-risk groups for suicide with suicide mortality as the outcome, by searching for peer-reviewed publications indexed in PubMed or PsycINFO. Electronic searches were supplemented by hand searching of included studies and relevant review articles. Two authors independently extracted data regarding effect size, study population and study design from 53 samples of risk-assessed patients reported in 37 studies. Results: The pooled odds of suicide among high-risk patients compared to lower-risk patients, calculated by random effects meta-analysis, was 4.84 (95% Confidence Interval (CI) 3.79–6.20). Between-study heterogeneity was very high (I² = 93.3). There was no evidence that more recent studies had greater statistical strength than older studies. Over an average follow-up period of 63 months the proportion of suicides among the high-risk patients was 5.5% and was 0.9% among lower-risk patients. The meta-analytically derived sensitivity and specificity of a high-risk categorization were 56% and 79% respectively. There was evidence of publication bias in favour of studies that inflated the pooled odds of suicide in high-risk patients. Conclusions: The strength of suicide risk categorizations based on the presence of multiple risk factors does not greatly exceed the association between individual suicide risk factors and suicide. A statistically strong and reliable method to usefully distinguish patients with a high risk of suicide remains elusive. PMID:27285387
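
The quantities pooled in the meta-analysis can each be computed from a single study's 2×2 stratification table. A sketch with a hypothetical cohort; the counts below are invented, chosen only so the 5.5% and 0.9% suicide proportions echo the pooled estimates:

```python
def risk_stratification_stats(a, b, c, d):
    """2x2 table for a risk categorization:
        a = high-risk suicides,   b = high-risk survivors,
        c = lower-risk suicides,  d = lower-risk survivors.
    Returns (odds ratio, sensitivity, specificity), the per-study
    quantities a meta-analysis like this one pools."""
    odds_ratio = (a * d) / (b * c)
    sensitivity = a / (a + c)    # P(classified high-risk | suicide)
    specificity = d / (b + d)    # P(classified lower-risk | no suicide)
    return odds_ratio, sensitivity, specificity

# Hypothetical cohort: 5.5% of 1000 high-risk patients and 0.9% of
# 3000 lower-risk patients died by suicide over follow-up
or_, sens, spec = risk_stratification_stats(55, 945, 27, 2973)
```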

  6. Comparison of Vital Statistics Definitions of Suicide against a Coroner Reference Standard: A Population-Based Linkage Study.

    PubMed

    Gatov, Evgenia; Kurdyak, Paul; Sinyor, Mark; Holder, Laura; Schaffer, Ayal

    2018-03-01

We sought to determine the utility of health administrative databases for population-based suicide surveillance, as these data are generally more accessible and more integrated with other data sources compared to coroners' records. In this retrospective validation study, we identified all coroner-confirmed suicides between 2003 and 2012 in Ontario residents aged 21 and over and linked this information to Statistics Canada's vital statistics data set. We examined the overlap between the underlying cause of death field and secondary causes of death using ICD-9 and ICD-10 codes for deliberate self-harm (i.e., suicide) and examined the sociodemographic and clinical characteristics of misclassified records. Among 10,153 linked deaths, there was a very high degree of overlap between records coded as deliberate self-harm in the vital statistics data set and coroner-confirmed suicides using both ICD-9 and ICD-10 definitions (96.88% and 96.84% sensitivity, respectively). This alignment steadily increased throughout the study period (from 95.9% to 98.8%). Other vital statistics diagnoses in primary fields included uncategorised signs and symptoms. Vital statistics records that were misclassified did not differ from valid records in terms of sociodemographic characteristics but were more likely to have had an unspecified place of injury on the death certificate (P < 0.001), more likely to have died at a health care facility (P < 0.001), to have had an autopsy (P = 0.002), and to have been admitted to a psychiatric hospital in the year preceding death (P = 0.03). A high degree of concordance between vital statistics and coroner classification of suicide deaths suggests that health administrative data can reliably be used to identify suicide deaths.

  7. Hispanic Students in American High Schools: Background Characteristics and Achievement. National Center for Education Statistics Bulletin.

    ERIC Educational Resources Information Center

    Peng, Samuel S.

    Based on data from the High School and Beyond Study, a longitudinal study of high school sophomores and seniors, this report summarizes some of the study's findings on the differences between Hispanics and non-Hispanic blacks and whites in school delay, aspirations, test scores, language usage, and socioeconomic status. Tabular data indicate that:…

  8. The European Research Elite: A Cross-National Study of Highly Productive Academics in 11 Countries

    ERIC Educational Resources Information Center

    Kwiek, Marek

    2016-01-01

    In this paper, we focus on a rare scholarly theme of highly productive academics, statistically confirming their pivotal role in knowledge production across 11 systems studied. The upper 10% of highly productive academics in 11 European countries studied (N = 17,211) provide on average almost half of all academic knowledge production. In contrast…

  9. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    NASA Astrophysics Data System (ADS)

    Zhang, Yifei

    2018-03-01

Bayesian Theory originated from an essay by the British mathematician Thomas Bayes published in 1763, and after its development in the 20th century, Bayesian Statistics has come to play a significant role in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian Statistics has been improved and extended, and it can now be used to solve problems that Classical Statistics failed to solve. This paper summarizes Bayesian Statistics' history, concepts and applications in five parts: the history of Bayesian Statistics, the weaknesses of Classical Statistics, Bayesian Theory, its development, and its applications. The first two parts compare Bayesian Statistics and Classical Statistics from a macroscopic perspective, while the last three parts focus on Bayesian Theory specifically, from introducing particular Bayesian concepts to tracing their development and, finally, their applications.
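
As a minimal illustration of the Bayesian mechanics surveyed in the paper, a conjugate beta-binomial update: the prior Beta(α, β) combined with observed successes and failures yields a closed-form posterior. The prior and data below are invented for the example:

```python
def beta_binomial_posterior(alpha, beta, successes, failures):
    """Conjugate update: prior Beta(alpha, beta) plus binomial data
    gives posterior Beta(alpha + successes, beta + failures). The
    posterior mean shrinks the sample proportion toward the prior."""
    a, b = alpha + successes, beta + failures
    return a, b, a / (a + b)

# Uniform prior Beta(1, 1), then observe 7 successes in 10 trials:
# posterior is Beta(8, 4) with mean 8/12
a, b, mean = beta_binomial_posterior(1, 1, 7, 3)
```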

  10. Statistics and bioinformatics in nutritional sciences: analysis of complex data in the era of systems biology

    PubMed Central

    Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao

    2009-01-01

    Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have the advanced methodologies for the analysis of DNA, RNA, protein, low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlighted various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provided guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650

  11. Measures of health sciences journal use: a comparison of vendor, link-resolver, and local citation statistics*

    PubMed Central

    De Groote, Sandra L.; Blecic, Deborah D.; Martin, Kristin

    2013-01-01

Objective: Libraries require efficient and reliable methods to assess journal use. Vendors provide complete counts of articles retrieved from their platforms. However, if a journal is available on multiple platforms, several sets of statistics must be merged. Link-resolver reports merge data from all platforms into one report but only record partial use because users can access library subscriptions from other paths. Citation data are limited to publication use. Vendor, link-resolver, and local citation data were examined to determine correlation. Because link-resolver statistics are easy to obtain, the study library especially wanted to know if they correlate highly with the other measures. Methods: Vendor, link-resolver, and local citation statistics for the study institution were gathered for health sciences journals. Spearman rank-order correlation coefficients were calculated. Results: There was a high positive correlation between all three data sets, with vendor data commonly showing the highest use. However, a small percentage of titles showed anomalous results. Discussion and Conclusions: Link-resolver data correlate well with vendor and citation data, but due to anomalies, low link-resolver data would best be used to suggest titles for further evaluation using vendor data. Citation data may not be needed as they correlate highly with the other measures. PMID:23646026
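
The Spearman rank-order coefficient used in the study is simply the Pearson correlation of the ranks. A dependency-free sketch with invented journal-usage counts (the numbers are illustrative, not the study's data):

```python
def rankdata(xs):
    """Average ranks, 1-based; ties share the mean of their rank positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation computed on the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical vendor download counts vs. link-resolver clicks, 5 journals
vendor = [120, 340, 15, 98, 260]
resolver = [30, 80, 2, 25, 90]
rho = spearman(vendor, resolver)
```

Because it operates on ranks, the coefficient is insensitive to the fact that vendor counts run much higher than link-resolver clicks, which is exactly why it suits comparing the two measures.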

  12. [Corporal Punishment. Three Works:] The Influence of Corporal Punishment on Learning: A Statistical Study. The Bible and the Rod. 1001 Alternatives to Corporal Punishment, Volume One.

    ERIC Educational Resources Information Center

    Maurer, Adah; Wallerstein, James S.

    Arguments against the use of corporal punishment in schools are presented in the three publications collected here. "The Influence of Corporal Punishment on Learning: A Statistical Study," by Adah Maurer and James S. Wallerstein, examines the relationship between rates of corporal punishment use and noncompletion of high school in the 50 states.…

  13. Statistical Approaches to Assess Biosimilarity from Analytical Data.

    PubMed

    Burdick, Richard; Coffey, Todd; Gutka, Hiten; Gratzl, Gyöngyi; Conlon, Hugh D; Huang, Chi-Ting; Boyne, Michael; Kuehne, Henriette

    2017-01-01

    Protein therapeutics have unique critical quality attributes (CQAs) that define their purity, potency, and safety. The analytical methods used to assess CQAs must be able to distinguish clinically meaningful differences in comparator products, and the most important CQAs should be evaluated with the most statistical rigor. High-risk CQA measurements assess the most important attributes that directly impact the clinical mechanism of action or have known implications for safety, while the moderate- to low-risk characteristics may have a lower direct impact and thereby may have a broader range to establish similarity. Statistical equivalence testing is applied for high-risk CQA measurements to establish the degree of similarity (e.g., highly similar fingerprint, highly similar, or similar) of selected attributes. Notably, some high-risk CQAs (e.g., primary sequence or disulfide bonding) are qualitative (e.g., the same as the originator or not the same) and therefore not amenable to equivalence testing. For biosimilars, an important step is the acquisition of a sufficient number of unique originator drug product lots to measure the variability in the originator drug manufacturing process and provide sufficient statistical power for the analytical data comparisons. Together, these analytical evaluations, along with PK/PD and safety data (immunogenicity), provide the data necessary to determine if the totality of the evidence warrants a designation of biosimilarity and subsequent licensure for marketing in the USA. In this paper, a case study approach is used to provide examples of analytical similarity exercises and the appropriateness of statistical approaches for the example data.
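
Equivalence testing for a quantitative high-risk CQA is commonly framed as two one-sided tests (TOST): conclude similarity only if the confidence interval for the biosimilar-minus-originator difference falls entirely inside a pre-specified margin. The sketch below uses a normal approximation and invented potency numbers; the margin, SE, and data are hypothetical, not from the paper's case studies:

```python
from statistics import NormalDist

def tost_equivalent(diff, se, margin, alpha=0.05):
    """Two one-sided tests (TOST) via the confidence-interval inclusion
    rule: equivalence is concluded if the (1 - 2*alpha) CI for the
    mean difference lies within [-margin, +margin]. Normal approximation;
    margins are typically derived from originator-lot variability."""
    z = NormalDist().inv_cdf(1 - alpha)
    lo, hi = diff - z * se, diff + z * se
    return -margin <= lo and hi <= margin

# Hypothetical potency comparison: observed difference 0.4 units,
# standard error 0.5, equivalence margin +/- 1.5 units
ok = tost_equivalent(0.4, 0.5, 1.5)
```

This inclusion-rule formulation is why acquiring many originator lots matters: more lots shrink the standard error and sharpen both the margin estimate and the test's power.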

  14. Factors Contributing to Academic Achievement: A Bayesian Structure Equation Modelling Study

    ERIC Educational Resources Information Center

    Payandeh Najafabadi, Amir T.; Najafabadi, Maryam Omidi; Farid-Rohani, Mohammad Reza

    2013-01-01

In Iran, high school graduates enter university after taking a very difficult entrance exam called the Konkoor. Therefore, only the top-performing students are admitted by universities to continue their bachelor's education in statistics. Surprisingly, most such students statistically fall into the following categories: (1) do not succeed in…

  15. Intrex Subject/Title Inverted-File Characteristics.

    ERIC Educational Resources Information Center

    Uemura, Syunsuke

    The characteristics of the Intrex subject/title inverted file are analyzed. Basic statistics of the inverted file are presented including various distributions of the index words and terms from which the file was derived, and statistics on stems, the file growth process, and redundancy measurements. A study of stems both with extremely high and…

  16. Statistical description of tectonic motions

    NASA Technical Reports Server (NTRS)

    Agnew, Duncan Carr

    1993-01-01

    This report summarizes investigations regarding tectonic motions. The topics discussed include statistics of crustal deformation, Earth rotation studies, using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion, and the development, design, and installation of high-stability geodetic monuments for use with the global positioning system.

  17. A New Global Policy Regime Founded on Invalid Statistics? Hanushek, Woessmann, PISA, and Economic Growth

    ERIC Educational Resources Information Center

    Komatsu, Hikaru; Rappleye, Jeremy

    2017-01-01

    Several recent, highly influential comparative studies have made strong statistical claims that improvements on global learning assessments such as PISA will lead to higher GDP growth rates. These claims have provided the primary source of legitimation for policy reforms championed by leading international organisations, most notably the World…

  18. New Standards Require Teaching More Statistics: Are Preservice Secondary Mathematics Teachers Ready?

    ERIC Educational Resources Information Center

    Lovett, Jennifer N.; Lee, Hollylynne S.

    2017-01-01

    Mathematics teacher education programs often need to respond to changing expectations and standards for K-12 curriculum and accreditation. New standards for high school mathematics in the United States include a strong emphasis in statistics. This article reports results from a mixed methods cross-institutional study examining the preparedness of…

  19. Feature extraction and classification algorithms for high dimensional data

    NASA Technical Reports Server (NTRS)

    Lee, Chulhee; Landgrebe, David

    1993-01-01

    Feature extraction and classification algorithms for high dimensional data are investigated. Developments with regard to sensors for Earth observation are moving in the direction of providing much higher dimensional multispectral imagery than is now possible. In analyzing such high dimensional data, processing time becomes an important factor. With large increases in dimensionality and the number of classes, processing time will increase significantly. To address this problem, a multistage classification scheme is proposed which reduces the processing time substantially by eliminating unlikely classes from further consideration at each stage. Several truncation criteria are developed and the relationship between thresholds and the error caused by the truncation is investigated. Next an approach to feature extraction for classification is proposed based directly on the decision boundaries. It is shown that all the features needed for classification can be extracted from decision boundaries. A characteristic of the proposed method arises by noting that only a portion of the decision boundary is effective in discriminating between classes, and the concept of the effective decision boundary is introduced. The proposed feature extraction algorithm has several desirable properties: it predicts the minimum number of features necessary to achieve the same classification accuracy as in the original space for a given pattern recognition problem; and it finds the necessary feature vectors. The proposed algorithm does not deteriorate under the circumstances of equal means or equal covariances as some previous algorithms do. In addition, the decision boundary feature extraction algorithm can be used both for parametric and non-parametric classifiers. Finally, some problems encountered in analyzing high dimensional data are studied and possible solutions are proposed. 
First, the increased importance of second-order statistics in analyzing high-dimensional data is recognized. By investigating the characteristics of high-dimensional data, an explanation is suggested for why second-order statistics must be taken into account in high-dimensional settings. This in turn creates a need to represent second-order statistics, and a method for visualizing statistics using a color code is proposed. With such color coding, one can easily extract and compare the first- and second-order statistics.
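
The point about second-order statistics can be demonstrated with a toy experiment: two classes with identical means are nonetheless almost perfectly separable from their covariance structure alone, using a Gaussian quadratic rule. The data and rule below are illustrative only, not the report's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two classes with identical (zero) means but different covariances --
# a situation where first-order statistics carry no class information
a = rng.normal(0.0, 1.0, size=(500, 10))   # class A: unit variance
b = rng.normal(0.0, 3.0, size=(500, 10))   # class B: larger variance

def log_lik(x, cov):
    """Gaussian log-likelihood (up to a constant), per row of x,
    using only second-order statistics of each class."""
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (np.einsum('ij,jk,ik->i', x, inv, x) + logdet)

cov_a, cov_b = np.cov(a.T), np.cov(b.T)     # estimated class covariances
x = np.vstack([a, b])
pred_b = log_lik(x, cov_b) > log_lik(x, cov_a)   # classify by covariance alone
accuracy = (pred_b == np.repeat([False, True], 500)).mean()
```

Even though a mean-based classifier would be at chance here, the covariance-based rule separates the classes almost perfectly, which is the motivation for representing and visualizing second-order statistics in high-dimensional analysis.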

  20. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    PubMed Central

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. 
Statistics anxiety led to higher procrastination in the structural equation model and, therefore, contributed indirectly and negatively to performance. Furthermore, it had a direct negative impact on performance (probably via increased tension and worry in the exam). The results of the study speak for shared but also unique components of statistics anxiety and mathematics anxiety. They are also important for instruction and give recommendations to learners as well as to instructors. PMID:28790938

  1. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics.

    PubMed

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. 
Statistics anxiety led to higher procrastination in the structural equation model and, therefore, contributed indirectly and negatively to performance. Furthermore, it had a direct negative impact on performance (probably via increased tension and worry in the exam). The results of the study speak for shared but also unique components of statistics anxiety and mathematics anxiety. They are also important for instruction and give recommendations to learners as well as to instructors.

  2. Towards Direct Simulation of Future Tropical Cyclone Statistics in a High-Resolution Global Atmospheric Model

    DOE PAGES

    Wehner, Michael F.; Bala, G.; Duffy, Phillip; ...

    2010-01-01

We present a set of high-resolution global atmospheric general circulation model (AGCM) simulations focusing on the model's ability to represent tropical storms and their statistics. We find that the model produces storms of hurricane strength with realistic dynamical features. We also find that tropical storm statistics are reasonable, both globally and in the north Atlantic, when compared to recent observations. The sensitivity of simulated tropical storm statistics to increases in sea surface temperature (SST) is also investigated, revealing that a credible late 21st century SST increase produced increases in simulated tropical storm numbers and intensities in all ocean basins. While this paper supports previous high-resolution model and theoretical findings that the frequency of very intense storms will increase in a warmer climate, it differs notably from previous medium- and high-resolution model studies that show a global reduction in total tropical storm frequency. However, we are quick to point out that this particular model finding remains speculative due to a lack of radiative forcing changes in our time-slice experiments as well as a focus on the Northern hemisphere tropical storm seasons.

  3. Food consumption and the actual statistics of cardiovascular diseases: an epidemiological comparison of 42 European countries

    PubMed Central

    Grasgruber, Pavel; Sebera, Martin; Hrazdira, Eduard; Hrebickova, Sylva; Cacek, Jan

    2016-01-01

    Background: The aim of this ecological study was to identify the main nutritional factors related to the prevalence of cardiovascular diseases (CVDs) in Europe, based on a comparison of international statistics. Design: The mean consumption of 62 food items from the FAOSTAT database (1993–2008) was compared with the actual statistics of five CVD indicators in 42 European countries. Several other exogenous factors (health expenditure, smoking, body mass index) and the historical stability of results were also examined. Results: We found exceptionally strong relationships between some of the examined factors, the highest being a correlation between raised cholesterol in men and the combined consumption of animal fat and animal protein (r=0.92, p<0.001). The most significant dietary correlate of low CVD risk was high total fat and animal protein consumption. Additional statistical analyses further highlighted citrus fruits, high-fat dairy (cheese) and tree nuts. Among other non-dietary factors, health expenditure showed by far the highest correlation coefficients. The major correlate of high CVD risk was the proportion of energy from carbohydrates and alcohol, or from potato and cereal carbohydrates. Similar patterns were observed between food consumption and CVD statistics from the period 1980–2000, which shows that these relationships are stable over time. However, we found striking discrepancies in men's CVD statistics from 1980 and 1990, which can probably explain the origin of the ‘saturated fat hypothesis’ that influenced public health policies in the following decades. Conclusion: Our results do not support the association between CVDs and saturated fat, which is still contained in official dietary guidelines. Instead, they agree with data accumulated from recent studies that link CVD risk with the high glycaemic index/load of carbohydrate-based diets. 
In the absence of any scientific evidence connecting saturated fat with CVDs, these findings show that current dietary recommendations regarding CVDs should be seriously reconsidered. PMID:27680091

  4. Differentiation of women with premenstrual dysphoric disorder, recurrent brief depression, and healthy controls by daily mood rating dynamics.

    PubMed

    Pincus, Steven M; Schmidt, Peter J; Palladino-Negro, Paula; Rubinow, David R

    2008-04-01

    Enhanced statistical characterization of mood-rating data holds the potential to more precisely classify and sub-classify recurrent mood disorders like premenstrual dysphoric disorder (PMDD) and recurrent brief depressive disorder (RBD). We applied several complementary statistical methods to differentiate mood rating dynamics among women with PMDD, RBD, and normal controls (NC). We compared three subgroups of women: NC (n=8); PMDD (n=15); and RBD (n=9) on the basis of daily self-ratings of sadness, with study lengths between 50 and 120 days. We analyzed mean levels; overall variability, SD; sequential irregularity, approximate entropy (ApEn); and a quantification of the extent of brief and staccato dynamics, denoted 'Spikiness'. For each of SD, irregularity (ApEn), and Spikiness, we showed highly significant subgroup differences (ANOVA, P<0.001 for each statistic); additionally, many paired subgroup comparisons showed highly significant differences. In contrast, mean levels were indistinct among the subgroups. For SD, normal controls had much smaller levels than the other subgroups, with RBD intermediate. ApEn showed PMDD to be significantly more regular than the other subgroups. Spikiness showed NC and RBD data sets to be much more staccato than their PMDD counterparts, and appears to suitably characterize the defining feature of RBD dynamics. Compound criteria based on these statistical measures discriminated diagnostic subgroups with high sensitivity and specificity. Taken together, the statistical suite provides well-defined specifications of each subgroup. This can facilitate accurate diagnosis, and augment the prediction and evaluation of response to treatment. The statistical methodologies have broad and direct applicability to behavioral studies for many psychiatric disorders, and indeed to similar analyses of associated biological signals across multiple axes.
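The irregularity statistic used in this study, approximate entropy, can be sketched in a few lines. This is a generic textbook implementation (embedding dimension m = 2, tolerance r = 20% of the series SD by convention), not the authors' own code:

```python
import numpy as np

def approximate_entropy(series, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series.

    Lower values indicate more regular, predictable dynamics, as used
    here to contrast PMDD mood ratings with RBD and control ratings.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()  # conventional tolerance: 20% of SD

    def phi(m):
        # all overlapping templates of length m
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # fraction of templates within tolerance r (self-matches included)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# a periodic series is more regular (lower ApEn) than white noise
rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 200))
noisy = rng.normal(size=200)
```

For daily ratings with 50–120 observations per subject, as above, this O(n²) formulation is entirely adequate.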

  5. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gas and particulate matter (PM) emissions in China. Current emission estimates for open biomass burning are generally based on a single source (either statistical data or satellite-derived data) and thus carry large uncertainty due to the limitations of each data type. In this study, to quantify the 2015-based amount of open biomass burning, we established a new estimation method for open biomass burning activity levels by combining bottom-up statistical data and top-down MODIS observations. Three sub-category sources, each using different activity data, were considered. For open crop residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and satellite observations from the MODIS burned area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by the combination of statistical data and the MODIS active fire product MCD14ML. Using fire radiative power (FRP), which is considered a better indicator of active fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated onto a 3 km × 3 km grid to obtain a high-resolution emission inventory. Our results showed that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The high-resolution open biomass burning emission inventory developed in this study could support air quality modeling and policy-making for pollution control.
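The "averaging weighted by uncertainties" step can be illustrated with inverse-variance weighting, the standard way to combine two independent estimates of the same quantity. The abstract does not give the exact weighting scheme, so take this as a generic sketch with made-up numbers standing in for the yearbook and MCD64A1 activity estimates:

```python
import numpy as np

def weighted_best_estimate(values, sigmas):
    """Inverse-variance weighted mean of independent estimates.

    The estimate with the smaller uncertainty receives the larger
    weight; the combined uncertainty is smaller than either input's.
    """
    values = np.asarray(values, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    best = np.sum(w * values) / np.sum(w)
    sigma_best = np.sqrt(1.0 / np.sum(w))
    return best, sigma_best

# hypothetical burned-area estimates: yearbook 100 +/- 10, satellite 140 +/- 30
best, sigma = weighted_best_estimate([100.0, 140.0], [10.0, 30.0])
```

The combined estimate lands much closer to the lower-uncertainty yearbook figure, which is the intended behaviour of uncertainty weighting.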

  6. Equilibrium statistical-thermal models in high-energy physics

    NASA Astrophysics Data System (ADS)

    Tawfik, Abdel Nasser

    2014-05-01

    We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path of constructing statistical-thermal models for heavy-ion physics. We note that Heinz Koppe formulated, as early as 1948, an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach by starting with a general cross-section formula and inserting into it simplifying assumptions about the matrix element of the interaction process that likely reflect many features of the high-energy reactions dominated by density in the phase space of final states. In 1964, Hagedorn systematically analyzed the high-energy phenomena using all tools of statistical physics and introduced the concept of limiting temperature based on the statistical bootstrap model. It turns out that many-particle systems can quite often be studied with the help of statistical-thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for chemical equilibrium in the final state. The strange particles might be an exception, as they are suppressed at lower beam energies; however, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments. We also review their reproduction of the lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. The higher order moments of multiplicity have also been discussed; they offer deep insights into particle production and critical fluctuations. 
Therefore, we use them to describe the freeze-out parameters and to suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into account possible deviations from the ideal hadron gas. We highlight various types of interactions, dissipative properties and location dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases: the quasi-particle model, the linear sigma model with Polyakov potentials, and the compressible bag model.

  7. Body Weight Reducing Effect of Oral Boric Acid Intake

    PubMed Central

    Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut

    2011-01-01

    Background: Boric acid is widely used in biology, but its body-weight-reducing effect has not been researched. Methods: Twenty mice were divided into two equal groups. Control-group mice drank standard tap water, while study-group mice drank tap water containing boric acid (0.28 mg/250 ml) over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Results: Study-group mice lost a mean of 28.1% of their body weight, whereas control-group mice showed no weight loss and instead gained a mean of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase and urobilinogen levels were statistically significantly higher in the study group. Other variables were not statistically different. No histopathologic differences were detected on evaluation of all resected major organs. Conclusion: Low-dose oral boric acid intake causes substantial body weight reduction. The blood and urine analyses suggest high glucose and lipid catabolism and moderate protein catabolism, but the mechanism is unclear. PMID:22135611

  8. Body weight reducing effect of oral boric acid intake.

    PubMed

    Aysan, Erhan; Sahin, Fikrettin; Telci, Dilek; Yalvac, Mehmet Emir; Emre, Sinem Hocaoglu; Karaca, Cetin; Muslumanoglu, Mahmut

    2011-01-01

    Boric acid is widely used in biology, but its body-weight-reducing effect has not been researched. Twenty mice were divided into two equal groups. Control-group mice drank standard tap water, while study-group mice drank tap water containing boric acid (0.28 mg/250 ml) over five days. Total body weight changes, major organ histopathology, blood biochemistry, and urine and feces analyses were compared. Study-group mice lost a mean of 28.1% of their body weight, whereas control-group mice showed no weight loss and instead gained a mean of 0.09% (p<0.001). Total drinking water and urine outputs were not statistically different. Cholesterol, LDL, AST, ALT, LDH, amylase and urobilinogen levels were statistically significantly higher in the study group. Other variables were not statistically different. No histopathologic differences were detected on evaluation of all resected major organs. Low-dose oral boric acid intake causes substantial body weight reduction. The blood and urine analyses suggest high glucose and lipid catabolism and moderate protein catabolism, but the mechanism is unclear.

  9. Correlation spectrometer for filtering of (quasi) elastic neutron scattering with variable resolution

    NASA Astrophysics Data System (ADS)

    Magazù, Salvatore; Mezei, Ferenc; Migliardo, Federica

    2018-05-01

    In a variety of applications of inelastic neutron scattering spectroscopy the goal is to single out the elastic scattering contribution from the total scattered spectrum as a function of momentum transfer and sample environment parameters. The elastic part of the spectrum is defined in such a case by the energy resolution of the spectrometer. Variable elastic energy resolution offers a way to distinguish between elastic and quasi-elastic intensities. Correlation spectroscopy lends itself as an efficient, high-intensity approach for accomplishing this at both continuous and pulsed neutron sources. On the one hand, in beam modulation methods the Liouville-theorem coupling between intensity and resolution is relaxed, and time-of-flight analysis of the neutron velocity distribution can be performed with a 50% duty-factor exposure for all available resolutions. On the other hand, the (quasi)elastic part of the spectrum generally contains the major part of the integrated intensity at a given detector, and thus correlation spectroscopy can be applied with the most favorable signal-to-statistical-noise ratio. The novel spectrometer CORELLI at SNS is an example of this type of application of the correlation technique at a pulsed source. On a continuous neutron source a statistical chopper can be used for quasi-random time-dependent beam modulation, and the total time-of-flight of the neutron from the statistical chopper to detection is determined by analyzing the correlation between the temporal fluctuation of the neutron detection rate and the statistical chopper beam modulation pattern. The correlation analysis can be used to determine either the incoming or the scattered neutron velocity, depending on the position of the statistical chopper along the neutron trajectory. 
These two options are considered together with an evaluation of spectrometer performance compared to conventional spectroscopy, in particular for variable resolution elastic neutron scattering (RENS) studies of relaxation processes and the evolution of mean square displacements. A particular focus of our analysis is the unique feature of correlation spectroscopy of delivering high, resolution-independent beam intensity: the same statistical chopper scan contains both high-intensity and high-resolution information at the same time, and can be evaluated both ways. This flexibility in data handling represents an additional asset for correlation spectroscopy in variable resolution work. Changing the beam width for the same statistical chopper additionally allows resolution to be traded for intensity in two different experimental runs, similarly to conventional single-slit chopper spectroscopy. The combination of these two approaches is a capability of particular value in neutron spectroscopy studies requiring variable energy resolution, such as the systematic study of quasi-elastic scattering and mean square displacements. Furthermore, the statistical chopper approach is particularly advantageous for studying samples with low scattering intensity in the presence of a high, sample-independent background.
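The correlation analysis described above can be illustrated numerically: modulate a synthetic time-of-flight spectrum with a pseudo-random chopper pattern, then recover the spectrum by circular cross-correlation of the detector rate with the mean-subtracted pattern. This is a toy sketch of the principle, not a model of CORELLI or of any real chopper sequence:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1024
chopper = rng.integers(0, 2, N).astype(float)           # pseudo-random open/closed pattern
tof = np.exp(-0.5 * ((np.arange(N) - 200) / 2.0) ** 2)  # true time-of-flight spectrum

# detector count rate = circular convolution of the spectrum with the pattern
detector = np.real(np.fft.ifft(np.fft.fft(chopper) * np.fft.fft(tof)))

# correlate the detector rate with the centered modulation pattern; for a
# quasi-random pattern the autocorrelation is nearly a delta function, so the
# cross-correlation restores the spectrum up to scale and a flat background
c0 = chopper - chopper.mean()
recovered = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(c0))))

peak = int(np.argmax(recovered))  # flight-time peak position is recovered
```

Real statistical choppers use sequences with sharper autocorrelation properties than plain random bits, which further suppresses the correlation sidelobes.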

  10. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    NASA Astrophysics Data System (ADS)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users about the expected amplitude of prediction errors attached to these methods. We show that, because the distributions of model errors are neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate the use of more informative statistics based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of error one can expect at a chosen high confidence level. These statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error of all benchmarking statistics depends on the size of the reference dataset, and systematic publication of these standard errors would be very helpful for assessing the statistical reliability of benchmarking conclusions.
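Both advocated statistics have direct empirical estimators from the sorted absolute errors. A minimal sketch, with made-up benchmark errors since no dataset is reproduced here:

```python
import numpy as np

def p_below(abs_errors, threshold):
    """P(|error| <= threshold): empirical probability that a new
    calculation has an absolute error below the chosen threshold."""
    e = np.sort(np.abs(abs_errors))
    return np.searchsorted(e, threshold, side="right") / len(e)

def q_max(abs_errors, confidence=0.95):
    """Error amplitude not exceeded at the chosen confidence level
    (an empirical quantile of the unsigned errors)."""
    return np.quantile(np.abs(abs_errors), confidence)

# hypothetical benchmark errors, neither normal nor zero-centered
errors = np.array([0.2, -1.5, 3.1, 0.7, -0.3, 2.4, -0.9, 1.1, 0.5, -2.2])
```

For example, `p_below(errors, 1.0)` is the fraction of reference calculations within the threshold, and `q_max(errors, 0.95)` is the 95th percentile of the unsigned errors; with a real benchmark both should be reported with their dataset-size-dependent standard errors.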

  11. Properties of different selection signature statistics and a new strategy for combining them.

    PubMed

    Ma, Y; Ding, X; Qanbari, S; Weigend, S; Zhang, Q; Simianer, H

    2015-11-01

    Identifying signatures of recent or ongoing selection is of high relevance in livestock population genomics. From a statistical perspective, determining a proper testing procedure and combining various test statistics is challenging. On the basis of extensive simulations in this study, we discuss the statistical properties of eight different established selection signature statistics. In the considered scenario, we show that a reasonable power to detect selection signatures is achieved with high marker density (>1 SNP/kb) as obtained from sequencing, while rather small sample sizes (~15 diploid individuals) appear to be sufficient. Most selection signature statistics, such as the composite likelihood ratio and cross-population extended haplotype homozygosity, have the highest power when fixation of the selected allele is reached, while the integrated haplotype score has the highest power when selection is ongoing. We suggest a novel strategy, called de-correlated composite of multiple signals (DCMS), to combine different statistics for detecting selection signatures while accounting for the correlation between the different selection signature statistics. When examined with simulated data, DCMS consistently has a higher power than most of the single statistics and shows a reliable positional resolution. We illustrate the new statistic on the established selective sweep around the lactase gene in human HapMap data, providing further evidence of its reliability. We then apply it to scan for selection signatures in two chicken samples with diverse skin color. Our analysis suggests that a set of well-known genes such as BCO2, MC1R, ASIP and TYR were involved in the divergent selection for this trait.
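The DCMS idea, combining per-locus statistics while down-weighting redundant, inter-correlated ones, can be sketched as follows. This mirrors the published approach in outline only: each statistic is converted to an empirical −log10 p-value and the columns are summed with weights inversely proportional to their summed absolute correlations; the exact transformation and weighting in Ma et al. may differ in detail:

```python
import numpy as np

def dcms(stats_matrix):
    """De-correlated composite of multiple signals (sketch).

    stats_matrix: (n_loci, n_statistics) array, larger value = stronger
    selection signal. Each column becomes an empirical -log10 p-value
    (fractional rank); columns are then summed with weights
    1 / sum_j |r_ij|, so highly correlated statistics share, rather
    than multiply, their influence on the composite score.
    """
    x = np.asarray(stats_matrix, dtype=float)
    n, k = x.shape
    # one-sided empirical p-values from fractional ranks
    ranks = np.argsort(np.argsort(-x, axis=0), axis=0) + 1
    logp = -np.log10(ranks / (n + 1.0))
    r = np.abs(np.corrcoef(x, rowvar=False))  # |correlation| between statistics
    w = 1.0 / r.sum(axis=1)                   # de-correlation weights
    return logp @ w

# demo: two perfectly correlated statistics plus one independent one;
# locus 0 is made the top signal in every statistic
rng = np.random.default_rng(5)
base = rng.normal(size=100)
x = np.column_stack([base, base, rng.normal(size=100)])
x[0] = 10.0
score = dcms(x)
```

In the demo, the duplicated pair contributes roughly as much jointly as the independent statistic does alone, which is the point of the de-correlation weights.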

  12. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  13. Weighted Statistical Binning: Enabling Statistically Consistent Genome-Scale Phylogenetic Analyses

    PubMed Central

    Bayzid, Md Shamsuzzoha; Mirarab, Siavash; Boussau, Bastien; Warnow, Tandy

    2015-01-01

    Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. 
Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent under the multi-species coalescent model. New data used in this study are available at DOI: http://dx.doi.org/10.6084/m9.figshare.1411146, and the software is available at https://github.com/smirarab/binning. PMID:26086579

  14. High School Enrollments in Latin, 1964-65.

    ERIC Educational Resources Information Center

    Goldberg, Samuel A.

    A Modern Language Association (MLA) statistical survey shows the number of students studying French, Spanish, German, or Latin in the secondary schools during each school year from 1958-59 to 1964-65, the percentage studying each language in relation to the total high school population, and the percentage studying Latin in relation to the total…

  15. High Dimensional Classification Using Features Annealed Independence Rules.

    PubMed

    Fan, Jianqing; Fan, Yingying

    2008-01-01

    Classification using high-dimensional features arises frequently in many contemporary statistical studies such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classification is poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra, and they propose the independence rule to overcome the problem. We first demonstrate that, even for the independence classification rule, classification using all the features can be as bad as random guessing due to noise accumulation in estimating population centroids in high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as badly as random guessing. Thus, it is paramount to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). The conditions under which all the important features can be selected by the two-sample t-statistic are established. The choice of the optimal number of features, or equivalently, the threshold value of the test statistics, is proposed based on an upper bound of the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of our new classification procedure.
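The screening step of this approach, ranking features by their two-sample t-statistics, is easy to sketch. The annealing/threshold rule derived from the classification-error bound is omitted here, and Welch-style standard errors are assumed; the data are synthetic:

```python
import numpy as np

def two_sample_t(X, y):
    """Per-feature two-sample t-statistics for a binary label y.

    FAIR-style screening ranks features by |t| and keeps only the top
    ones, discarding the noise features whose accumulated errors would
    otherwise swamp the classifier in high dimensions.
    """
    X = np.asarray(X, dtype=float)
    a, b = X[y == 0], X[y == 1]
    n0, n1 = len(a), len(b)
    se = np.sqrt(a.var(axis=0, ddof=1) / n0 + b.var(axis=0, ddof=1) / n1)
    return (a.mean(axis=0) - b.mean(axis=0)) / se

rng = np.random.default_rng(2)
n, p = 80, 500
X = rng.normal(size=(n, p))
y = np.repeat([0, 1], n // 2)
X[y == 1, :5] += 2.0          # only the first 5 of 500 features carry signal
t = two_sample_t(X, y)
top5 = set(np.argsort(-np.abs(t))[:5])
```

With a mean shift of two standard deviations, the five informative features dominate the |t| ranking despite the 495 noise features.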

  16. The effects of modeling instruction on high school physics academic achievement

    NASA Astrophysics Data System (ADS)

    Wright, Tiffanie L.

    The purpose of this study was to explore whether Modeling Instruction, compared to traditional lecturing, is an effective instructional method for promoting academic achievement in selected high school physics classes at a rural middle Tennessee high school. This study used an ex post facto, quasi-experimental research methodology. The independent variable was the instructional method of teaching: the treatment condition was Modeling Instruction and the control condition was traditional lecture instruction. The Treatment Group consisted of participants in Physical World Concepts who received Modeling Instruction. The Control Group consisted of participants in Physical Science who received traditional lecture instruction. The dependent variable was gain scores on the Force Concepts Inventory (FCI). The participants were 133 students each in the Treatment and Control Groups (n = 266), who attended a public high school in rural middle Tennessee. The participants were administered the FCI prior to being taught the mechanics of physics. The FCI data were entered into the computer-based Statistical Package for the Social Sciences (SPSS). Two independent samples t-tests were conducted to answer the research questions. There was a statistically significant difference between the treatment and control groups with respect to instructional method: Modeling Instruction was found to be effective in increasing the academic achievement of students in high school physics. There was no statistically significant difference in FCI gain scores by gender; gender was found to have no effect on the academic achievement of students in high school physics classes. However, even though the difference was not statistically significant, female students' gain scores were higher than male students' when Modeling Instruction was used. 
Based on these findings, it is recommended that high school science teachers use Modeling Instruction daily in their classrooms. A recommendation for further research is to extend Modeling Instruction into different content areas (e.g., reading and language arts) to explore academic achievement gains.
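The study's core analysis is an independent-samples t-test on FCI gain scores. The raw scores are not reproduced in the abstract, so the sketch below uses hypothetical gain data purely to show the shape of the analysis (here via scipy rather than SPSS):

```python
import numpy as np
from scipy import stats

# hypothetical FCI gain scores for 133 students per group
rng = np.random.default_rng(3)
modeling = rng.normal(loc=6.0, scale=3.0, size=133)   # Modeling Instruction group
lecture = rng.normal(loc=3.0, scale=3.0, size=133)    # traditional lecture group

# independent samples t-test on the gain scores
t_stat, p_value = stats.ttest_ind(modeling, lecture)
significant = p_value < 0.05
```

A positive t statistic with p below the chosen alpha corresponds to the reported finding that the Modeling Instruction group achieved higher gains; the gender comparison is the same test run on gain scores split by gender.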

  17. Human movement stochastic variability leads to diagnostic biomarkers In Autism Spectrum Disorders (ASD)

    NASA Astrophysics Data System (ADS)

    Wu, Di; Torres, Elizabeth B.; Jose, Jorge V.

    2015-03-01

    ASD is a spectrum of neurodevelopmental disorders. The high heterogeneity of the symptoms associated with the disorder impedes efficient diagnosis based on human observation. Recent advances in high-resolution MEMS wearable sensors enable accurate movement measurements that may escape the naked eye, and call for objective metrics to extract physiologically relevant information from the rapidly accumulating data. In this talk we discuss the statistical analysis of movement data continuously collected with high-resolution sensors at 240 Hz. We calculated statistical properties of speed fluctuations within the millisecond time range that closely correlate with the subjects' cognitive abilities. We computed the periodicity and synchronicity of the speed fluctuations from their power spectrum and ensemble-averaged two-point cross-correlation function. We built a two-parameter phase space from the temporal statistical analyses of the nearest-neighbor fluctuations that provided a quantitative biomarker separating ASD from normal adult subjects and further classified ASD severity. We also found age-related developmental statistical signatures, and potential parental links to ASD, in our movement dynamics studies. Our results may have direct clinical applications.

  18. External validation of the Probability of repeated admission (Pra) risk prediction tool in older community-dwelling people attending general practice: a prospective cohort study.

    PubMed

    Wallace, Emma; McDowell, Ronald; Bennett, Kathleen; Fahey, Tom; Smith, Susan M

    2016-11-14

    Emergency admission is associated with the potential for adverse events in older people, and risk prediction models are available to identify those at highest risk of admission. The aim of this study was to externally validate and compare the performance of the Probability of repeated admission (Pra) risk model and a modified version (incorporating a multimorbidity measure) in predicting emergency admission in older community-dwelling people. Setting: 15 general practices (GPs) in the Republic of Ireland. Participants: n=862 community-dwelling people aged ≥70 years, prospectively followed up for 2 years (2010-2012). Risk models: the Pra risk model (original and modified) was calculated for the baseline year, where a score ≥0.5 denoted high risk of future emergency admission (patient questionnaire, GP medical record review). Outcome: emergency admission over 1 year (GP medical record review). Analyses: descriptive statistics, model discrimination (c-statistic) and calibration (Hosmer-Lemeshow statistic). Of 862 patients, 154 (18%) had one or more emergency admissions in the follow-up year. 63 patients (7%) were classified as high risk by the original Pra, and of these 26 (41%) were admitted. The modified Pra classified 391 (45%) patients as high risk, of whom 103 (26%) were subsequently admitted. Both models demonstrated only poor discrimination (original Pra: c-statistic 0.65 (95% CI 0.61 to 0.70); modified Pra: c-statistic 0.67 (95% CI 0.62 to 0.72)). When categorised by risk, specificity was highest for the original Pra at the cut-point of ≥0.5 denoting high risk (95%), and for the modified Pra at a cut-point of ≥0.7 (95%). Both models overestimated the number of admissions across all risk strata. While the original Pra model demonstrated poor discrimination, model specificity was high and only a small number of patients were identified as high risk. Future validation studies should examine higher cut-points denoting high risk for the modified Pra, which has practical advantages in terms of application in general practice. 
The original Pra tool may have a role in identifying higher-risk community-dwelling older people for inclusion in future trials aiming to reduce emergency admissions.
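The discrimination measure reported above, the c-statistic, has a simple probabilistic reading: the chance that a randomly chosen admitted patient received a higher risk score than a randomly chosen non-admitted one (ties count one half). It equals the area under the ROC curve; 0.5 is chance, and values in the mid-0.6s, as found for both Pra models, indicate poor discrimination. A minimal sketch:

```python
import numpy as np

def c_statistic(risk_scores, outcomes):
    """Concordance (c-) statistic for a binary outcome.

    Computed here by direct pairwise comparison of every admitted
    (outcome 1) vs. non-admitted (outcome 0) score pair; fine for
    cohort-sized data, though O(n_pos * n_neg) in memory.
    """
    s = np.asarray(risk_scores, dtype=float)
    y = np.asarray(outcomes)
    pos, neg = s[y == 1], s[y == 0]
    diff = pos[:, None] - neg[None, :]
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size
```

Perfectly separated scores give 1.0, identical scores give 0.5, and perfectly inverted scores give 0.0, which bounds how to read the reported 0.65–0.67.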

  19. Topographic relationships for design rainfalls over Australia

    NASA Astrophysics Data System (ADS)

    Johnson, F.; Hutchinson, M. F.; The, C.; Beesley, C.; Green, J.

    2016-02-01

    Design rainfall statistics are the primary inputs used to assess flood risk across river catchments. These statistics normally take the form of Intensity-Duration-Frequency (IDF) curves that are derived from extreme value probability distributions fitted to observed daily and sub-daily rainfall data. The design rainfall relationships are often required for catchments with limited rainfall records, particularly remote catchments with high topographic relief; hence some form of interpolation is required to provide estimates in these areas. This paper assesses the topographic dependence of rainfall extremes by using elevation-dependent thin plate smoothing splines to interpolate the mean annual maximum rainfall, for periods from one to seven days, across Australia. The analyses confirm the important role of topography in explaining the spatial patterns of these extreme rainfall statistics. Continent-wide residual and cross-validation statistics are used to demonstrate that elevation has roughly 100 times the impact of the horizontal coordinates in explaining these spatial patterns, consistent with previous rainfall scaling studies and observational evidence. The impact of the complexity of the fitted spline surfaces, as defined by the number of knots, and the impact of applying variance-stabilising transformations to the data were also assessed. It was found that a relatively large number of knots (3570, suitably chosen from 8619 gauge locations) was required to minimise the summary error statistics. Square root and log data transformations delivered marginally superior continent-wide cross-validation statistics compared with applying no transformation, but detailed assessments of residuals in complex high-rainfall regions with high topographic relief showed that no data transformation gave superior performance in these regions. 
These results are consistent with the understanding that in areas with modest topographic relief, as for most of the Australian continent, extreme rainfall is closely aligned with elevation, but in areas with high topographic relief the impacts of topography on rainfall extremes are more complex. The interpolated extreme rainfall statistics, using no data transformation, have been used by the Australian Bureau of Meteorology to produce new IDF data for the Australian continent. The comprehensive methods presented for evaluating gridded design rainfall statistics will be useful for similar studies, in particular in balancing the need for a continentally optimal solution against maintaining sufficient definition at the local scale.
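    The elevation-dependent interpolation idea in this abstract can be illustrated with a minimal sketch. The study fits elevation-dependent thin plate smoothing splines; the toy code below is NOT that method, but a much simpler inverse-distance-weighted interpolator that captures only the key idea of treating elevation as a third coordinate with roughly 100 times the weight of the horizontal coordinates. All gauge coordinates and rainfall values are invented for illustration.

```python
import math

# Illustrative stand-in for elevation-dependent spline interpolation:
# inverse-distance weighting in (x, y, scaled elevation) space.
ELEV_SCALE = 100.0  # elevation treated as ~100x more influential than x, y

def interpolate(gauges, x, y, z, power=2):
    """IDW estimate of an extreme-rainfall statistic at location (x, y, z)."""
    num = den = 0.0
    for gx, gy, gz, value in gauges:
        d = math.sqrt((x - gx)**2 + (y - gy)**2 + (ELEV_SCALE * (z - gz))**2)
        if d == 0:
            return value  # exactly at a gauge location
        w = 1.0 / d**power
        num += w * value
        den += w
    return num / den

gauges = [
    # (x_km, y_km, elevation_km, mean annual max 1-day rainfall in mm) -- invented
    (0.0, 0.0, 0.1, 60.0),
    (10.0, 0.0, 1.2, 140.0),
    (0.0, 10.0, 0.3, 75.0),
]

print(interpolate(gauges, 0.0, 0.0, 0.1))   # at a gauge -> returns its value, 60.0
print(interpolate(gauges, 5.0, 0.0, 0.6))   # between gauges -> a weighted average
```

    Because the elevation difference is scaled by 100, a gauge 1 km higher counts as if it were 100 km away horizontally, which is the intuition behind the "100-fold impact of elevation" finding.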

  20. Statistical Capability Study of a Helical Grinding Machine Producing Screw Rotors

    NASA Astrophysics Data System (ADS)

    Holmes, C. S.; Headley, M.; Hart, P. W.

    2017-08-01

    Screw compressors depend for their efficiency and reliability on the accuracy of the rotors, and therefore on the machinery used in their production. The machinery has evolved over more than half a century in response to customer demands for production accuracy, efficiency, and flexibility, and is now at a high level on all three criteria. Production equipment and processes must be capable of maintaining accuracy over a production run, and this must be assessed statistically under strictly controlled conditions. This paper gives numerical data from such a study of an innovative machine tool and shows that it is possible to meet the demanding statistical capability requirements.

  1. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  2. Biosignature Discovery for Substance Use Disorders Using Statistical Learning.

    PubMed

    Baurley, James W; McMahan, Christopher S; Ervin, Carolyn M; Pardamean, Bens; Bergen, Andrew W

    2018-02-01

    There are limited biomarkers for substance use disorders (SUDs). Traditional statistical approaches have identified simple biomarkers in large samples, but clinical use cases are still being established. High-throughput clinical, imaging, and 'omic' technologies are generating data from SUD studies and may lead to more sophisticated and clinically useful models. However, analytic strategies suited to high-dimensional data are not regularly used. We review strategies for identifying biomarkers and biosignatures from high-dimensional data types. Focusing on penalized regression and Bayesian approaches, we address how to leverage evidence from existing studies and knowledge bases, using nicotine metabolism as an example. We posit that big data and machine learning approaches will considerably advance SUD biomarker discovery; however, translation to clinical practice will require integrated scientific efforts.

  3. Concussion Education for High School Football Players: A Pilot Study

    ERIC Educational Resources Information Center

    Manasse-Cohick, Nancy J.; Shapley, Kathy L.

    2014-01-01

    This survey study compared high school football players' knowledge and attitudes about concussion before and after receiving concussion education. There were no significant changes in the Concussion Attitude Index. Results revealed a statistically significant difference in the athletes' scores for the Concussion Knowledge Index, "t"(244)…

  4. Curbing-The Metallic Mode In-between: An empirical study qualifying and categorizing restrained sounds known as Curbing based on audio perception, laryngostroboscopic imaging, acoustics, LTAS, and EGG.

    PubMed

    Thuesen, Mathias Aaen; McGlashan, Julian; Sadolin, Cathrine

    2017-09-01

    This study investigates the categorization 'Curbing' from the pedagogical method Complete Vocal Technique as a reduced metallic mode, compared with the full metallic modes Overdrive and Edge, by means of audio perception, laryngostroboscopic imaging, acoustics, long-term average spectrum (LTAS), and electroglottography (EGG). Twenty singers were recorded singing sustained vowels in the restrained character known as Curbing. Two studies were performed: (1) laryngostroboscopic examination using a videonasoendoscopic camera system and the Laryngostrobe program; and (2) simultaneous recording of EGG and acoustic signals using Speech Studio. Images were analyzed based on consensus agreement. Statistical analysis of acoustic, LTAS, and EGG parameters was undertaken using Student paired t tests. The reduced metallic singing mode Curbing has an identifiable laryngeal gesture. Curbing has a more open setting than Overdrive and Edge, with high visibility of the vocal folds and the false folds giving a rectangular appearance. LTAS showed statistically significant differences between Curbing and the full metallic modes, with less energy across all spectra, yielding a high second and a low third harmonic. Statistically significant differences were identified in Max Qx, Average Qx, Shimmer+, Shimmer-, Shimmer dB, normalized noise energy, cepstral peak prominence, harmonics-to-noise ratio, and mean sound pressure level (P ≤ 0.05). Curbing as a voice production strategy is statistically significantly different from Overdrive and Edge and can be categorized based on audio perception. This study demonstrates consistently different laryngeal gestures between Curbing and both Overdrive and Edge, with corresponding differences in LTAS, EGG, and acoustic measures.

  5. Assessment of oral health parameters among students attending special schools of Mangalore city.

    PubMed

    Peter, Tom; Cherian, Deepthi Anna; Peter, Tim

    2017-01-01

    The aim of the study was to assess oral health status and treatment needs, and the correlation between dental caries susceptibility and salivary pH, buffering capacity, and total antioxidant capacity, among students attending special schools of Mangalore city. In this study 361 subjects in the age range of 12-18 years were divided into normal (n = 84), physically challenged (n = 68), and mentally challenged (n = 209) groups. Their oral health status and treatment needs were recorded using the modified WHO oral health assessment proforma. Saliva was collected to estimate the salivary parameters. Statistical analysis was done using the Statistical Package for the Social Sciences (SPSS) version 17 (Chicago). On examining the dentition status of the study subjects, the mean number of decayed teeth was 1.57 for the normal, 2.54 for the physically challenged, and 4.41 for the mentally challenged study subjects. These results were highly statistically significant (P < 0.001). The treatment needs of the study subjects revealed that the mean numbers of teeth requiring pulp care and restoration were 1 for the normal, 0.12 for the physically challenged, and 1.21 for the mentally challenged study subjects. These results were highly statistically significant (P < 0.001). The mean salivary pH and buffering capacity were lowest among the mentally challenged subjects. The physically challenged group had the lowest mean total antioxidant capacity among the study subjects. Among the study subjects, normal students had the highest mean salivary pH, buffering capacity, and total antioxidant capacity. These results were highly statistically significant (P < 0.001). The better dentition status of the normal compared with the physically and mentally challenged study subjects could be due to their improved quality of oral health practices. 
The difference in treatment needs could be due to the higher prevalence of untreated dental caries and the neglected oral health care among the mentally challenged study subjects. The salivary pH and buffering capacity were comparatively lower among the physically and mentally challenged study subjects, which could contribute to their increased caries experience compared with the normal study subjects. However, further studies are needed to establish a more conclusive result on the total antioxidant capacity of saliva and dental caries.

  6. Statistical learning and language acquisition

    PubMed Central

    Romberg, Alexa R.; Saffran, Jenny R.

    2011-01-01

    Human learners, including infants, are highly sensitive to structure in their environment. Statistical learning refers to the process of extracting this structure. A major question in language acquisition in the past few decades has been the extent to which infants use statistical learning mechanisms to acquire their native language. There have been many demonstrations showing infants’ ability to extract structures in linguistic input, such as the transitional probability between adjacent elements. This paper reviews current research on how statistical learning contributes to language acquisition. Current research is extending the initial findings of infants’ sensitivity to basic statistical information in many different directions, including investigating how infants represent regularities, learn about different levels of language, and integrate information across situations. These current directions emphasize studying statistical language learning in context: within language, within the infant learner, and within the environment as a whole. PMID:21666883
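    The transitional probabilities mentioned in this abstract are simple to compute: the probability of syllable B following syllable A is the count of the pair AB divided by the count of A. The sketch below uses an invented syllable stream in the style of the classic segmentation experiments (concatenated nonsense "words"); within-word transitions come out higher than across-word transitions.

```python
from collections import Counter

def transitional_probs(syllables):
    """P(next | current) for each adjacent syllable pair in the stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

# Invented stream: three nonsense "words" (bi-da-ku, pa-do-ti, go-la-bu) concatenated
stream = "bi da ku pa do ti go la bu bi da ku go la bu pa do ti bi da ku".split()
tps = transitional_probs(stream)

print(tps[("bi", "da")])  # within-word transition: 1.0
print(tps[("ku", "pa")])  # across-word transition: lower (0.5 here)
```

    An infant (or any statistical learner) tracking these probabilities could posit word boundaries wherever the transitional probability dips.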

  7. A Selective Overview of Variable Selection in High Dimensional Feature Space

    PubMed Central

    Fan, Jianqing

    2010-01-01

    High dimensional statistical problems arise from diverse fields of scientific research and technological development. Variable selection plays a pivotal role in contemporary statistical learning and scientific discoveries. The traditional idea of best subset selection methods, which can be regarded as a specific form of penalized likelihood, is computationally too expensive for many modern statistical applications. Other forms of penalized likelihood methods have been successfully developed over the last decade to cope with high dimensionality. They have been widely applied for simultaneously selecting important variables and estimating their effects in high dimensional statistical inference. In this article, we present a brief account of the recent developments of theory, methods, and implementations for high dimensional variable selection. Questions of what limits of dimensionality such methods can handle, what role penalty functions play, and what their statistical properties are rapidly drive the advances of the field. The properties of non-concave penalized likelihood and its roles in high dimensional statistical modeling are emphasized. We also review some recent advances in ultra-high dimensional variable selection, with emphasis on independence screening and two-scale methods. PMID:21572976

  8. Does objective cluster analysis serve as a useful precursor to seasonal precipitation prediction at local scale? Application to western Ethiopia

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Moges, Semu; Block, Paul

    2018-01-01

    Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches makes clear advances in prediction skill and resolution, as compared with previous studies. The statistical model improves on the non-clustered case and on dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters having regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33%, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to guarantee comparable skill.

  9. 'Dignity therapy', a promising intervention in palliative care: A comprehensive systematic literature review.

    PubMed

    Martínez, Marina; Arantzamendi, María; Belar, Alazne; Carrasco, José Miguel; Carvajal, Ana; Rullán, María; Centeno, Carlos

    2017-06-01

    Dignity therapy is a psychotherapy intended to relieve psychological and existential distress in patients at the end of life, but little is known about its effects. The aim was to analyse the outcomes of dignity therapy in patients with advanced life-threatening diseases. A systematic review was conducted. Three authors extracted data from the articles and evaluated quality using the Critical Appraisal Skills Programme. Data were synthesized, considering study objectives. PubMed, CINAHL, the Cochrane Library, and PsycINFO were searched from 2002 (the year dignity therapy was developed) to January 2016, using 'dignity therapy' as the search term. Studies with patients with advanced life-threatening diseases were included. Of 121 studies, 28 were included, and the quality of the included studies was high. Results were grouped into effectiveness; satisfaction; suitability and feasibility; and adaptability to different diseases and cultures. Two of five randomized controlled trials applied dignity therapy to patients with high levels of baseline psychological distress. One showed a statistically significant decrease in patients' anxiety and depression scores over time; the other showed a statistically significant pre-post decrease in anxiety scores, but not in depression. Nonrandomized studies suggested statistically significant improvements in existential and psychosocial measurements. Patients, relatives, and professionals perceived that it improved the end-of-life experience. The evidence suggests that dignity therapy is beneficial: one randomized controlled trial in patients with high levels of psychological distress shows efficacy on anxiety and depression scores, and studies with other designs report beneficial outcomes in terms of the end-of-life experience. Further research should examine how dignity therapy functions, establish a means for measuring its impact, and assess whether patients with high levels of distress benefit most from this therapy.

  10. Robustness of S1 statistic with Hodges-Lehmann for skewed distributions

    NASA Astrophysics Data System (ADS)

    Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping

    2016-10-01

    Analysis of variance (ANOVA) is a commonly used parametric method for testing differences in means among more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look to alternatives such as the nonparametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the Hodges-Lehmann estimator for the median, and the variance of the Hodges-Lehmann estimator or MADn for the default scale estimator, to produce two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA, and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
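    The two building blocks of this abstract, the Hodges-Lehmann location estimator and a bootstrap test, can be sketched briefly. This is a simplified two-group stand-in for the paper's multi-group S1 procedure, with invented data; the Hodges-Lehmann estimator is the median of all pairwise (Walsh) averages, which makes it robust to outliers and skew.

```python
import random
from itertools import combinations_with_replacement
from statistics import median

def hodges_lehmann(x):
    """Median of all pairwise (Walsh) averages -- a robust location estimator."""
    return median((a + b) / 2 for a, b in combinations_with_replacement(x, 2))

def bootstrap_test(g1, g2, n_boot=2000, seed=0):
    """Percentile-bootstrap p-value for a difference in Hodges-Lehmann
    locations between two groups (a simplified sketch, not the S1 test)."""
    rng = random.Random(seed)
    observed = hodges_lehmann(g1) - hodges_lehmann(g2)
    pooled = g1 + g2  # resample under the null of a common distribution
    count = 0
    for _ in range(n_boot):
        b1 = [rng.choice(pooled) for _ in g1]
        b2 = [rng.choice(pooled) for _ in g2]
        if abs(hodges_lehmann(b1) - hodges_lehmann(b2)) >= abs(observed):
            count += 1
    return count / n_boot

print(hodges_lehmann([1, 2, 3, 100]))  # 2.75 -- the mean would be 26.5
```

    The bootstrap is needed precisely because, as the abstract notes, the sampling distribution of such modified statistics is unknown.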

  11. Hospital graduate social work field work programs: a study in New York City.

    PubMed

    Showers, N

    1990-02-01

    Twenty-seven hospital field work programs in New York City were studied. Questionnaires were administered to program coordinators and to 238 graduate social work students participating in the study programs. High degrees of program structural complexity and variation were found, indicating a state of the art well beyond that described in the general field work literature. The high rates of student satisfaction with learning, field instructors, programs, and the overall field work experience suggest that the complexity of the study programs may be more effective than traditional field work models. Statistically nonsignificant findings indicate areas in which hospital social work departments may develop field work programs consistent with shifting organizational needs without undue risk to educational effectiveness. Statistically significant findings suggest areas in which inflexibility in program design may be more beneficial in the diagnosis-related groups era.

  12. A Systematic Review and Meta-Regression Analysis of Lung Cancer Risk and Inorganic Arsenic in Drinking Water.

    PubMed

    Lamm, Steven H; Ferdosi, Hamid; Dissen, Elisabeth K; Li, Ji; Ahn, Jaeil

    2015-12-07

    High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We have sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1-1000 µg/L. The best-fitting continuous meta-regression model was sought and found to be a no-constant linear-quadratic analysis where both the risk and the exposure had been logarithmically transformed. This yielded both a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded results that were similar for both ecological studies and non-ecological studies. Statistically significant X-intercepts consistently found no increased level of risk at approximately 100-150 µg/L arsenic.
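    The model described here, a no-constant linear-quadratic fit of log relative risk on log exposure, can be sketched with plain least squares (the published meta-regression would additionally weight studies by precision). The data points below are invented purely to illustrate the fit; with risk below 1 at low doses and above 1 at high doses, the fit yields a negative linear and positive quadratic coefficient, as in the abstract.

```python
import math

def fit_log_linear_quadratic(doses, rrs):
    """Unweighted least-squares fit of log(RR) = b1*x + b2*x^2 with x = log(dose)
    and no constant term, via the 2x2 normal equations."""
    xs = [math.log(d) for d in doses]
    ys = [math.log(r) for r in rrs]
    s11 = sum(x * x for x in xs)
    s12 = sum(x ** 3 for x in xs)
    s22 = sum(x ** 4 for x in xs)
    t1 = sum(x * y for x, y in zip(xs, ys))
    t2 = sum(x * x * y for x, y in zip(xs, ys))
    det = s11 * s22 - s12 * s12
    b1 = (t1 * s22 - t2 * s12) / det
    b2 = (s11 * t2 - s12 * t1) / det
    return b1, b2

# Invented illustrative points: RR below 1 at low dose, above 1 at high dose
doses = [10, 50, 100, 300, 600, 1000]   # µg/L
rrs = [0.85, 0.9, 1.0, 1.6, 2.6, 4.0]   # relative risk
b1, b2 = fit_log_linear_quadratic(doses, rrs)
print(round(b1, 3), round(b2, 3))  # expect b1 < 0, b2 > 0
```

    With b1 < 0 and b2 > 0, log(RR) crosses zero at x = -b1/b2, i.e. the fitted curve has a nonzero dose below which no increased risk is predicted, which is the X-intercept behavior the abstract reports.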

  13. A Systematic Review and Meta-Regression Analysis of Lung Cancer Risk and Inorganic Arsenic in Drinking Water

    PubMed Central

    Lamm, Steven H.; Ferdosi, Hamid; Dissen, Elisabeth K.; Li, Ji; Ahn, Jaeil

    2015-01-01

    High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We have sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1–1000 µg/L. The best-fitting continuous meta-regression model was sought and found to be a no-constant linear-quadratic analysis where both the risk and the exposure had been logarithmically transformed. This yielded both a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded results that were similar for both ecological studies and non-ecological studies. Statistically significant X-intercepts consistently found no increased level of risk at approximately 100–150 µg/L arsenic. PMID:26690190

  14. Statistical Patterns of Ionospheric Convection Derived From Mid-Latitude, High-Latitude, and Polar SuperDARN HF Radar Observations

    NASA Astrophysics Data System (ADS)

    Thomas, E. G.; Shepherd, S. G.

    2017-12-01

    Global patterns of ionospheric convection have been widely studied in terms of the interplanetary magnetic field (IMF) magnitude and orientation in both the Northern and Southern Hemispheres using observations from the Super Dual Auroral Radar Network (SuperDARN). The dynamic range of driving conditions under which existing SuperDARN statistical models are valid is currently limited to periods when the high-latitude convection pattern remains above about 60° geomagnetic latitude. Cousins and Shepherd [2010] found this to correspond to intervals when the solar wind electric field Esw < 4.1 mV/m and IMF Bz is negative. Conversely, under northward IMF conditions (Bz > 0) the high-latitude radars often experience difficulties in measuring convection above about 85° geomagnetic latitude. In this presentation, we introduce a new statistical model of ionospheric convection that is valid for a much wider range of IMF driving conditions than was previously possible, achieved by including velocity measurements from the newly constructed tiers of radars in the Northern Hemisphere at midlatitudes and in the polar cap. This new model (TS17) is compared to previous statistical models derived from high-latitude SuperDARN observations (RG96, PSR10, CS10), and its impact on instantaneous Map Potential solutions is examined.

  15. Evaluation of the reproducibility of a protocol for the pharmacokinetic study of breast tumors by dynamic magnetic resonance imaging.

    PubMed

    Etxano, J; García-Lallana Valbuena, A; Antón Ibáñez, I; Elizalde, A; Pina, L; García-Foncillas, J; Boni, V

    2015-01-01

    To evaluate the reproducibility of a dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) protocol for the pharmacokinetic study of breast tumors. We carried out this prospective study from October 2009 through December 2009. We studied 12 patients with stage II-III invasive breast cancer without prior treatment. Our center's research ethics committee approved the study. The 12 patients underwent DCE-MRI with a high temporal resolution protocol (21 acquisitions/minute) on two consecutive days. The data obtained in an ROI traced around the largest diameter of the tumor (ROI 1) and in another ROI traced around the area of the lesion's highest K(trans) intensity (ROI 2) were analyzed separately. We used parametric and nonparametric statistical tests to study the reproducibility and concordance of the principal pharmacokinetic variables (K(trans), Kep, Ve, and AUC90). The correlations were very high (r > .80; P < .01) for all the variables in ROI 1 and high (r = .70-.80; P < .01) for all the variables in ROI 2, with the exception of Ve in both ROI 1 (r = .44; P = .07) and ROI 2 (r = .13; P = .235). There were no statistically significant differences between the two studies in the values obtained for K(trans), Kep, and AUC90 (P > .05 for each), but there was a statistically significant difference between the two studies in the values obtained for Ve in ROI 2 (P = .008). The high temporal resolution DCE-MRI protocol used at our center is very reproducible for the principal pharmacokinetic constants of breast tumors.

  16. The Forest Biomass Resource of the United States

    Treesearch

    Noel D. Cost; James O. Howard; Bert Mead; William H. McWilliams; W. Brad Smith; Dwane D. van Hooser; Eric H. Wharton

    1990-01-01

    Over the last decade, biomass statistics have been published for most states. However, the existing aggregate data are either limited or out of date. The most recent statistics on biomass were for 1980 (U.S. Department of Agriculture 1981). The development of such data continues to lag even though user interest is high. This study was initiated to provide current...

  17. Accounting for isotopic clustering in Fourier transform mass spectrometry data analysis for clinical diagnostic studies.

    PubMed

    Kakourou, Alexia; Vach, Werner; Nicolardi, Simone; van der Burgt, Yuri; Mertens, Bart

    2016-10-01

    Mass spectrometry based clinical proteomics has emerged as a powerful tool for high-throughput protein profiling and biomarker discovery. Recent improvements in mass spectrometry technology have boosted the potential of proteomic studies in biomedical research. However, the complexity of the proteomic expression introduces new statistical challenges in summarizing and analyzing the acquired data. Statistical methods for optimally processing proteomic data are currently a growing field of research. In this paper we present simple, yet appropriate methods to preprocess, summarize and analyze high-throughput MALDI-FTICR mass spectrometry data, collected in a case-control fashion, while dealing with the statistical challenges that accompany such data. The known statistical properties of the isotopic distribution of the peptide molecules are used to preprocess the spectra and translate the proteomic expression into a condensed data set. Information on either the intensity level or the shape of the identified isotopic clusters is used to derive summary measures on which diagnostic rules for disease status allocation will be based. Results indicate that both the shape of the identified isotopic clusters and the overall intensity level carry information on the class outcome and can be used to predict the presence or absence of the disease.
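    The "known statistical properties of the isotopic distribution" that this abstract exploits can be illustrated with a first-order approximation: counting only a molecule's carbon atoms, the isotopic peak intensities follow a binomial distribution in the number of carbon-13 atoms. This sketch (with an invented carbon count; real peptide patterns also involve H, N, O, and S isotopes) shows the expected cluster shape for a small peptide.

```python
from math import comb

P_C13 = 0.0107  # natural abundance of carbon-13

def carbon_isotope_pattern(n_carbons, max_peaks=4):
    """Relative intensities of the first isotopic peaks of a molecule,
    counting only its carbon atoms (a common first-order approximation)."""
    return [comb(n_carbons, k) * P_C13**k * (1 - P_C13)**(n_carbons - k)
            for k in range(max_peaks)]

# For a peptide with ~50 carbons the monoisotopic peak still dominates,
# but the +1 peak is already about half its height.
pattern = carbon_isotope_pattern(50)
print([round(p / pattern[0], 2) for p in pattern])
```

    Deviations of an observed peak cluster from this theoretical shape are exactly the kind of information the paper's summary measures capture, alongside the overall intensity level.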

  18. Statistics of natural movements are reflected in motor errors.

    PubMed

    Howard, Ian S; Ingram, James N; Körding, Konrad P; Wolpert, Daniel M

    2009-09-01

    Humans use their arms to engage in a wide variety of motor tasks during everyday life. However, little is known about the statistics of these natural arm movements. Studies of the sensory system have shown that the statistics of sensory inputs are key to determining sensory processing. We hypothesized that the statistics of natural everyday movements may, in a similar way, influence motor performance as measured in laboratory-based tasks. We developed a portable motion-tracking system that could be worn by subjects as they went about their daily routine outside of a laboratory setting. We found that the well-documented symmetry bias is reflected in the relative incidence of movements made during everyday tasks. Specifically, symmetric and antisymmetric movements are predominant at low frequencies, whereas only symmetric movements are predominant at high frequencies. Moreover, the statistics of natural movements, that is, their relative incidence, correlated with subjects' performance on a laboratory-based phase-tracking task. These results provide a link between natural movement statistics and motor performance and confirm that the symmetry bias documented in laboratory studies is a natural feature of human movement.

  19. Assessing the Independent Contribution of Maternal Educational Expectations to Children’s Educational Attainment in Early Adulthood: A Propensity Score Matching Analysis

    PubMed Central

    Pingault, Jean Baptiste; Côté, Sylvana M.; Petitclerc, Amélie; Vitaro, Frank; Tremblay, Richard E.

    2015-01-01

    Background Parental educational expectations have been associated with children’s educational attainment in a number of long-term longitudinal studies, but whether this relationship is causal has long been debated. The aims of this prospective study were twofold: 1) test whether low maternal educational expectations contributed to failure to graduate from high school; and 2) compare the results obtained using different strategies for accounting for confounding variables (i.e. multivariate regression and propensity score matching). Methodology/Principal Findings The study sample included 1,279 participants from the Quebec Longitudinal Study of Kindergarten Children. Maternal educational expectations were assessed when the participants were aged 12 years. High school graduation – measuring educational attainment – was determined through the Quebec Ministry of Education when the participants were aged 22–23 years. Findings show that when using the most common statistical approach (i.e. multivariate regressions to adjust for a restricted set of potential confounders) the contribution of low maternal educational expectations to failure to graduate from high school was statistically significant. However, when using propensity score matching, the contribution of maternal expectations was reduced and remained statistically significant only for males. Conclusions/Significance The results of this study are consistent with the possibility that the contribution of parental expectations to educational attainment is overestimated in the available literature. This may be explained by the use of a restricted range of potential confounding variables as well as the dearth of studies using appropriate statistical techniques and study designs in order to minimize confounding. 
Each of these techniques and designs, including propensity score matching, has its strengths and limitations: A more comprehensive understanding of the causal role of parental expectations will stem from a convergence of findings from studies using different techniques and designs. PMID:25803867
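    The matching step at the heart of this abstract's propensity score analysis can be sketched compactly. The toy code below (all scores and outcomes invented; the study's actual matching procedure may differ) performs greedy 1:1 nearest-neighbor matching without replacement on precomputed propensity scores, then estimates the average treatment effect on the treated (ATT) as the mean outcome difference across matched pairs.

```python
def nearest_neighbor_match(treated, controls):
    """Greedy 1:1 nearest-neighbor matching on the propensity score,
    without replacement. Each unit is a (propensity_score, outcome) pair."""
    available = list(controls)
    pairs = []
    for score, outcome in sorted(treated):
        match = min(available, key=lambda c: abs(c[0] - score))
        available.remove(match)  # without replacement
        pairs.append(((score, outcome), match))
    return pairs

# Invented toy data: (propensity score, graduated high school: 1/0),
# where "treatment" = low maternal educational expectations
treated  = [(0.8, 0), (0.6, 0), (0.4, 1)]
controls = [(0.79, 1), (0.55, 0), (0.42, 1), (0.1, 1), (0.2, 1)]

pairs = nearest_neighbor_match(treated, controls)
att = sum(t[1] - c[1] for t, c in pairs) / len(pairs)
print(round(att, 3))  # graduation-rate difference among matched pairs
```

    By comparing each treated unit only to a control with a near-identical probability of treatment, matching discards the non-comparable controls that a plain multivariate regression would still lean on, which is why the two approaches can yield different estimates, as the study found.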

  20. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    PubMed

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and the Satorra-Bentler mean-scaled statistic were developed under the presumption of non-normality in the factors and errors. This paper applies such methods to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of the Satorra-Bentler statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third-moment adjusted statistic asymptotically performs on par with previously proposed methods and, at very small sample sizes, offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas, with either open- or closed-book examinations, were used to illustrate the real-world performance of this statistic.

  1. A Survey of Factors Influencing High School Start Times

    ERIC Educational Resources Information Center

    Wolfson, Amy R.; Carskadon, Mary A.

    2005-01-01

    The present study surveyed high school personnel regarding high school start times, factors influencing school start times, and decision making around school schedules. Surveys were analyzed from 345 secondary schools selected at random from the National Center for Educational Statistics database. Factors affecting reported start times included…

  2. Comparing High School Students' and Adults' Perceptions of Technological Literacy

    ERIC Educational Resources Information Center

    Harrison, Henry Ladson, III

    2009-01-01

    This study compared high school students' perceptions of technology and technological literacy to those of the general public. Additionally, individual student groups were compared statistically to determine significant differences between the groups. The "ITEA/Gallup Poll" instrument was used to survey high school students'…

  3. Using Geographic Information Science to Explore Associations between Air Pollution, Environmental Amenities, and Preterm Births

    PubMed Central

    Ogneva-Himmelberger, Yelena; Dahlberg, Tyler; Kelly, Kristen; Simas, Tiffany A. Moore

    2015-01-01

    The study uses geographic information science (GIS) and statistics to find out if there are statistical differences between full term and preterm births to non-Hispanic white, non-Hispanic Black, and Hispanic mothers in their exposure to air pollution and access to environmental amenities (green space and vendors of healthy food) in the second largest city in New England, Worcester, Massachusetts. Proximity to a Toxic Release Inventory site has a statistically significant effect on preterm birth regardless of race. The air-pollution hazard score from the Risk Screening Environmental Indicators Model is also a statistically significant factor when preterm births are categorized into three groups based on the degree of prematurity. Proximity to green space and to a healthy food vendor did not have an effect on preterm births. The study also used cluster analysis and found statistically significant spatial clusters of high preterm birth volume for non-Hispanic white, non-Hispanic Black, and Hispanic mothers. PMID:29546120

  5. High School Students' Affective Reaction to English Speaking Activities

    ERIC Educational Resources Information Center

    Jorquera Torres, Oliver Camilo; Mendoza Zapata, Jhon Eliot; Díaz Larenas, Claudio Heraldo

    2017-01-01

    This study aims to measure fifty-two high school students' affective reactions after doing individual and pair-based speaking activities and then completing a semantic differential scale of nine bipolar adjectives. Results do not show statistically significant differences between the two types of activities or the schools involved in this study, but…

  6. End-of-High-School Mathematics Attainment: How Did Students Get There?

    ERIC Educational Resources Information Center

    Newton, Xiaoxia A.

    2010-01-01

    Background: Many studies have looked at students' mathematics achievement in the middle and high school years and the kinds of factors that are associated with their achievement. Within this domain, however, most research utilized cross-sectional data. Cross-sectional designs have both statistical and conceptual limitations. Few studies used…

  7. Non-Gaussian statistics and nanosecond dynamics of electrostatic fluctuations affecting optical transitions in proteins.

    PubMed

    Martin, Daniel R; Matyushov, Dmitry V

    2012-08-30

    We show that electrostatic fluctuations of the protein-water interface are globally non-Gaussian. The electrostatic component of the optical transition energy (energy gap) in a hydrated green fluorescent protein is studied here by classical molecular dynamics simulations. The distribution of the energy gap displays a high excess in the breadth of electrostatic fluctuations over the prediction of the Gaussian statistics. The energy gap dynamics include a nanosecond component. When simulations are repeated with frozen protein motions, the statistics shifts to the expectations of linear response and the slow dynamics disappear. We therefore suggest that both the non-Gaussian statistics and the nanosecond dynamics originate largely from global, low-frequency motions of the protein coupled to the interfacial water. The non-Gaussian statistics can be experimentally verified from the temperature dependence of the first two spectral moments measured at constant-volume conditions. Simulations at different temperatures are consistent with other indicators of the non-Gaussian statistics. In particular, the high-temperature part of the energy gap variance (second spectral moment) scales linearly with temperature and extrapolates to zero at a temperature characteristic of the protein glass transition. This result, violating the classical limit of the fluctuation-dissipation theorem, leads to a non-Boltzmann statistics of the energy gap and corresponding non-Arrhenius kinetics of radiationless electronic transitions, empirically described by the Vogel-Fulcher-Tammann law.

  8. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work, the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation state characterization of double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
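
    Treating a spectrum as a probability distribution, as described above, reduces to computing its normalized moments. The synthetic two-peak spectrum and energy range below are assumptions for illustration, not data from the paper:

```python
import numpy as np

# A hypothetical emission line with a weaker low-energy satellite.
energy = np.linspace(6470.0, 6500.0, 600)        # eV (illustrative region)
counts = (np.exp(-0.5 * ((energy - 6490.0) / 2.0) ** 2)
          + 0.3 * np.exp(-0.5 * ((energy - 6478.0) / 3.0) ** 2))

p = counts / counts.sum()                 # normalize to a probability distribution
mean = (p * energy).sum()                 # first moment: line position
var = (p * (energy - mean) ** 2).sum()    # second moment: line width
skew = (p * (energy - mean) ** 3).sum() / var ** 1.5   # asymmetry from the satellite
kurt = (p * (energy - mean) ** 4).sum() / var ** 2 - 3

print(f"position {mean:.1f} eV, width {var ** 0.5:.2f} eV, "
      f"skewness {skew:.2f}, excess kurtosis {kurt:.2f}")
```

    The low-energy satellite shows up directly as negative skewness, which is how such moment-based parameters can track a chemical shift without fitting individual peaks.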

  9. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyzing and interpreting these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
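
    A minimal sketch of one exploratory technique of the kind discussed, PCA on a taxon abundance table, might look as follows. The toy data, the Hellinger pre-transformation, and all names are illustrative assumptions, not taken from the review:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy abundance table: 12 samples x 8 taxa, two underlying community types.
counts = rng.poisson(lam=np.r_[np.tile([20, 15, 5, 1, 1, 1, 1, 1], (6, 1)),
                               np.tile([1, 1, 1, 1, 5, 15, 20, 10], (6, 1))])

# Hellinger transformation (square root of relative abundance), a common
# preprocessing step before PCA on compositional ecological data.
rel = counts / counts.sum(axis=1, keepdims=True)
hell = np.sqrt(rel)

# PCA via SVD of the column-centred matrix.
Xc = hell - hell.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                       # sample coordinates on the principal axes
explained = s ** 2 / (s ** 2).sum()  # fraction of variance per component

print("variance explained:", np.round(explained[:3], 3))
```

    With two clearly distinct community types, the first principal axis separates the sample groups and captures most of the variance, which is the kind of structure an ordination plot would then display.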

  10. Oral health status of women with high-risk pregnancies.

    PubMed

    Merglova, Vlasta; Hecova, Hana; Stehlikova, Jaroslava; Chaloupka, Pavel

    2012-12-01

    The aim of this study was to investigate the oral health status of women with high-risk pregnancies. A case-control study of 142 pregnant women was conducted. The case group included 81 pregnant women with high-risk pregnancies, while 61 women with normal pregnancies served as controls. The following variables were recorded for each woman: age, general health status, DMF, CPITN, and PBI index, amounts of Streptococcus mutans in the saliva and dental treatment needs. The Mann-Whitney test, Kruskal-Wallis test, t-test and chi-squared test were used for statistical analyses. Statistically significant differences were detected between the PBI indices and dental treatment needs of the two groups. Out of the entire study cohort, 77% of the women in the case group and 52% of the women in the control group required dental treatment. In this study, women with complications during pregnancy had severe gingivitis and needed more frequent dental treatment than those in the control group.

  11. Therapeutic whole-body hypothermia reduces mortality in severe traumatic brain injury if the cooling index is sufficiently high: meta-analyses of the effect of single cooling parameters and their integrated measure.

    PubMed

    Olah, Emoke; Poto, Laszlo; Hegyi, Peter; Szabo, Imre; Hartmann, Petra; Solymar, Margit; Petervari, Erika; Balasko, Marta; Habon, Tamas; Rumbus, Zoltan; Tenk, Judit; Rostas, Ildiko; Weinberg, Jordan; Romanovsky, Andrej A; Garami, Andras

    2018-04-21

    Therapeutic hypothermia was investigated repeatedly as a tool to improve the outcome of severe traumatic brain injury (TBI), but previous clinical trials and meta-analyses found contradictory results. We aimed to determine the effectiveness of therapeutic whole-body hypothermia on the mortality of adult patients with severe TBI by using a novel approach of meta-analysis. We searched the PubMed, EMBASE, and Cochrane Library databases from inception to February 2017. The identified human studies were evaluated regarding statistical, clinical, and methodological designs to ensure inter-study homogeneity. We extracted data on TBI severity, body temperature, mortality, and cooling parameters; then we calculated the cooling index, an integrated measure of therapeutic hypothermia. A forest plot of all identified studies showed no difference in the outcome of TBI between cooled and not cooled patients, but inter-study heterogeneity was high. In contrast, by meta-analysis of RCTs that were homogeneous with regard to statistical and clinical design and that precisely reported the cooling protocol, we showed a decreased odds ratio for mortality with therapeutic hypothermia compared to no cooling. As independent factors, milder and longer cooling, and rewarming at < 0.25°C/h, were associated with better outcome. Therapeutic hypothermia was beneficial only if the cooling index (a measure of the combination of cooling parameters) was sufficiently high. We conclude that high methodological and statistical inter-study heterogeneity could underlie the contradictory results obtained in previous studies. By analyzing methodologically homogeneous studies, we show that cooling improves the outcome of severe TBI and that this beneficial effect depends on certain cooling parameters and on their integrated measure, the cooling index.
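
    Inverse-variance pooling of study-level odds ratios, the core computation in such a meta-analysis, can be sketched as follows. The three study results are invented for the example and are not the values reported in the paper:

```python
import math

# Illustrative study-level results: odds ratio for mortality and its 95% CI.
studies = [(0.60, 0.35, 1.02), (0.85, 0.55, 1.31), (0.70, 0.40, 1.22)]

# Fixed-effect pooling on the log odds ratio scale; the standard error is
# recovered from the CI as se = (ln(hi) - ln(lo)) / (2 * 1.96).
num = den = 0.0
for or_, lo, hi in studies:
    y = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    w = 1 / se ** 2                    # inverse-variance weight
    num += w * y
    den += w

pooled = math.exp(num / den)
se_pooled = math.sqrt(1 / den)
ci = (math.exp(num / den - 1.96 * se_pooled),
      math.exp(num / den + 1.96 * se_pooled))
print(f"pooled OR = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

    The pooled log odds ratio is a precision-weighted average of the study estimates, so pooling can reach significance even when each study's CI crosses 1, which is why restricting to a homogeneous set of trials matters so much for the conclusion.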

  12. Incidence of cardiovascular complications in knee arthroplasty patients before and after implementation of a ropivacaine local infiltration analgesia protocol: A retrospective study.

    PubMed

    Lameijer, Joost R C; Verboom, Frederik; Grefkens, Joost; Jansen, Joris

    2016-10-01

    Local infiltration analgesia (LIA) during total knee arthroplasty has been shown to give a statistically significant reduction in post-operative pain. The effects of using high volumes of ropivacaine combined with adrenaline as LIA on cardiovascular parameters in knee replacement have not been described before. The objective of this study was to investigate the cardiovascular safety of ropivacaine as part of high-volume local infiltration analgesia (LIA) in total knee replacement surgery. This is a retrospective observational comparative cohort study conducted in two independent cohorts, one treated without and one treated with a local infiltration analgesia protocol, containing a total of 744 patients with mean ages of 68 years (42 to 89) and 68 years (21 to 88), respectively, and a follow-up of 12 months. No statistical difference in bradycardia during surgery, post-operative cardiovascular complications, or mortality was found after use of LIA. A statistically significant lower incidence of hypotension was found in the LIA group (P<0.01). This result has to be interpreted with care, due to the use of adrenaline in the LIA mixture, which could mask possible hypotension. No statistical difference was found in the occurrence of hypertension or tachycardia, despite the addition of adrenaline to the LIA mixture. No difference in mortality was found between the two groups (P=0.11). These results show safe use of high-volume ropivacaine with adrenaline as local infiltration analgesia during total knee replacement surgery. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. The use and misuse of statistical methodologies in pharmacology research.

    PubMed

    Marino, Michael J

    2014-01-01

    Descriptive, exploratory, and inferential statistics are necessary components of hypothesis-driven biomedical research. Despite the ubiquitous need for these tools, the emphasis on statistical methods in pharmacology has become dominated by inferential methods often chosen more by the availability of user-friendly software than by any understanding of the data set or the critical assumptions of the statistical tests. Such frank misuse of statistical methodology and the quest to reach the mystical α<0.05 criterion has hampered research via the publication of incorrect analyses driven by rudimentary statistical training. Perhaps more critically, a poor understanding of statistical tools limits the conclusions that may be drawn from a study by divorcing the investigator from their own data. The net result is a decrease in quality and confidence in research findings, fueling recent controversies over the reproducibility of high-profile findings and effects that appear to diminish over time. The recent development of "omics" approaches leading to the production of massive higher-dimensional data sets has amplified these issues, making it clear that new approaches are needed to appropriately and effectively mine this type of data. Unfortunately, statistical education in the field has not kept pace. This commentary provides a foundation for an intuitive understanding of statistics that fosters an exploratory approach and an appreciation for the assumptions of various statistical tests that hopefully will increase the correct use of statistics, the application of exploratory data analysis, and the use of statistical study design, with the goal of increasing reproducibility and confidence in the literature. Copyright © 2013. Published by Elsevier Inc.

  14. Helicity statistics in homogeneous and isotropic turbulence and turbulence models

    NASA Astrophysics Data System (ADS)

    Sahoo, Ganapati; De Pietro, Massimo; Biferale, Luca

    2017-02-01

    We study the statistical properties of helicity in direct numerical simulations of fully developed homogeneous and isotropic turbulence and in a class of turbulence shell models. We consider correlation functions based on combinations of vorticity and velocity increments that are not invariant under mirror symmetry. We also study the scaling properties of high-order structure functions based on the moments of the velocity increments projected on a subset of modes with either positive or negative helicity (chirality). We show that mirror symmetry is recovered at small scales, i.e., chiral terms are subleading and they are well captured by a dimensional argument plus anomalous corrections. These findings are also supported by a high Reynolds numbers study of helical shell models with the same chiral symmetry of Navier-Stokes equations.

  15. Effect of Embolization Material in the Calculation of Dose Deposition in Arteriovenous Malformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De la Cruz, O. O. Galvan; Moreno-Jimenez, S.; Larraga-Gutierrez, J. M.

    2010-12-07

    In this work, the impact of incorporating high-Z materials (embolization material) into the dose calculation for stereotactic radiosurgery treatment of arteriovenous malformations is studied. A statistical analysis is done to establish the variables that may impact the dose calculation. Pencil beam (PB) and Monte Carlo (MC) calculation algorithms were used for the comparison. The comparison between both dose calculations shows that PB overestimates the dose deposited. The statistical analysis, for the number of patients in the study (20), shows that the variable that may impact the dose calculation is the volume of the high-Z material in the arteriovenous malformation. Further studies have to be done to establish the clinical impact on the radiosurgery result.

  16. Physical fitness modulates incidental but not intentional statistical learning of simultaneous auditory sequences during concurrent physical exercise.

    PubMed

    Daikoku, Tatsuya; Takahashi, Yuji; Futagami, Hiroko; Tarumoto, Nagayoshi; Yasuda, Hideki

    2017-02-01

    In real-world auditory environments, humans are exposed to overlapping auditory information, such as that produced by human voices and musical instruments, even during routine physical activities such as walking and cycling. The present study investigated how concurrent physical exercise affects incidental and intentional learning of overlapping auditory streams, and whether physical fitness modulates learning performance. Participants were divided into lower- and higher-fitness groups of 11 each, based on their VO2max values. They were presented with simultaneous auditory sequences, each with a distinct statistical regularity (i.e. statistical learning), both while pedaling on a bike and while seated on a bike at rest. In experiment 1, they were instructed to attend to one of the two sequences and ignore the other. In experiment 2, they were instructed to attend to both sequences. After exposure to the sequences, learning effects were evaluated by a familiarity test. In experiment 1, performance of statistical learning of the ignored sequences during concurrent pedaling was higher in participants with high physical fitness than in those with low fitness, whereas for the attended sequence there was no significant difference in learning performance between the fitness groups. Furthermore, there was no significant effect of physical fitness on learning while resting. In experiment 2, participants with both high and low physical fitness could perform intentional statistical learning of the two simultaneous sequences in both the exercise and rest sessions. Improved physical fitness might facilitate incidental, but not intentional, statistical learning of simultaneous auditory sequences during concurrent physical exercise.

  17. Statistical analysis of secondary particle distributions in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Mcguire, Stephen C.

    1987-01-01

    The use of several statistical techniques to characterize structure in the angular distributions of secondary particles from nucleus-nucleus collisions in the energy range 24 to 61 GeV/nucleon is described. The objective of this work was to determine whether there are correlations between emitted particle intensity and angle that may be used to support the existence of the quark gluon plasma. The techniques include chi-square null hypothesis tests, the method of discrete Fourier transform analysis, and fluctuation analysis. We have also used the method of composite unit vectors to test for azimuthal asymmetry in a data set of 63 JACEE-3 events. Each method is presented in a manner that provides the reader with some practical detail regarding its application. Of those events with relatively high statistics, an Fe event at 55 GeV/nucleon was found to possess an azimuthal distribution with a highly non-random structure. No evidence of non-statistical fluctuations was found in the pseudo-rapidity distributions of the events studied. It is seen that the most effective application of these methods relies upon the availability of many events or single events that possess very high multiplicities.
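
    Two of the techniques mentioned, the chi-square null-hypothesis test and Fourier (harmonic) analysis of azimuthal distributions, can be sketched on simulated angles. The uniform sample, event size, and bin count are assumptions of this illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated azimuthal emission angles (radians); a uniform sample stands in
# for an azimuthally symmetric event.
phi = rng.uniform(0, 2 * np.pi, size=5000)

# Chi-square test against the null hypothesis of a uniform angular distribution.
nbins = 20
observed, _ = np.histogram(phi, bins=nbins, range=(0, 2 * np.pi))
expected = len(phi) / nbins
chi2 = ((observed - expected) ** 2 / expected).sum()   # ~ chi2 with nbins-1 df

# Fourier (harmonic) analysis: the coefficients <cos(n*phi)> and <sin(n*phi)>
# vanish for an azimuthally symmetric source, so large values flag structure.
harmonics = {n: (np.cos(n * phi).mean(), np.sin(n * phi).mean())
             for n in (1, 2, 3)}

print(f"chi2 = {chi2:.1f} (df = {nbins - 1})")
print("harmonic coefficients:", harmonics)
```

    For a symmetric event the chi-square statistic stays near its expectation (the number of degrees of freedom) and the harmonic coefficients stay near zero; a non-random azimuthal structure would inflate both.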

  18. High School Longitudinal Study of 2009 (HSLS:09): Base-Year Data File Documentation. NCES 2011-328

    ERIC Educational Resources Information Center

    Ingels, Steven J.; Pratt, Daniel J.; Herget, Deborah R.; Burns, Laura J.; Dever, Jill A.; Ottem, Randolph; Rogers, James E.; Jin, Ying; Leinwand, Steve

    2011-01-01

    The High School Longitudinal Study of 2009 (HSLS:09) is the fifth in a series of National Center for Education Statistics (NCES) secondary longitudinal studies. The core research questions for HSLS:09 explore secondary to postsecondary transition plans and the evolution of those plans; the paths into and out of science, technology, engineering,…

  19. Synthesis of instrumentally and historically recorded earthquakes and studying their spatial statistical relationship (A case study: Dasht-e-Biaz, Eastern Iran)

    NASA Astrophysics Data System (ADS)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-06-01

    Earthquake catalogues are the main source of statistical seismology for long-term studies of earthquake occurrence. Therefore, studying spatiotemporal problems is important to reduce the related uncertainties in statistical seismology studies. A statistical tool, the time normalization method, has been used to revise the time-frequency relationship in one of the most active regions of Asia, Eastern Iran and western Afghanistan (a and b were calculated around 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method has been further utilized to reduce the uncertainties in the spatial domain. A geostatistical simulation produces a representative, synthetic catalogue with 5361 events to reduce spatial uncertainties. The synthetic database is classified using a Geographical Information System, GIS, based on simulated magnitudes to reveal the underlying seismicity patterns. Although some regions with high seismicity correspond to known faults, significantly, as far as seismic patterns are concerned, the new method highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters as likely sites of future strain release.

  20. Using public control genotype data to increase power and decrease cost of case-control genetic association studies.

    PubMed

    Ho, Lindsey A; Lange, Ethan M

    2010-12-01

    Genome-wide association (GWA) studies are a powerful approach for identifying novel genetic risk factors associated with human disease. A GWA study typically requires the inclusion of thousands of samples to have sufficient statistical power to detect single nucleotide polymorphisms that are associated with only modest increases in risk of disease, given the heavy burden of the multiple-test correction that is necessary to maintain valid statistical tests. Low statistical power and the high financial cost of performing a GWA study remain prohibitive for many scientific investigators eager to perform such a study using their own samples. A number of remedies have been suggested to increase statistical power and decrease cost, including the utilization of free publicly available genotype data and multi-stage genotyping designs. Herein, we compare the statistical power and relative costs of alternative association study designs that use cases and screened controls to study designs that are based only on, or additionally include, free public control genotype data. We describe a novel replication-based two-stage study design, which uses free public control genotype data in the first stage and follow-up genotype data on case-matched controls in the second stage, that preserves many of the advantages inherent when using only an epidemiologically matched set of controls. Specifically, we show that our proposed two-stage design can substantially increase statistical power and decrease the cost of performing a GWA study while controlling the type-I error rate, which can be inflated when using public controls due to differences in ancestry and batch genotype effects.
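
    The power gain from enlarging the control set can be sketched with a normal-approximation power calculation at a Bonferroni-corrected threshold. The per-sqrt(N) effect scale and sample sizes below are arbitrary assumptions, not values from the paper:

```python
from statistics import NormalDist

norm = NormalDist()

def power_z(mu: float, alpha: float, n_tests: int) -> float:
    """Approximate power of a two-sided z-test at a Bonferroni-corrected
    threshold; mu is the expected z-score of a true association, and the
    negligible lower-tail rejection probability is ignored."""
    z_crit = norm.inv_cdf(1 - alpha / (2 * n_tests))
    return 1 - norm.cdf(z_crit - mu)

# Adding free public controls raises the effective sample size, and the
# expected z-score grows roughly like sqrt(N) (a simplifying assumption).
for n_eff, label in [(2000, "cases + screened controls only"),
                     (5000, "with public controls added")]:
    mu = 0.12 * n_eff ** 0.5          # 0.12: arbitrary per-sqrt(N) effect scale
    print(f"{label:30s} power = {power_z(mu, 0.05, 1_000_000):.3f}")
```

    Because the genome-wide threshold (here alpha/2M) demands a very large z-score, power is extremely sensitive to effective sample size, which is why folding in public controls can move a study from underpowered to well powered at little cost.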

  1. Tree-Ring Widths and Snow Cover Depth in High Tauern

    NASA Astrophysics Data System (ADS)

    Falarz, Malgorzata

    2017-12-01

    The aim of the study is to examine the correlation between Norway spruce tree-ring widths and snow cover depth in the High Tauern mountains. The average standardized tree-ring width indices for Norway spruce published by Bednarz and Niedzwiedz (2006) were taken into account. Increment cores were collected from 39 Norway spruces growing in the High Tauern near the upper limit of the forest at an altitude of 1700-1800 m, 3 km from the meteorological station at Sonnblick. Moreover, the maximum snow cover depth at Sonnblick (3105 m a.s.l.) for each winter season in the period from 1938/39 to 1994/95 (57 winter seasons) was taken into account. The main results of the research are as follows: (1) tree-ring widths in a given year do not reveal a statistically significant dependency on the maximum snow cover depth observed in the winter season that ended that year; (2) however, the tested relationship is statistically significant when the tree-ring widths in a given year are correlated with the maximum snow cover depth of the previous year's season. The correlation coefficient for the entire period of the study is not very high (r=0.27) but is statistically significant at the 0.05 level; (3) the described relationship is not stable over time: 30-year moving correlations showed no significant dependencies until 1942 and after 1982 (probably due to the so-called divergence phenomenon). However, during the period 1943-1981 the correlation coefficients for moving 30-year periods are statistically significant and range from 0.37 to 0.45; (4) the correlation coefficient between real and calibrated (based on the regression equation) values of maximum snow cover depth is statistically significant for the calibration period but not for the verification period; (5) given the rather short period of statistically significant correlations and the weakness of the dependencies, a reconstruction of snow cover at Sonnblick for the period before regular measurements does not seem feasible.
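
    The 30-year moving correlation used in result (3) can be sketched as follows; the synthetic series merely stand in for the ring-width and snow-depth records:

```python
import numpy as np

rng = np.random.default_rng(4)

def moving_corr(a: np.ndarray, b: np.ndarray, window: int = 30) -> np.ndarray:
    """Pearson correlation of two series over a sliding window."""
    return np.array([np.corrcoef(a[i:i + window], b[i:i + window])[0, 1]
                     for i in range(len(a) - window + 1)])

# Illustrative series standing in for ring-width indices and the previous
# season's maximum snow depth (57 "years", as in the study period).
years = 57
snow = rng.normal(size=years)
rings = 0.4 * snow + rng.normal(scale=0.9, size=years)   # weak built-in link

r = moving_corr(rings, snow)          # one correlation per 30-year window
print(np.round(r, 2))
```

    With a weak underlying link, individual 30-year windows scatter around the full-period correlation, which illustrates why a relationship can look significant in some windows and vanish in others even without any real change in the climate-growth coupling.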

  2. Depression, anxiety and sexual satisfaction in breast cancer patients and their partners-Izmir oncology group study.

    PubMed

    Alacacioglu, Ahmet; Ulger, Eda; Varol, Umut; Yildiz, Ibrahim; Salman, Tarik; Bayoglu, Vedat; Dirican, Ahmet; Demir, Lutfiye; Akyol, Murat; Yildiz, Yasar; Kucukzeybek, Yuksel; Ataman, Gorkem; Can, Huseyin; Alacacioglu, Inci; Tarhan, Mustafa Oktay

    2014-01-01

    We aimed to investigate anxiety, depression and sexual satisfaction levels, and the effects of depression and anxiety upon the sexual satisfaction of Turkish breast cancer patients and their partners. Data were collected from one hundred breast cancer patients and their partners, using three forms: one covering information about socio-demographic characteristics of the patients, the Hospital Anxiety and Depression Scale (HADS) and the Golombok-Rust Inventory of Sexual Satisfaction (GRISS). The frequency, avoidance and touch subscores were statistically significantly higher in the patients. Among those with high anxiety scores, the frequency, communication, satisfaction, touch, and anorgasmia subscale scores of the GRISS were found to be significantly high. Among the partners whose anxiety scores were high, only the premature ejaculation subscale was statistically significant. It was determined that for partners with higher depression scores, the communication, satisfaction, avoidance, premature ejaculation and erectile dysfunction subscores of the GRISS were statistically higher compared to partners with lower depression scores. Patients' quality of life may be increased by taking precautions to reduce their and their partners' psychosocial and psychosexual concerns.

  3. Comparison of high-sensitivity C-reactive protein and fetuin-A levels before and after treatment for subjects with subclinical hyperthyroidism.

    PubMed

    Bilgir, Oktay; Bilgir, Ferda; Topcuoglu, Tuba; Calan, Mehmet; Calan, Ozlem

    2014-03-01

    This study was designed to show the effect of propylthiouracil treatment on sCD40L, high-sensitivity C-reactive protein, and fetuin-A levels on subjects with subclinical hyperthyroidism. After checking sCD40L, high-sensitivity C-reactive protein, and fetuin-A levels of 35 patients with subclinical hyperthyroidism, each was given 50 mg tablets of propylthiouracil three times daily. After 3 months, sCD40L, high-sensitivity C-reactive protein, and fetuin-A levels were then compared to the levels before treatment. Although high-sensitivity C-reactive protein and sCD40L levels were normal in the subclinical hyperthyroidism patients compared to the healthy controls, fetuin-A levels were statistically significantly higher (*p = 0.022). After treatment, fetuin-A levels of subclinical hyperthyroidism patients decreased statistically significantly compared to the levels before treatment (**p = 0.026). sCD40L and high-sensitivity C-reactive protein levels did not have a statistically significant difference compared to the control group and post-propylthiouracil treatment. In subclinical hyperthyroidism patients, high fetuin-A levels before propylthiouracil treatment and decreases in these levels after treatment in cases with subclinical hyperthyroidism indicated the possibility of preventing long-term cardiac complications with propylthiouracil treatment.

  4. The effects of household income distribution on stroke prevalence and its risk factors of high blood pressure and smoking: a cross-sectional study in Saskatchewan, Canada.

    PubMed

    Bird, Yelena; Lemstra, Mark; Rogers, Marla

    2017-03-01

    Stroke is a major chronic disease and a common cause of adult disability and mortality. Although there are many known risk factors for stroke, lower income is not one that is often discussed. To determine the unadjusted and adjusted association of income distribution on the prevalence of stroke in Saskatchewan, Canada. Information was collected from the Canadian Community Health Survey conducted by Statistics Canada for 2000-2008. In total, 178 variables were analysed for their association with stroke. Prior to statistical adjustment, stroke was seven times more common for lower income residents than higher income residents. After statistical adjustment, only four covariates were independently associated with stroke prevalence, including having high blood pressure (odds ratio (OR) = 2.62; 95% confidence interval (CI) = 2.12-3.24), having a household income below CAD$30,000 per year (OR = 2.49; 95% CI = 1.88-3.29), being a daily smoker (OR = 1.36; 95% CI = 1.16-1.58) and being physically inactive (OR = 1.27; 95% CI = 1.13-1.43). After statistical adjustment, there were five covariates independently associated with high blood pressure prevalence, including having a household income below CAD$30,000 per year (OR = 1.52; 95% CI = 1.41-1.63). After statistical adjustment, there were five covariates independently associated with daily smoking prevalence, including having a household income below CAD$30,000 per year (OR = 1.29; 95% CI = 1.25-1.33). Knowledge of disparities in the prevalence, severity, disability and mortality of stroke is critically important to medical and public health professionals. Our study found that income distribution was strongly associated with stroke, its main disease intermediary - high blood pressure - and its main risk factor - smoking. As such, income is an important variable worthy of public debate as a modifiable risk factor for stroke.
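
    Converting an adjusted logistic-regression coefficient into an odds ratio with a 95% CI, as reported above, is a one-line transformation. The coefficient and standard error below are illustrative, chosen only to land near the reported OR ≈ 2.5 scale for low household income:

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical adjusted coefficient for income below CAD$30,000 per year.
or_, lo, hi = odds_ratio_ci(0.91, 0.14)
print(f"OR = {or_:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

    Because the CI is symmetric on the log-odds scale, it comes out asymmetric around the OR itself, which is why reported intervals such as 2.49 (1.88-3.29) stretch further above the point estimate than below it.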

  5. Attitude of teaching faculty towards statistics at a medical university in Karachi, Pakistan.

    PubMed

    Khan, Nazeer; Mumtaz, Yasmin

    2009-01-01

    Statistics is used in biological research mainly to verify clinicians' and researchers' findings and impressions, and to give scientific validity to their inferences. In Pakistan, the educational curriculum is structured so that students who intend to enter the biological sciences do not study mathematics after grade 10. Because of this fragile background in mathematical skills, Pakistani medical professionals often feel they lack an adequate base for understanding basic statistical concepts when they try to apply them in their research or read a scientific article. The aim of the study was to assess the attitude of medical faculty towards statistics. A questionnaire containing 42 close-ended and 4 open-ended questions, related to attitude towards and knowledge of statistics, was distributed among the teaching faculty of Dow University of Health Sciences (DUHS). One hundred and sixty-seven completed questionnaires were returned from 374 faculty members (response rate 44.7%). Forty-three percent of respondents reported having taken introductory-level statistics courses; 63% strongly agreed that a good researcher must have some training in statistics; and 82% of the faculty agreed or strongly agreed that statistics is really useful for research. Only 17% correctly stated that statistics is the science of uncertainty. Half of the respondents acknowledged difficulty in writing the statistical section of an article. Sixty-four percent indicated that the way statistics is taught was the main reason it is perceived as difficult, and 53% felt that co-authorship for a statistician should depend upon his or her contribution to the study. Gender did not show any significant difference among the responses. However, compared with junior faculty, senior faculty attached greater importance to the use of statistics and reported greater difficulty in writing the results sections of articles. The study showed a low level of knowledge, but a high level of awareness of the use of statistical techniques in research, and a good level of motivation for further training.

  6. When the Single Matters more than the Group (II): Addressing the Problem of High False Positive Rates in Single Case Voxel Based Morphometry Using Non-parametric Statistics.

    PubMed

    Scarpazza, Cristina; Nichols, Thomas E; Seramondi, Donato; Maumet, Camille; Sartori, Giuseppe; Mechelli, Andrea

    2016-01-01

    In recent years, an increasing number of studies have used Voxel Based Morphometry (VBM) to compare a single patient with a psychiatric or neurological condition of interest against a group of healthy controls. However, the validity of this approach critically relies on the assumption that the single patient is drawn from a hypothetical population with a normal distribution and variance equal to that of the control group. In a previous investigation, we demonstrated that family-wise false positive error rate (i.e., the proportion of statistical comparisons yielding at least one false positive) in single case VBM are much higher than expected (Scarpazza et al., 2013). Here, we examine whether the use of non-parametric statistics, which does not rely on the assumptions of normal distribution and equal variance, would enable the investigation of single subjects with good control of false positive risk. We empirically estimated false positive rates (FPRs) in single case non-parametric VBM, by performing 400 statistical comparisons between a single disease-free individual and a group of 100 disease-free controls. The impact of smoothing (4, 8, and 12 mm) and type of pre-processing (Modulated, Unmodulated) was also examined, as these factors have been found to influence FPRs in previous investigations using parametric statistics. The 400 statistical comparisons were repeated using two independent, freely available data sets in order to maximize the generalizability of the results. We found that the family-wise error rate was 5% for increases and 3.6% for decreases in one data set; and 5.6% for increases and 6.3% for decreases in the other data set (5% nominal). Further, these results were not dependent on the level of smoothing and modulation. Therefore, the present study provides empirical evidence that single case VBM studies with non-parametric statistics are not susceptible to high false positive rates. 
The critical implication of this finding is that VBM can be used to characterize neuroanatomical alterations in individual subjects as long as non-parametric statistics are employed.
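
The empirical FPR estimation described above (repeated single-case-versus-controls comparisons on disease-free data) can be sketched on synthetic scalar data. The permutation test below is an illustrative stand-in for the voxel-wise non-parametric inference, not the authors' actual pipeline:

```python
import random
import statistics

random.seed(0)

def perm_test_single_case(case, controls, n_perm=200):
    """Label-permutation test of whether a single case differs from a
    control group: under the null, any pooled value is equally likely
    to have been the case."""
    pooled = controls + [case]
    observed = abs(case - statistics.fmean(controls))
    hits = 0
    for _ in range(n_perm):
        random.shuffle(pooled)
        c, rest = pooled[0], pooled[1:]
        if abs(c - statistics.fmean(rest)) >= observed:
            hits += 1
    return hits / n_perm  # permutation p-value

# Empirical false positive rate on null data: the "case" comes from the
# same distribution as the 100 controls, so about 5% of the 400
# comparisons should reach p < 0.05 by chance alone.
alpha, n_sims, false_positives = 0.05, 400, 0
for _ in range(n_sims):
    controls = [random.gauss(0, 1) for _ in range(100)]
    case = random.gauss(0, 1)
    if perm_test_single_case(case, controls) < alpha:
        false_positives += 1
fpr = false_positives / n_sims
print(f"empirical FPR: {fpr:.3f}")
```

Because the permutation distribution is built from the data itself, the test's error rate stays near the nominal level without assuming normality or equal variance, which is the property the study verifies at the whole-brain level.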

  7. ‘Dignity therapy’, a promising intervention in palliative care: A comprehensive systematic literature review

    PubMed Central

    Martínez, Marina; Arantzamendi, María; Belar, Alazne; Carrasco, José Miguel; Carvajal, Ana; Rullán, María; Centeno, Carlos

    2016-01-01

    Background: Dignity therapy is a psychotherapy intended to relieve psychological and existential distress in patients at the end of life. Little is known about its effect. Aim: To analyse the outcomes of dignity therapy in patients with advanced life-threatening diseases. Design: A systematic review was conducted. Three authors extracted data from the articles and evaluated quality using the Critical Appraisal Skills Programme. Data were synthesized, considering study objectives. Data sources: PubMed, CINAHL, Cochrane Library and PsycINFO, searched from 2002 (the year dignity therapy was developed) to January 2016, with 'dignity therapy' as the search term. Studies with patients with advanced life-threatening diseases were included. Results: Of 121 studies, 28 were included. The quality of the studies was high. Results were grouped into effectiveness; satisfaction, suitability and feasibility; and adaptability to different diseases and cultures. Two of five randomized controlled trials applied dignity therapy to patients with high levels of baseline psychological distress. One showed a statistically significant decrease in patients' anxiety and depression scores over time. The other showed a statistically significant decrease in anxiety scores pre-post dignity therapy, but not in depression. Nonrandomized studies suggested statistically significant improvements in existential and psychosocial measurements. Patients, relatives and professionals perceived that it improved the end-of-life experience. Conclusion: Evidence suggests that dignity therapy is beneficial. One randomized controlled trial with patients with high levels of psychological distress showed efficacy of dignity therapy on anxiety and depression scores. Studies with other designs report beneficial outcomes in terms of end-of-life experience. Further research should clarify how dignity therapy functions, establish a means of measuring its impact, and assess whether patients with high levels of distress benefit most from this therapy. PMID:27566756

  8. The prior statistics of object colors.

    PubMed

    Koenderink, Jan J

    2010-02-01

    The prior statistics of object colors is of much interest because extensive statistical investigations of reflectance spectra reveal highly non-uniform structure in color space common to several very different databases. This common structure is due to the visual system rather than to the statistics of environmental structure. Analysis involves an investigation of the proper sample space of spectral reflectance factors and of the statistical consequences of the projection of spectral reflectances on the color solid. Even in the case of reflectance statistics that are translationally invariant with respect to the wavelength dimension, the statistics of object colors is highly non-uniform. The qualitative nature of this non-uniformity is due to trichromacy.

  9. Genital Chlamydia Prevalence in Europe and Non-European High Income Countries: Systematic Review and Meta-Analysis

    PubMed Central

    Redmond, Shelagh M.; Alexander-Kisslig, Karin; Woodhall, Sarah C.; van den Broek, Ingrid V. F.; van Bergen, Jan; Ward, Helen; Uusküla, Anneli; Herrmann, Björn; Andersen, Berit; Götz, Hannelore M.; Sfetcu, Otilia; Low, Nicola

    2015-01-01

    Background Accurate information about the prevalence of Chlamydia trachomatis is needed to assess national prevention and control measures. Methods We systematically reviewed population-based cross-sectional studies that estimated chlamydia prevalence in European Union/European Economic Area (EU/EEA) Member States and non-European high income countries from January 1990 to August 2012. We examined results in forest plots, explored heterogeneity using the I² statistic, and conducted random effects meta-analysis if appropriate. Meta-regression was used to examine the relationship between study characteristics and chlamydia prevalence estimates. Results We included 25 population-based studies from 11 EU/EEA countries and 14 studies from five other high income countries. Four EU/EEA Member States reported on nationally representative surveys of sexually experienced adults aged 18–26 years (response rates 52–71%). In women, chlamydia point prevalence estimates ranged from 3.0% to 5.3%; the pooled average of these estimates was 3.6% (95% CI 2.4, 4.8, I² 0%). In men, estimates ranged from 2.4% to 7.3% (pooled average 3.5%; 95% CI 1.9, 5.2, I² 27%). Estimates in EU/EEA Member States were statistically consistent with those in other high income countries (I² 0% for women, 6% for men). There was statistical evidence of an association between survey response rate and estimated chlamydia prevalence; estimates were higher in surveys with lower response rates (p = 0.003 in women, 0.018 in men). Conclusions Population-based surveys that estimate chlamydia prevalence are at risk of participation bias owing to low response rates. Estimates obtained in nationally representative samples of the general population of EU/EEA Member States are similar to estimates from other high income countries. PMID:25615574
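
The pooling and I² heterogeneity computations referred to above can be sketched with a DerSimonian-Laird random-effects model. The prevalence estimates and variances below are hypothetical, not the review's data:

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate with DerSimonian-Laird tau^2
    and the I^2 heterogeneity statistic (illustrative sketch)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q measures excess between-study variation
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    k = len(estimates)
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    i2 = (max(0.0, (q - (k - 1)) / q) * 100) if q > 0 else 0.0
    # Re-weight by total (within + between) variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, i2

# Hypothetical prevalence estimates (%) and variances for four surveys
est = [3.0, 3.4, 4.1, 5.3]
var = [0.25, 0.30, 0.40, 0.60]
pooled, lo, hi, i2 = dersimonian_laird(est, var)
print(f"pooled = {pooled:.2f}% (95% CI {lo:.2f}-{hi:.2f}), I2 = {i2:.0f}%")
```

An I² near 0%, as reported for the women's estimates, indicates that almost all observed variation is compatible with within-study sampling error.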

  10. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    In this study, we introduced an alternative approach for analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, for which given flow amounts are accumulated in shorter times. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
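
The IAT sampling idea can be sketched directly: instead of recording flow at fixed time steps, record the time needed to accumulate each successive fixed flow volume. A minimal implementation, assuming linear accumulation within each time step (an assumption of this sketch, not necessarily of the authors' method):

```python
def inter_amount_times(flows, dt, amount):
    """Convert a discharge series (flow rate sampled every dt time
    units) into inter-amount times: the time taken to accumulate each
    successive fixed volume `amount`. A single high-flow step may span
    several amounts, so each step may emit multiple IATs."""
    iats = []
    accumulated = 0.0
    t_since_last = 0.0
    for q in flows:
        vol = q * dt
        accumulated += vol
        t_since_last += dt
        while accumulated >= amount:
            # fraction of this step left over after topping up the amount
            overshoot = accumulated - amount
            frac = overshoot / vol if vol > 0 else 0.0
            iats.append(t_since_last - frac * dt)
            t_since_last = frac * dt
            accumulated = overshoot
    return iats

# Low flows yield long IATs; a high flow spans two amounts quickly:
print(inter_amount_times([1, 1, 1, 1], dt=1.0, amount=2.0))
print(inter_amount_times([4], dt=1.0, amount=2.0))
```

This makes the sampling-rate asymmetry concrete: each IAT carries equal volume, so flood peaks contribute many samples and dry periods few, the reverse of fixed-time sampling.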

  11. Elemental, microstructural, and mechanical characterization of high gold orthodontic brackets after intraoral aging.

    PubMed

    Hersche, Sepp; Sifakakis, Iosif; Zinelis, Spiros; Eliades, Theodore

    2017-02-01

    The purpose of the present study was to investigate the elemental composition, microstructure, and selected mechanical properties of high gold orthodontic brackets after intraoral aging. Thirty Incognito™ (3M Unitek, Bad Essen, Germany) lingual brackets were studied: 15 as received (control group) and 15 retrieved from different patients after orthodontic treatment. The surface of the wing area was examined by scanning electron microscopy (SEM) with backscattered electron imaging (BEI), and the elemental composition was determined by X-ray energy-dispersive spectroscopy (EDX). After appropriate metallographic preparation, the mechanical properties tested were Martens hardness (HM), indentation modulus (EIT), elastic index (ηIT), and Vickers hardness (HV), determined employing instrumented indentation testing (IIT) with a Vickers indenter. The results were statistically analyzed by unpaired t-test (α=0.05). No statistically significant differences in surface morphology or elemental content were found between the control and experimental groups, and the mean values of HM, EIT, ηIT, and HV did not differ significantly between the groups (p>0.05). Under the limitations of this study, it may be concluded that the surface elemental content and microstructure, as well as the evaluated mechanical properties, of the Incognito™ lingual brackets remain unaffected by intraoral aging.

  12. Association of pentraxin and high-sensitive C-reactive protein as inflammatory biomarkers in patients with chronic periodontitis and peripheral arterial disease.

    PubMed

    Boyapati, Ramanarayana; Chinthalapani, Srikanth; Ramisetti, Arpita; Salavadhi, Shyam Sunder; Ramachandran, Radhika

    2018-01-01

    Inflammation is a common feature of both peripheral artery disease (PAD) and periodontal disease. The aim of this study was to evaluate the relationship between PAD and periodontal disease by examining serum levels of the inflammatory markers pentraxin-3 (PTX-3) and high-sensitive C-reactive protein (hs-CRP). A total of 50 patients were included in this cross-sectional study. Patients were divided into two groups based on ankle-brachial index values: those with PAD (test group) and those without PAD (control group). Periodontal examinations and biochemical analyses for PTX-3 and hs-CRP were performed to compare the two groups. All data were analysed using SPSS version 18. Among the clinical parameters, statistically significant differences were present in plaque index, clinical attachment loss, and periodontal inflammatory surface area, with higher mean values in patients with PAD who had periodontitis. All biochemical parameters considered in the study differed significantly (P < 0.05) between PAD and non-PAD patients, with higher mean values of total cholesterol (TC), low-density lipoprotein (LDL), hs-CRP, and PTX-3. PTX-3 and an acute-phase cytokine such as hs-CRP can be regarded as among the best indicators of the association between PAD and periodontitis, followed by hs-CRP, TC, very low-density lipoprotein (VLDL), and LDL. However, high-density lipoprotein (HDL) is a poor indicator of association with chronic periodontitis and PAD.

  13. Statistical Approach To Estimate Vaccinia-Specific Neutralizing Antibody Titers Using a High-Throughput Assay

    PubMed Central

    Kennedy, Richard; Pankratz, V. Shane; Swanson, Eric; Watson, David; Golding, Hana; Poland, Gregory A.

    2009-01-01

    Because of the bioterrorism threat posed by agents such as variola virus, considerable time, resources, and effort have been devoted to biodefense preparation. One avenue of this research has been the development of rapid, sensitive, high-throughput assays to validate immune responses to poxviruses. Here we describe the adaptation of a β-galactosidase reporter-based vaccinia virus neutralization assay to large-scale use in a study that included over 1,000 subjects. We also describe the statistical methods involved in analyzing the large quantity of data generated. The assay and its associated methods should prove useful tools in monitoring immune responses to next-generation smallpox vaccines, studying poxvirus immunity, and evaluating therapeutic agents such as vaccinia virus immune globulin. PMID:19535540

  14. Results of a joint NOAA/NASA sounder simulation study

    NASA Technical Reports Server (NTRS)

    Phillips, N.; Susskind, Joel; Mcmillin, L.

    1988-01-01

    This paper presents the results of a joint NOAA and NASA sounder simulation study in which the accuracies of atmospheric temperature profiles and surface skin temperature measurements retrieved from two sounders were compared: (1) the currently used IR temperature sounder HIRS2 (High-resolution Infrared Radiation Sounder 2); and (2) the recently proposed high-spectral-resolution IR sounder AMTS (Advanced Moisture and Temperature Sounder). Simulations were conducted for both clear and partial cloud conditions. Data were analyzed at NASA using a physical inversion technique and at NOAA using a statistical technique. Results show significant improvement of AMTS compared to HIRS2 for both clear and cloudy conditions. The improvements are indicated by both methods of data analysis, but the physical retrievals outperform the statistical retrievals.

  15. Experimental design and statistical methods for improved hit detection in high-throughput screening.

    PubMed

    Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert

    2010-09-01

    Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
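
The first step the authors describe (removing row, column, and plate biases) can be sketched with a plain mean polish; this is a simplified stand-in for the trimmed-mean polish evaluated in the paper, using a small hypothetical plate:

```python
def mean_polish(plate, n_iter=10):
    """Remove additive row and column biases from a plate of readings
    by iteratively sweeping out row means and then column means,
    leaving residuals in which real hits stand out."""
    rows, cols = len(plate), len(plate[0])
    resid = [row[:] for row in plate]
    for _ in range(n_iter):
        for i in range(rows):
            m = sum(resid[i]) / cols
            for j in range(cols):
                resid[i][j] -= m
        for j in range(cols):
            m = sum(resid[i][j] for i in range(rows)) / rows
            for i in range(rows):
                resid[i][j] -= m
    return resid

# Hypothetical 3x3 plate whose values are purely row + column effects
# (e.g. a +5 bias in the last column); residuals should be ~zero.
plate = [[1, 2, 7],
         [0, 1, 6],
         [2, 3, 8]]
resid = mean_polish(plate)
print(resid)
```

A well in which the residual remains large after polishing is a candidate hit; the paper then benchmarks such residuals against replicate-based error estimates (e.g. the RVM t-test) rather than raw intensities.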

  16. Statistics of Private High Schools and Academies, 1919-20. Bulletin, 1922, No. 9

    ERIC Educational Resources Information Center

    Bonner, H. R.

    1922-01-01

    The included tables present the statistics of 2,093 private high schools and academies in the continental United States and of 4 such schools in Hawaii and Puerto Rico. Throughout the summary tables the totals for the United States do not include the statistics of these 4 schools in the outlying possessions. No reports from private high schools…

  17. Demography as Destiny the Role of Parental Connoisseurship and Mathematics Course Taking Patterns among High School Students

    ERIC Educational Resources Information Center

    Degner, Katherine Marie

    2012-01-01

    This study uses data from the National Center of Education Statistics (NCES) High School Longitudinal Study of 2009 (HSLS:09). Parent responses to the Parent Involvement survey, given as part of the NCES study were considered, along with their child's socio-economic status and self-reported level of mathematics course enrollment during their…

  18. Interventions for reducing self-stigma in people with mental illnesses: a systematic review of randomized controlled trials.

    PubMed

    Büchter, Roland Brian; Messer, Melanie

    2017-01-01

    Background: Self-stigma occurs when people with mental illnesses internalize negative stereotypes and prejudices about their condition. It can reduce help-seeking behaviour and treatment adherence. The effectiveness of interventions aimed at reducing self-stigma in people with mental illness is systematically reviewed, and results are discussed in the context of a logic model of the broader social context of mental illness stigma. Methods: Medline, Embase, PsycINFO, ERIC, and CENTRAL were searched for randomized controlled trials in November 2013. Studies were assessed with the Cochrane risk of bias tool. Results: Five trials were eligible for inclusion, four of which provided data for statistical analyses. Four studies had a high risk of bias. The quality of evidence was very low for each set of interventions and outcomes. The interventions studied included various group-based anti-stigma interventions and an anti-stigma booklet. The intensity and fidelity of most interventions were high. Two studies were considered sufficiently homogeneous to be pooled for the outcome self-stigma. The meta-analysis did not find a statistically significant effect (SMD [95% CI] at 3 months: -0.26 [-0.64, 0.12], I² = 0%, n = 108). None of the individual studies found sustainable effects on other outcomes, including recovery, help-seeking behaviour and self-stigma. Conclusions: The effectiveness of interventions against self-stigma is uncertain. Previous studies lacked statistical power, used questionable outcome measures and had a high risk of bias. Future studies should be based on robust methods and consider practical implications for intervention development (relevance, implementability, and placement in routine services).

  19. Statistical trends of episiotomy around the world: Comparative systematic review of changing practices.

    PubMed

    Clesse, Christophe; Lighezzolo-Alnot, Joëlle; De Lavergne, Sylvie; Hamlin, Sandrine; Scheffler, Michèle

    2018-06-01

    The authors' purpose in this article is to identify, review and interpret all publications reporting episiotomy rates worldwide. Based on criteria from the PRISMA guidelines, twenty databases were scrutinized. All studies that included national statistics on episiotomy were selected, as well as studies presenting estimated data. Sixty-one papers were selected, with publication dates between 1995 and 2016. A static and dynamic analysis of all the results was carried out. The assumption of a decline in the number of episiotomies is discussed and confirmed, while noting that high episiotomy rates persist in less industrialized countries and East Asia. Finally, our analysis investigates the potential determinants underlying the apparent statistical disparities.

  20. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing its vibroacoustic response predictions with Statistical Energy Analysis (SEA) predictions and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  1. Estimation of elastic moduli of graphene monolayer in lattice statics approach at nonzero temperature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zubko, I. Yu., E-mail: zoubko@list.ru; Kochurov, V. I.

    2015-10-27

    With the aim of controlling crystal temperature, a computational-statistical approach to studying the thermo-mechanical properties of finite-sized crystals is presented. The approach combines high-performance computational techniques with statistical analysis of the crystal's response to external thermo-mechanical actions, for specimens with statistically small numbers of atoms (for instance, nanoparticles). The thermal motion of atoms is imitated in the statics approach by including independent degrees of freedom associated with atomic oscillations. We found that, under heating, the material response of graphene is nonsymmetric.

  2. The statistical reporting quality of articles published in 2010 in five dental journals.

    PubMed

    Vähänikkilä, Hannu; Tjäderhane, Leo; Nieminen, Pentti

    2015-01-01

    Statistical methods play an important role in medical and dental research. Earlier studies have observed that the current use of methods and reporting of statistics are responsible for some of the errors in the interpretation of results. The aim of this study was to investigate the quality of statistical reporting in dental research articles. A total of 200 articles published in 2010 were analysed, covering five dental journals: Journal of Dental Research, Caries Research, Community Dentistry and Oral Epidemiology, Journal of Dentistry and Acta Odontologica Scandinavica. Each paper underwent careful scrutiny for the use of statistical methods and reporting. A paper with at least one poor reporting item was classified as 'problems with reporting statistics' and a paper without any poor reporting item as 'acceptable'. The investigation showed that 18 (9%) papers were acceptable and 182 (91%) contained at least one poor reporting item, a high proportion. Authors of dental research articles should be encouraged to improve their statistical sections and to present results in line with the policies and presentation of the leading dental journals.

  3. A longitudinal analysis of burnout in middle and high school Korean teachers.

    PubMed

    Park, Yang Min; Lee, Sang Min

    2013-12-01

    This study examines longitudinal relationships among three burnout dimensions in middle and high school teachers. For this study, 419 middle and high school teachers participated in a panel survey, which was conducted in three waves. Using Amos 7.0, we performed autoregressive cross-lagged modeling to obtain a complete picture of the longitudinal relationships among the three factors of the Maslach Burnout Inventory-Educator Survey. Results indicated that the paths from emotional exhaustion at Time1 and Time2 to depersonalization at Time2 and Time3 were statistically significant. In addition, the paths from personal accomplishment at Time1 and Time2 to depersonalization at Time2 and Time3 were also statistically significant. Empirically identifying the process by which burnout occurs could help practitioners and policy makers to design burnout prevention strategies. Copyright © 2012 John Wiley & Sons, Ltd.

  4. Educational Leadership as Best Practice in Highly Effective Schools in the Autonomous Region of the Basque County (Spain)

    ERIC Educational Resources Information Center

    Intxausti, Nahia; Joaristi, Luis; Lizasoain, Luis

    2016-01-01

    This study presents part of a research project currently underway which aims to characterise the best practices of highly effective schools in the Autonomous Region of the Basque Country (Spain). Multilevel statistical modelling and hierarchical linear models were used to select 32 highly effective schools, with highly effective being taken to…

  5. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis

    PubMed Central

    Lin, Johnny; Bentler, Peter M.

    2012-01-01

    Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at very small sample sizes offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas, examined with either open or closed book, were used to illustrate the real-world performance of this statistic. PMID:23144511
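
The core idea of adjusting degrees of freedom to the skewness of the obtained statistic can be sketched by moment matching: a chi-square variable with d degrees of freedom has skewness sqrt(8/d), so d can be chosen as 8/skew². The function below illustrates this matching on simulated data; it is a sketch of the third-moment idea, not the paper's exact estimator:

```python
import random

random.seed(1)

def skewness_matched_df(sample):
    """Choose a chi-square df by matching the sample's skewness:
    chi2 with d degrees of freedom has skewness sqrt(8/d),
    so d = 8 / skew**2."""
    n = len(sample)
    mean = sum(sample) / n
    m2 = sum((x - mean) ** 2 for x in sample) / n
    m3 = sum((x - mean) ** 3 for x in sample) / n
    skew = m3 / m2 ** 1.5
    return 8 / skew ** 2

# Sanity check: statistics that really are chi-square with 5 df
# (sums of 5 squared standard normals) should recover d near 5.
sample = [sum(random.gauss(0, 1) ** 2 for _ in range(5)) for _ in range(20000)]
d_hat = skewness_matched_df(sample)
print(f"estimated df: {d_hat:.2f}")
```

When the empirical distribution of the test statistic is more skewed than the nominal chi-square, this matching yields fewer effective degrees of freedom, shifting the critical value to control the Type I error rate.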

  6. A new u-statistic with superior design sensitivity in matched observational studies.

    PubMed

    Rosenbaum, Paul R

    2011-09-01

    In an observational or nonrandomized study of treatment effects, a sensitivity analysis indicates the magnitude of bias from unmeasured covariates that would need to be present to alter the conclusions of a naïve analysis that presumes adjustments for observed covariates suffice to remove all bias. The power of sensitivity analysis is the probability that it will reject a false hypothesis about treatment effects allowing for a departure from random assignment of a specified magnitude; in particular, if this specified magnitude is "no departure" then this is the same as the power of a randomization test in a randomized experiment. A new family of u-statistics is proposed that includes Wilcoxon's signed rank statistic but also includes other statistics with substantially higher power when a sensitivity analysis is performed in an observational study. Wilcoxon's statistic has high power to detect small effects in large randomized experiments-that is, it often has good Pitman efficiency-but small effects are invariably sensitive to small unobserved biases. Members of this family of u-statistics that emphasize medium to large effects can have substantially higher power in a sensitivity analysis. For example, in one situation with 250 pair differences that are Normal with expectation 1/2 and variance 1, the power of a sensitivity analysis that uses Wilcoxon's statistic is 0.08 while the power of another member of the family of u-statistics is 0.66. The topic is examined by performing a sensitivity analysis in three observational studies, using an asymptotic measure called the design sensitivity, and by simulating power in finite samples. The three examples are drawn from epidemiology, clinical medicine, and genetic toxicology. © 2010, The International Biometric Society.
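
As the baseline member of the family discussed above, Wilcoxon's signed rank statistic with its large-sample normal approximation can be sketched as follows (no handling of ties or zero differences; a simplification for illustration):

```python
import math

def signed_rank_z(diffs):
    """Wilcoxon's signed rank statistic T+ for paired differences,
    standardized by its null mean n(n+1)/4 and variance
    n(n+1)(2n+1)/24 (large-sample normal approximation)."""
    n = len(diffs)
    # Rank the differences by absolute value (1 = smallest)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    t_plus = sum(rank + 1 for rank, i in enumerate(order) if diffs[i] > 0)
    mean = n * (n + 1) / 4
    var = n * (n + 1) * (2 * n + 1) / 24
    return (t_plus - mean) / math.sqrt(var)

# Ten positive paired differences give the maximal statistic:
z = signed_rank_z([0.1 * k for k in range(1, 11)])
print(f"z = {z:.3f}")
```

The u-statistics proposed in the paper generalize this construction, re-weighting comparisons toward medium and large differences so that true effects remain detectable when a sensitivity analysis allows for hidden bias.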

  7. High-Throughput Nanoindentation for Statistical and Spatial Property Determination

    NASA Astrophysics Data System (ADS)

    Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.

    2018-04-01

    Standard nanoindentation tests are "high throughput" compared to nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be significantly improved. These higher testing rates enable otherwise impractical studies requiring several thousands of indents, such as high-resolution property mapping and detailed statistical studies. However, care must be taken to avoid systematic errors in the measurement, including choosing the indentation depth/spacing to avoid overlap of plastic zones, pileup, and influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain rate sensitivity must also be considered. A review of these effects is given, with the emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique, including mapping of welds, microstructures, and composites with varying length scales, along with studying the effect of surface roughness on nominally homogeneous specimens, will be presented.

  8. A study of the relationships between "highly qualified" status, instructional practices, and students' science achievement in three high poverty Louisiana school systems

    NASA Astrophysics Data System (ADS)

    Clayton, Michelle

    Using a mixed methods research design, the author examined the relationships between "highly qualified" status, instructional practices, and students' science achievement for six third grade teachers in three high poverty Louisiana school systems. The study analyzed qualitative and quantitative data for three science classes taught by "highly qualified" teachers and three science classes taught by "non-highly qualified" teachers. The qualitative portion of the study was conducted through classroom observations, teacher interviews, and lesson plan reviews. The qualitative data were coded and triangulated to determine whether the instructional practices of each teacher were more "teacher-centered" or "student-centered." The qualitative data analysis indicated various patterns and consistencies in the instructional practices used by the "highly qualified" and "non-highly qualified" teachers selected for this study. The quantitative portion of the study involved analysis of the students' science achievement data for the six third grade science teachers selected for the study. Science achievement was measured by the third grade Integrated Louisiana Education Assessment Program (iLEAP) scores. A two-way ANOVA indicated that there were statistically significant differences in mean scores among the three high poverty Louisiana school systems, between the students taught by "highly qualified" and "non-highly qualified" teachers, and for the interaction between the two: F(2, 123) = 46.99, p < 0.01; F(1, 123) = 4.54, p = 0.035; F(2, 123) = 3.73, p = 0.027. A separate one-way ANOVA indicated that statistically significant differences existed between the six participating teachers in the study: F(5, 123) = 20.386, p < 0.01. Tukey's HSD post-hoc tests and homogeneous subset analyses were conducted to determine which teachers' scores significantly differed from each other.
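
    The F statistics reported here come from standard ANOVA decompositions. As a minimal sketch (with made-up toy scores, not the iLEAP data), the one-way F statistic is the ratio of between-group to within-group mean squares:

```python
def one_way_anova_F(groups):
    """F = MS_between / MS_within for a list of groups of scores."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)   # df_between = k - 1
    ms_within = ss_within / (n - k)     # df_within  = n - k
    return ms_between / ms_within

# Toy example: three "teachers" with three students each.
# Group means 2, 3, 7; grand mean 4 -> SSB = 42, SSW = 6, F = 21.
```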

  9. Musical Experience Influences Statistical Learning of a Novel Language

    PubMed Central

    Shook, Anthony; Marian, Viorica; Bartolotti, James; Schroeder, Scott R.

    2014-01-01

    Musical experience may benefit learning a new language by enhancing the fidelity with which the auditory system encodes sound. In the current study, participants with varying degrees of musical experience were exposed to two statistically-defined languages consisting of auditory Morse-code sequences which varied in difficulty. We found an advantage for highly-skilled musicians, relative to less-skilled musicians, in learning novel Morse-code based words. Furthermore, in the more difficult learning condition, performance of lower-skilled musicians was mediated by their general cognitive abilities. We suggest that musical experience may lead to enhanced processing of statistical information and that musicians’ enhanced ability to learn statistical probabilities in a novel Morse-code language may extend to natural language learning. PMID:23505962

  10. High Agreement and High Prevalence: The Paradox of Cohen's Kappa.

    PubMed

    Zec, Slavica; Soriani, Nicola; Comoretto, Rosanna; Baldi, Ileana

    2017-01-01

    Cohen's Kappa is the most widely used agreement statistic in the literature. However, under certain conditions it is affected by a paradox which returns biased estimates of the statistic itself. The aim of the study is to provide sufficient information to allow the reader to make an informed choice of the correct agreement measure, by underlining some optimal properties of Gwet's AC1 in comparison to Cohen's Kappa, using a real data example. During the process of literature review, we asked a panel of three evaluators to judge the quality of 57 randomized controlled trials, assigning a score to each trial using the Jadad scale. The quality was evaluated according to the following dimensions: adopted design, randomization unit, and type of primary endpoint. With respect to each of these features, the agreement between the three evaluators was calculated using Cohen's Kappa statistic and Gwet's AC1 statistic, and the values were then compared with the observed agreement. The values of Cohen's Kappa statistic would lead one to believe that the agreement levels for the variables Unit, Design and Primary Endpoints are totally unsatisfactory. The AC1 statistic, on the contrary, shows plausible values which are in line with the respective values of the observed concordance. We conclude that it would always be appropriate to adopt the AC1 statistic, thus bypassing any risk of incurring the paradox and drawing wrong conclusions about the results of agreement analysis.
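
    The paradox can be reproduced with a few lines of arithmetic. The sketch below handles the simplest case of two raters and a binary scale, with illustrative counts (not the Jadad-scale data from the study):

```python
def kappa_and_ac1(a, b, c, d):
    """Cohen's Kappa and Gwet's AC1 for two raters, binary ratings.
    a = both 'yes', b = rater1 yes / rater2 no,
    c = rater1 no / rater2 yes, d = both 'no'."""
    n = a + b + c + d
    po = (a + d) / n                          # observed agreement
    p1 = (a + b) / n                          # rater 1 'yes' proportion
    p2 = (a + c) / n                          # rater 2 'yes' proportion
    pe_kappa = p1 * p2 + (1 - p1) * (1 - p2)  # chance agreement (Kappa)
    kappa = (po - pe_kappa) / (1 - pe_kappa)
    pi = (p1 + p2) / 2                        # average 'yes' prevalence
    pe_ac1 = 2 * pi * (1 - pi)                # chance agreement (AC1)
    ac1 = (po - pe_ac1) / (1 - pe_ac1)
    return kappa, ac1

# Skewed prevalence with high observed agreement (85 of 100 items):
kappa, ac1 = kappa_and_ac1(80, 10, 5, 5)
# Kappa is only about 0.32 despite 85% agreement; AC1 is about 0.81.
```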

  11. Patients and medical statistics. Interest, confidence, and ability.

    PubMed

    Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert

    2005-11-01

    People are increasingly presented with medical statistics. There are no existing measures to assess their level of interest or confidence in using medical statistics. To develop 2 new measures, the STAT-interest and STAT-confidence scales, and assess their reliability and validity. Survey with retest after approximately 2 weeks. Two hundred and twenty-four people were recruited from advertisements in local newspapers, an outpatient clinic waiting area, and a hospital open house. We developed and revised 5 items on interest in medical statistics and 3 on confidence understanding statistics. Study participants were mostly college graduates (52%); 25% had a high school education or less. The mean age was 53 (range 20 to 84) years. Most paid attention to medical statistics (6% paid no attention). The mean (SD) STAT-interest score was 68 (17) and ranged from 15 to 100. Confidence in using statistics was also high: the mean (SD) STAT-confidence score was 65 (19) and ranged from 11 to 100. STAT-interest and STAT-confidence scores were moderately correlated (r=.36, P<.001). Both scales demonstrated good test-retest repeatability (r=.60, .62, respectively), internal consistency reliability (Cronbach's alpha=0.70 and 0.78), and usability (individual item nonresponse ranged from 0% to 1.3%). Scale scores correlated only weakly with scores on a medical data interpretation test (r=.15 and .26, respectively). The STAT-interest and STAT-confidence scales are usable and reliable. Interest and confidence were only weakly related to the ability to actually use data.
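
    The reliability figures quoted here (Cronbach's alpha of 0.70 and 0.78) come from the standard internal-consistency formula. A minimal sketch with toy data, not the STAT-scale responses:

```python
def sample_var(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per scale item, all over the same respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]
    return (k / (k - 1)) * (1 - sum(sample_var(it) for it in items)
                            / sample_var(totals))

# Three items, four respondents; two items track each other perfectly,
# the third adds a little noise, so alpha lands just under 1.
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [2, 2, 3, 5]])
```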

  12. Patients and Medical Statistics

    PubMed Central

    Woloshin, Steven; Schwartz, Lisa M; Welch, H Gilbert

    2005-01-01

    BACKGROUND People are increasingly presented with medical statistics. There are no existing measures to assess their level of interest or confidence in using medical statistics. OBJECTIVE To develop 2 new measures, the STAT-interest and STAT-confidence scales, and assess their reliability and validity. DESIGN Survey with retest after approximately 2 weeks. SUBJECTS Two hundred and twenty-four people were recruited from advertisements in local newspapers, an outpatient clinic waiting area, and a hospital open house. MEASURES We developed and revised 5 items on interest in medical statistics and 3 on confidence understanding statistics. RESULTS Study participants were mostly college graduates (52%); 25% had a high school education or less. The mean age was 53 (range 20 to 84) years. Most paid attention to medical statistics (6% paid no attention). The mean (SD) STAT-interest score was 68 (17) and ranged from 15 to 100. Confidence in using statistics was also high: the mean (SD) STAT-confidence score was 65 (19) and ranged from 11 to 100. STAT-interest and STAT-confidence scores were moderately correlated (r=.36, P<.001). Both scales demonstrated good test–retest repeatability (r=.60, .62, respectively), internal consistency reliability (Cronbach's α=0.70 and 0.78), and usability (individual item nonresponse ranged from 0% to 1.3%). Scale scores correlated only weakly with scores on a medical data interpretation test (r=.15 and .26, respectively). CONCLUSION The STAT-interest and STAT-confidence scales are usable and reliable. Interest and confidence were only weakly related to the ability to actually use data. PMID:16307623

  13. Spin flip statistics and spin wave interference patterns in Ising ferromagnetic films: A Monte Carlo study.

    PubMed

    Acharyya, Muktish

    2017-07-01

    Spin wave interference is studied in a two-dimensional Ising ferromagnet driven by two coherent spherical magnetic field waves, using Monte Carlo simulation. The spin waves are found to propagate and interfere according to the classic rules for an interference pattern generated by two point sources. The interference pattern is observed at one boundary of the lattice and is detected and studied through spin flip statistics at high and low temperatures. Destructive interference is manifested as a large number of spin flips, and vice versa.
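
    The spin flip counting that underlies this detection scheme can be sketched with single-spin-flip Metropolis dynamics on a small square lattice (J = 1, k_B = 1). This is a generic illustration only: the coherent field-wave driving used in the study is omitted.

```python
import math
import random

def metropolis_sweep(spins, T, rng):
    """One Metropolis sweep of an L x L Ising lattice with periodic
    boundaries; returns the number of accepted spin flips."""
    L = len(spins)
    flips = 0
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nn  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i][j] = -spins[i][j]
            flips += 1
    return flips

L = 16
hot_flips = metropolis_sweep([[1] * L for _ in range(L)], 10.0, random.Random(1))
cold_flips = metropolis_sweep([[1] * L for _ in range(L)], 0.5, random.Random(1))
# Far more flips are accepted at high temperature than at low temperature.
```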

  14. An Analysis of High School Students' Performance on Five Integrated Science Process Skills

    NASA Astrophysics Data System (ADS)

    Beaumont-Walters, Yvonne; Soyibo, Kola

    2001-02-01

    This study determined Jamaican high school students' level of performance on five integrated science process skills and whether there were statistically significant differences in their performance linked to their gender, grade level, school location, school type, student type and socio-economic background (SEB). The 305 subjects comprised 133 males, 172 females, 146 ninth graders, 159 10th graders, 150 traditional and 155 comprehensive high school students, 164 students from the Reform of Secondary Education (ROSE) project and 141 non-ROSE students, 166 urban and 139 rural students, and 110 students from a high SEB and 195 from a low SEB. Data were collected with an integrated science process skills test constructed by the authors. The results indicated that the subjects' mean score was low and unsatisfactory; their performance in decreasing order was: interpreting data, recording data, generalising, formulating hypotheses and identifying variables; there were statistically significant differences in their performance based on their grade level, school type, student type, and SEB in favour of the 10th graders, traditional high school students, ROSE students and students from a high SEB. There was a positive, statistically significant and fairly strong relationship between their performance and school type, but weak relationships among their student type, grade level and SEB and performance.

  15. Is the association between high strain work and depressive symptoms modified by private life social support: a cohort study of 1,074 Danish employees?

    PubMed Central

    2014-01-01

    Background Previous studies have shown that psychosocial working conditions characterized by high psychological demands and low decision latitude (i.e., high strain work) are associated with increased risk of depressive symptoms. Little is known, however, concerning how this association may be modified by factors outside the working environment. This article examines the modifying role of private life social support in the relation between high strain work and the development of severe depressive symptoms. Methods Data were questionnaire-based, collected from a cross-occupational sample of 1,074 Danish employees. At baseline, all participants were free of severe depressive symptoms, measured by the Mental Health Inventory. High strain work was defined by the combination of high psychological demands at work and low control, measured with multi-dimensional scales. Private life social support was operationalized as the number of life domains with confidants and dichotomized as low (0–1 domains) or high (2 or more domains). Using logistic regression we examined the risk of onset of severe depressive symptoms, adjusting for sex, age, occupational position, and prior depressive symptoms. Results Separately, neither high strain work nor low private life social support statistically significantly predicted depressive symptoms. However, participants with joint exposure to high strain work and low private life social support had an Odds ratio (OR) for severe depressive symptoms of 3.41 (95% CI: 1.36-8.58), compared to participants with no work strain and high private life social support. There was no increased risk for participants with high strain work and high private life social support (OR = 1.32, 95% CI: 0.65-2.68). The interaction term for departure from additivity was, however, not statistically significant (p = 0.18). 
Conclusions Our findings suggest that high strain work may increase risk of depressive symptoms in individuals with low private life social support, although the effect-modification was statistically non-significant. Larger studies are needed to further establish the role of private life social support in the relation between high strain work and depression. PMID:25005843

  16. Is the association between high strain work and depressive symptoms modified by private life social support: a cohort study of 1,074 Danish employees?

    PubMed

    Madsen, Ida E H; Jorgensen, Anette F B; Borritz, Marianne; Nielsen, Martin L; Rugulies, Reiner

    2014-07-08

    Previous studies have shown that psychosocial working conditions characterized by high psychological demands and low decision latitude (i.e., high strain work) are associated with increased risk of depressive symptoms. Little is known, however, concerning how this association may be modified by factors outside the working environment. This article examines the modifying role of private life social support in the relation between high strain work and the development of severe depressive symptoms. Data were questionnaire-based, collected from a cross-occupational sample of 1,074 Danish employees. At baseline, all participants were free of severe depressive symptoms, measured by the Mental Health Inventory. High strain work was defined by the combination of high psychological demands at work and low control, measured with multi-dimensional scales. Private life social support was operationalized as the number of life domains with confidants and dichotomized as low (0-1 domains) or high (2 or more domains). Using logistic regression we examined the risk of onset of severe depressive symptoms, adjusting for sex, age, occupational position, and prior depressive symptoms. Separately, neither high strain work nor low private life social support statistically significantly predicted depressive symptoms. However, participants with joint exposure to high strain work and low private life social support had an Odds ratio (OR) for severe depressive symptoms of 3.41 (95% CI: 1.36-8.58), compared to participants with no work strain and high private life social support. There was no increased risk for participants with high strain work and high private life social support (OR = 1.32, 95% CI: 0.65-2.68). The interaction term for departure from additivity was, however, not statistically significant (p = 0.18). 
Our findings suggest that high strain work may increase risk of depressive symptoms in individuals with low private life social support, although the effect-modification was statistically non-significant. Larger studies are needed to further establish the role of private life social support in the relation between high strain work and depression.

  17. Selenium- or vitamin E-related gene variants, interaction with supplementation, and risk of high-grade prostate cancer in SELECT

    PubMed Central

    Chan, June M.; Darke, Amy K.; Penney, Kathryn L.; Tangen, Catherine M.; Goodman, Phyllis J.; Lee, Gwo-Shu Mary; Sun, Tong; Peisch, Sam; Tinianow, Alex M.; Rae, James M.; Klein, Eric A.; Thompson, Ian M.

    2016-01-01

    Background Epidemiological studies and secondary analyses of randomized trials supported the hypothesis that selenium and vitamin E lower prostate cancer risk. However, the Selenium and Vitamin E Cancer Prevention Trial (SELECT) showed no benefit of either supplement. Genetic variants involved in selenium or vitamin E metabolism or transport may underlie the complex associations of selenium and vitamin E. Methods We undertook a case-cohort study of SELECT participants randomized to placebo, selenium or vitamin E. The subcohort included 1,434 men; our primary outcome was high-grade prostate cancer (N=278 cases, Gleason 7 or higher cancer). We used weighted Cox regression to examine the association between SNPs and high-grade prostate cancer risk. To assess effect modification, we created interaction terms between randomization arm and genotype and calculated log likelihood statistics. Results We noted statistically significant (p<0.05) interactions between selenium assignment, SNPs in CAT, SOD2, PRDX6, SOD3, and TXNRD2 and high-grade prostate cancer risk. Statistically significant SNPs that modified the association of vitamin E assignment and high-grade prostate cancer included SEC14L2, SOD1, and TTPA. In the placebo arm, several SNPs, hypothesized to interact with supplement assignment and risk of high-grade prostate cancer, were also directly associated with outcome. Conclusion Variants in selenium and vitamin E metabolism/transport genes may influence risk of overall and high-grade prostate cancer, and may modify an individual man’s response to vitamin E or selenium supplementation with regard to these risks. Impact The effect of selenium or vitamin E supplementation on high-grade prostate cancer risk may vary by genotype. PMID:27197287

  18. Testing high SPF sunscreens: a demonstration of the accuracy and reproducibility of the results of testing high SPF formulations by two methods and at different testing sites.

    PubMed

    Agin, Patricia Poh; Edmonds, Susan H

    2002-08-01

    The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test-subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 waterproof SPF test method. All laboratories utilized xenon arc solar simulators. The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20-25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products.

  19. The U.S. Air Force Photorefractive Keratectomy (PRK) Study: Evaluation of Residual Refractive Error and High- and Low-Contrast Visual Acuity

    DTIC Science & Technology

    2006-07-01

    Report AFRL-SA-BR-TR-2010-0011, July 2006. Values for statistical analyses are expressed in terms of Snellen equivalent VA (Ref 44) and lines gained vs. lost after PRK.

  20. Hypothesis testing for band size detection of high-dimensional banded precision matrices.

    PubMed

    An, Baiguo; Guo, Jianhua; Liu, Yufeng

    2014-06-01

    Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.

  1. Mass action at the single-molecule level.

    PubMed

    Shon, Min Ju; Cohen, Adam E

    2012-09-05

    We developed a system to reversibly encapsulate small numbers of molecules in an array of nanofabricated "dimples". This system enables highly parallel, long-term, and attachment-free studies of molecular dynamics via single-molecule fluorescence. In studies of bimolecular reactions of small numbers of confined molecules, we see phenomena that, while expected from basic statistical mechanics, are not observed in bulk chemistry. Statistical fluctuations in the occupancy of sealed reaction chambers lead to steady-state fluctuations in reaction equilibria and rates. These phenomena are likely to be important whenever reactions happen in confined geometries.
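
    The steady-state occupancy fluctuations described here follow from the near-Poisson statistics of confinement: with mean occupancy λ per chamber, the relative fluctuation is about 1/√λ. A small simulation with illustrative parameters (not the paper's dimple geometry):

```python
import math
import random

def occupancy_fluctuation(n_molecules, n_chambers, seed=0):
    """Place molecules uniformly at random into sealed chambers and return
    (mean occupancy, relative fluctuation sigma/mu of chamber occupancy)."""
    rng = random.Random(seed)
    counts = [0] * n_chambers
    for _ in range(n_molecules):
        counts[rng.randrange(n_chambers)] += 1
    mu = n_molecules / n_chambers
    var = sum((c - mu) ** 2 for c in counts) / n_chambers
    return mu, math.sqrt(var) / mu

# Mean occupancy 4 -> Poisson statistics predict sigma/mu ~ 1/sqrt(4) = 0.5.
mu, rel = occupancy_fluctuation(40000, 10000)
```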

  2. Video game addiction in children and teenagers in Taiwan.

    PubMed

    Chiu, Shao-I; Lee, Jie-Zhi; Huang, Der-Hsiang

    2004-10-01

    Video game addiction in children and teenagers in Taiwan is associated with levels of animosity, social skills, and academic achievement. This study suggests that video game addiction can be statistically predicted from measures of hostility, and the group with high video game addiction showed more hostility than the others. Both gender and video game addiction are negatively associated with academic achievement. Family function, sensation seeking, gender, and boredom have statistically positive relationships with levels of social skills. Current models of video game addiction do not seem to fit the findings of this study.

  3. High School Grades and University Performance: A Case Study

    ERIC Educational Resources Information Center

    Cyrenne, Philippe; Chan, Alan

    2012-01-01

    A critical issue facing a number of colleges and universities is how to allocate first year places to incoming students. The decision to admit students is often based on a number of factors, but a key statistic is a student's high school grades. This paper reports on a case study of the subsequent performance at the University of Winnipeg of high…

  4. The Perceptions of Principals and Teachers Regarding Mental Health Providers' Impact on Student Achievement in High Poverty Schools

    ERIC Educational Resources Information Center

    Perry, Teresa

    2012-01-01

    This study examined the perceptions of principals and teachers regarding mental health provider's impact on student achievement and behavior in high poverty schools using descriptive statistics, t-test, and two-way ANOVA. Respondents in this study shared similar views concerning principal and teacher satisfaction and levels of support for the…

  5. Should I Stay or Should I Go? A Comparison Study of Intention to Leave among Public Child Welfare Systems with High and Low Turnover Rates

    ERIC Educational Resources Information Center

    Strolin-Goltzman, Jessica

    2008-01-01

    This comparison study analyzes the commonalties, similarities, and differences on supervisory and organizational factors between a group of high turnover systems and a group of low turnover systems. Significant differences on organizational factors, but not on supervisory factors, emerged from the statistical analysis. Additionally, this study…

  6. Altered Development of White Matter in Youth at High Familial Risk for Bipolar Disorder: A Diffusion Tensor Imaging Study

    ERIC Educational Resources Information Center

    Versace, Amelia; Ladouceur, Cecile D.; Romero, Soledad; Birmaher, Boris; Axelson, David A.; Kupfer, David J.; Phillips, Mary L.

    2010-01-01

    Objective: To study white matter (WM) development in youth at high familial risk for bipolar disorder (BD). WM alterations are reported in youth and adults with BD. WM undergoes important maturational changes in adolescence. Age-related changes in WM microstructure using diffusion tensor imaging with tract-based spatial statistics in healthy…

  7. The Relationship between the Catholic Teacher's Faith and Commitment in the Catholic High School

    ERIC Educational Resources Information Center

    Cho, Young Kwan

    2012-01-01

    This study investigates the relationship between Catholic teachers' faith and their school commitment in Catholic high schools. A national sample of 751 teachers from 39 Catholic high schools in 15 archdioceses in the United States participated in a self-administered website survey. Data were analyzed using descriptive statistics and the Pearson…

  8. The Effects of Various High School Scheduling Models on Student Achievement in Michigan

    ERIC Educational Resources Information Center

    Pickell, Russell E.

    2017-01-01

    This study reviews research and data to determine whether student achievement is affected by the high school scheduling model, and whether changes in scheduling models result in statistically significant changes in student achievement, as measured by the ACT Composite, ACT English Language Arts, and ACT Math scores. The high school scheduling…

  9. Statistical and Spatial Analysis of Bathymetric Data for the St. Clair River, 1971-2007

    USGS Publications Warehouse

    Bennion, David

    2009-01-01

    To address questions concerning ongoing geomorphic processes in the St. Clair River, selected bathymetric datasets spanning 36 years were analyzed. Comparisons of recent high-resolution datasets covering the upper river indicate a highly variable, active environment. Although statistical and spatial comparisons of the datasets show that some changes to the channel size and shape have taken place during the study period, uncertainty associated with the various survey methods and interpolation processes limits the results that can be stated with statistical certainty. The methods used to spatially compare the datasets are sensitive to small variations in position and depth that are within the range of uncertainty associated with the datasets. Characteristics of the data, such as the density of measured points and the range of values surveyed, can also influence the results of spatial comparison. With due consideration of these limitations, apparently active and ongoing areas of elevation change in the river are mapped and discussed.

  10. A survey of work engagement and psychological capital levels.

    PubMed

    Bonner, Lynda

    2016-08-11

    To evaluate the relationship between work engagement and psychological capital (PsyCap) levels reported by registered nurses. PsyCap is a developable human resource. Research on PsyCap as an antecedent to work engagement in nurses is needed. A convenience sample of 137 registered nurses participated in this quantitative cross-sectional survey. Questionnaires measured self-reported levels of work engagement and psychological capital. Descriptive and inferential statistics were used for data analysis. There was a statistically significant correlation between work engagement and PsyCap scores (r=0.633, p<0.01). Nurses working at band 5 level reported statistically significantly lower PsyCap scores compared with nurses working at band 6 and 7 levels. Nurses reporting high levels of work engagement also reported high levels of PsyCap. Band 5 nurses might benefit most from interventions to increase their PsyCap. This study supports PsyCap as an antecedent to work engagement.

  11. Mid-infrared interferometry of AGNs: A statistical view into the dusty nuclear environment of the Seyfert Galaxies.

    NASA Astrophysics Data System (ADS)

    Lopez-Gonzaga, N.

    2015-09-01

    The high resolution achieved by the MIDI instrument at the VLTI has made it possible to obtain more detailed information about the geometry and structure of the nuclear mid-infrared emission of AGNs, but in the absence of real images the interpretation of the results is not an easy task. To profit more from the high resolution data, we developed a statistical tool that allows these data to be interpreted using clumpy torus models. A statistical approach is needed to overcome effects such as the randomness in the position of the clouds and the uncertainty in the true position angle on the sky. Our results, obtained by studying the mid-infrared emission at the highest resolution currently available, suggest that the dusty environment of Type I objects is formed by fewer clouds than that of Type II objects.

  12. The effect of inclusion classrooms on the science achievement of general education students

    NASA Astrophysics Data System (ADS)

    Dodd, Matthew Robert

    General education and Special Education students from three high schools in Rutherford County were sampled to determine the effect of Inclusion classrooms on their academic achievement on the Tennessee Biology I Gateway Exam. Each student's predicted and actual Gateway Exam scores from the academic year 2006-2007 were used to determine the effect the student's classroom had on his or her academic achievement. Independent variables used in the study were gender, ethnicity, socioeconomic level, grade point average, type of classroom (general or Inclusion), and type of student (General Education or Special Education). The statistical tests used in this study were a t-test and a Mann-Whitney U test. The effect of the Inclusion classroom on general education students was not statistically significant. Although the Inclusion classroom allows the special education student to succeed in the classroom, the effect on general education students is negligible. This study also provided statistical evidence that the Inclusion classroom did not improve the special education students' academic performance on the Gateway Exam. Students in a general education classroom with a GPA above 3.000 and those from a household without a low socioeconomic status performed at a statistically different level in this study.

  13. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    PubMed

    Olives, Casey; Valadez, Joseph J; Brooker, Simon J; Pagano, Marcello

    2012-01-01

    Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10 and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
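
    The operating characteristic curves mentioned here are binomial tail probabilities. A minimal sketch for a single LQAS decision rule (the thresholds are illustrative, not the paper's MC-LQAS design):

```python
from math import comb

def oc_curve(p, n, d):
    """Probability that at most d of n sampled children test positive when
    true prevalence is p, i.e. P(classify as 'low') under Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(d + 1))

# Rule: sample n = 15 children and classify prevalence as <=10% if at most
# d = 2 are positive. A school at 5% prevalence is almost always classified
# low; a school at 50% prevalence almost never is.
```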

  14. An instrument to assess the statistical intensity of medical research papers.

    PubMed

    Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu

    2017-01-01

    There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity in research articles in a standardized way. A checklist-type measure scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007 and 2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess the intensity between sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs; the percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity in research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.
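    Item-level inter-rater agreement of the kind reported here (91.7% to 95.2% between rater pairs) can be computed as plain pairwise percentage agreement. A minimal sketch with hypothetical checklist scores:

```python
from itertools import combinations

def pairwise_percent_agreement(ratings):
    """Percentage agreement between every pair of raters on checklist items.

    `ratings` maps a rater name to a list of item scores (one entry per item,
    same length for all raters). Returns {(rater_a, rater_b): percent}.
    """
    out = {}
    for a, b in combinations(sorted(ratings), 2):
        matches = sum(x == y for x, y in zip(ratings[a], ratings[b]))
        out[(a, b)] = 100.0 * matches / len(ratings[a])
    return out
```

    Chance-corrected coefficients such as the ICC used in the study additionally account for agreement expected by chance, which plain percentage agreement does not.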

  15. Empirical Bayes scan statistics for detecting clusters of disease risk variants in genetic studies.

    PubMed

    McCallum, Kenneth J; Ionita-Laza, Iuliana

    2015-12-01

    Recent developments of high-throughput genomic technologies offer an unprecedented detailed view of the genetic variation in various human populations, and promise to lead to significant progress in understanding the genetic basis of complex diseases. Despite this tremendous advance in data generation, it remains very challenging to analyze and interpret these data due to their sparse and high-dimensional nature. Here, we propose novel applications and new developments of empirical Bayes scan statistics to identify genomic regions significantly enriched with disease risk variants. We show that the proposed empirical Bayes methodology can be substantially more powerful than existing scan statistics methods especially so in the presence of many non-disease risk variants, and in situations when there is a mixture of risk and protective variants. Furthermore, the empirical Bayes approach has greater flexibility to accommodate covariates such as functional prediction scores and additional biomarkers. As proof-of-concept we apply the proposed methods to a whole-exome sequencing study for autism spectrum disorders and identify several promising candidate genes. © 2015, The International Biometric Society.

  16. Qualitative and quantitative evaluation of some vocal function parameters following fitting of a prosthesis.

    PubMed

    Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G

    2001-12-01

    The insertion of a prosthesis and restoration with pectoralis major myocutaneous flaps for patients subjected to total pharyngolaryngectomy is a technique now universally accepted; however, the literature on the subject is lacking. Our study considers 10 patients subjected to total pharyngolaryngectomy and restoration with pectoralis major myocutaneous flaps who were fitted with vocal function prostheses, and a control group of 50 subjects treated with a total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. Differences in the quantitative measurements of voice intensity and in the harmonics-to-noise ratio were not statistically significant (p > 0.05) between the two study groups for either high- or low-volume speech. In contrast, statistically significant differences (p < 0.05) were found for the fundamental frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.

  17. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. Progress has been especially noticeable in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.

  18. Spatial, temporal and spatio-temporal clusters of measles incidence at the county level in Guangxi, China during 2004-2014: flexibly shaped scan statistics.

    PubMed

    Tang, Xianyan; Geater, Alan; McNeil, Edward; Deng, Qiuyun; Dong, Aihu; Zhong, Ge

    2017-04-04

    Outbreaks of measles re-emerged in Guangxi province during 2013-2014, where measles again became a major public health concern. A better understanding of the patterns of measles cases would help in identifying high-risk areas and periods for optimizing preventive strategies, yet these patterns remain largely unknown. This study therefore aimed to determine the patterns of measles clusters in space, time and space-time at the county level over the period 2004-2014 in Guangxi. Annual data on measles cases and population sizes for each county were obtained from the Guangxi CDC and the Guangxi Bureau of Statistics, respectively. Epidemic curves and Kulldorff's temporal scan statistics were used to identify seasonal peaks and high-risk periods. Tango's flexible scan statistics were implemented to determine irregularly shaped spatial clusters. Spatio-temporal clusters in elliptical cylinder shapes were detected by Kulldorff's scan statistics. The population attributable risk percent (PAR%) of children aged ≤24 months was used to identify regions with a heavy burden of measles. Seasonal peaks occurred between April and June, and a temporal measles cluster was detected in 2014. Spatial clusters were identified in West, Southwest and North Central Guangxi. Three phases of spatio-temporal clusters with high relative risk were detected: Central Guangxi during 2004-2005, Midwest Guangxi in 2007, and West and Southwest Guangxi during 2013-2014. Regions with high PAR% were mainly clustered in West, Southwest, North and Central Guangxi. Measles incidence trended upward in Guangxi between 2010 and 2014, after trending downward during 2004-2009. The hotspots shifted from Central to West and Southwest Guangxi, regions overburdened with measles. Intensified surveillance of the timeliness and completeness of routine vaccination, together with supplementary immunization activities for measles, should therefore be prioritized in these regions.
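    Kulldorff's temporal scan statistic evaluates every candidate time window with a Poisson log-likelihood ratio and keeps the maximum. A minimal sketch under population-proportional expected counts; significance would normally be assessed by Monte Carlo replication, which is omitted here:

```python
import math

def poisson_llr(c, mu, C):
    """Kulldorff log-likelihood ratio for one candidate cluster window.

    c: observed cases inside the window, mu: expected cases inside the window
    (population-proportional), C: total cases. Only high-rate windows score.
    """
    if c <= mu or c == 0:
        return 0.0
    inside = c * math.log(c / mu)
    outside = (C - c) * math.log((C - c) / (C - mu)) if C > c else 0.0
    return inside + outside

def best_window(cases, expected, max_len=3):
    """Scan all contiguous windows up to max_len; return (score, (start, end))."""
    C = sum(cases)
    best = (0.0, None)
    for start in range(len(cases)):
        for end in range(start + 1, min(start + max_len, len(cases)) + 1):
            c = sum(cases[start:end])
            mu = sum(expected[start:end])
            score = poisson_llr(c, mu, C)
            if score > best[0]:
                best = (score, (start, end))
    return best
```

    The spatial and spatio-temporal variants work the same way, but the candidate windows become circles, flexible shapes (Tango), or elliptical cylinders rather than time intervals.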

  19. A Summary of Studies in Achievement of Vocational Agriculture Graduates in College.

    ERIC Educational Resources Information Center

    McClelland, John B.

    Twenty-seven studies are included in this synthesis of research on the appropriateness of high school vocational agriculture students going on to agricultural colleges. Most of the studies involved statistical significance testing. The studies are organized into sections--(1) comprehensive, (2) achievement in leadership activities, (3)…

  20. Using Relative Improvement over Chance (RIOC) to Examine Agreement between Tests: Three Case Examples Using Studies of Developmental Coordination Disorder (DCD) in Children

    ERIC Educational Resources Information Center

    Cairney, John; Streiner, David L.

    2011-01-01

    Although statistics such as kappa and phi are commonly used to assess agreement between tests, in situations where the base rate of a disorder in a population is low or high, these statistics tend to underestimate actual agreement. This can occur even if the tests are good and the classification of subjects is adequate. Relative improvement over…
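    The base-rate problem described above is why RIOC rescales chance-corrected agreement by the maximum agreement attainable given the marginals, rather than by 1 as kappa does. A minimal sketch for a 2x2 agreement table, with purely illustrative counts:

```python
def agreement_stats(table):
    """Cohen's kappa and Relative Improvement Over Chance (RIOC) for a 2x2 table.

    table = [[a, b], [c, d]]: rows are test 1 (positive/negative), columns
    are test 2. RIOC divides chance-corrected agreement by the maximum
    attainable given the marginals, so it is less depressed by extreme
    base rates than kappa.
    """
    (a, b), (c, d) = table
    n = a + b + c + d
    row = (a + b, c + d)
    col = (a + c, b + d)
    observed = (a + d) / n                                  # raw agreement
    chance = sum(r * k for r, k in zip(row, col)) / n ** 2  # expected by chance
    maximum = sum(min(r, k) for r, k in zip(row, col)) / n  # best possible
    kappa = (observed - chance) / (1 - chance)
    rioc = (observed - chance) / (maximum - chance)
    return kappa, rioc
```

    With skewed marginals the two statistics diverge: kappa stays depressed while RIOC credits the tests with the share of attainable agreement they actually achieved.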

  1. Interaction effects of metals and salinity on biodegradation of a complex hydrocarbon waste.

    PubMed

    Amatya, Prasanna L; Hettiaratchi, Joseph Patrick A; Joshi, Ramesh C

    2006-02-01

    The presence of high levels of salts resulting from the disposal of produced brine water at flare pits, and the presence of metals at concentrations sufficient to impact microbial activity, are of concern for bioremediation of flare pit waste in the upstream oil and gas industry. Two slurry-phase biotreatment experiments based on a three-level factorial statistical experimental design were conducted with a flare pit waste. The experiments separately studied the primary effect of cadmium [Cd(II)] and its interaction with salinity, and the primary effect of zinc [Zn(II)] and its interaction with salinity, on hydrocarbon biodegradation. The results showed 42-52.5% hydrocarbon removal in slurries spiked with Cd and 47-62.5% in slurries spiked with Zn. Analysis of variance showed that the primary effect of Cd and the Cd-salinity interaction were statistically significant for hydrocarbon degradation. The primary effect of Zn and the Zn-salinity interaction were statistically insignificant, whereas the quadratic effect of Zn was highly significant. The study of the effects of metallic chloro-complexes showed that the total aqueous concentration of Cd or Zn does not give a reliable indication of overall toxicity to microbial activity in the presence of high salinity levels.
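    A full three-level factorial design of the kind used here enumerates every combination of factor levels, which is what lets primary, interaction, and quadratic effects be separated in the ANOVA. A sketch with hypothetical Cd(II) and salinity levels, not the study's actual ones:

```python
from itertools import product

def three_level_design(factors):
    """Enumerate the runs of a full three-level factorial design.

    `factors` maps a factor name to its (low, mid, high) levels. Each run is
    returned as a {factor: level} dict; a two-factor design yields 3**2 = 9 runs.
    """
    names = sorted(factors)
    return [dict(zip(names, combo)) for combo in product(*(factors[n] for n in names))]

# Hypothetical levels: Cd(II) in mg/kg and salinity in g/L NaCl.
runs = three_level_design({"cd": (0, 50, 100), "salinity": (0, 20, 40)})
```

    Having three levels per factor (rather than two) is what makes the quadratic effect of Zn estimable in the second experiment.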

  2. Comparisons of false negative rates from a trend test alone and from a trend test jointly with a control-high groups pairwise test in the determination of the carcinogenicity of new drugs.

    PubMed

    Lin, Karl K; Rahman, Mohammad A

    2018-05-21

    Interest has been expressed in using a joint test procedure that requires the results of both a trend test and a pairwise comparison test between the control and high-dose groups to be statistically significant simultaneously, at the levels of significance recommended for the separate tests in the FDA 2001 draft guidance for industry, before the drug effect on the development of an individual tumor type is considered statistically significant. Results of our simulation studies show a serious consequence of this joint test procedure in the final interpretation of the carcinogenic potential of a new drug: large inflations of the false negative rate, through large decreases in the false positive rate, if the levels of significance recommended for the separate tests are used. The inflation can be as high as 204.5% of the false negative rate obtained when the trend test alone is required to be statistically significant. To correct the problem, new sets of significance levels have been developed for those who want to use the joint test in reviews of carcinogenicity studies.

  3. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    PubMed

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize the cryopreservation conditions of rat liver slices in a high-throughput format, with a focus on reproducibility. A statistical design of 32 experiments was performed, and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams' E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 °C was better than 22 °C. After thawing, 1 h of preculture was better than 3 h. Cryopreservation increased the inter-slice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values, respectively. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient for quickly determining optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.

  4. Temporal variation and scale in movement-based resource selection functions

    USGS Publications Warehouse

    Hooten, M.B.; Hanks, E.M.; Johnson, D.S.; Alldredge, M.W.

    2013-01-01

    A common population characteristic of interest in animal ecology studies pertains to the selection of resources. That is, given the resources available to animals, what do they ultimately choose to use? A variety of statistical approaches have been employed to examine this question, and each has advantages and disadvantages with respect to the form of available data and the properties of estimators given model assumptions. A wealth of high-resolution telemetry data are now being collected to study animal population movement and space use, and these data present both challenges and opportunities for statistical inference. We summarize traditional methods for resource selection and then describe several extensions to deal with measurement uncertainty and an explicit movement process that exists in studies involving high-resolution telemetry data. Our approach uses a correlated random walk movement model to obtain temporally varying use and availability distributions that are employed in a weighted distribution context to estimate selection coefficients. The temporally varying coefficients are then weighted by their contribution to selection and combined to provide inference at the population level. The result is an intuitive and accessible statistical procedure that uses readily available software and is computationally feasible for large datasets. These methods are demonstrated using data collected as part of a large-scale mountain lion monitoring study in Colorado, USA.
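    A correlated random walk of the kind underpinning such movement models can be simulated by letting each heading persist, with added turning noise whose spread controls the correlation between successive steps. This is a minimal discrete-time sketch, not the authors' fitted model:

```python
import math
import random

def simulate_crw(n_steps, step_scale=1.0, persistence=0.8, seed=0):
    """Simulate a discrete-time correlated random walk in the plane.

    Each new heading equals the previous heading plus Gaussian turning noise;
    the noise spread shrinks as `persistence` grows, so successive steps stay
    directionally correlated. Returns the list of (x, y) positions.
    """
    rng = random.Random(seed)
    x = y = 0.0
    heading = rng.uniform(-math.pi, math.pi)
    path = [(x, y)]
    turn_sd = (1.0 - persistence) * math.pi  # persistence=1 -> straight line
    for _ in range(n_steps):
        heading += rng.gauss(0.0, turn_sd)
        x += step_scale * math.cos(heading)
        y += step_scale * math.sin(heading)
        path.append((x, y))
    return path
```

    In a resource-selection analysis, simulated or model-implied positions like these would define the temporally varying availability distribution against which observed use is weighted.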

  5. Ozone trends for three metropolitan statistical areas in North Carolina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oommen, R.G.; Aneja, V.P.; Riordan, A.J.

    1996-12-31

    As part of an effort by the state of North Carolina to develop a State Implementation Plan (SIP) for ozone control, a network of ozone stations was established to monitor ozone concentrations across the state. Approximately twenty-five ozone stations made continuous measurements surrounding the three major Metropolitan Statistical Areas (MSAs) between 1993 and 1995: Raleigh/Durham (RDU), Charlotte/Mecklenburg (CLT), and Greensboro/Winston-Salem/High Point (GSO). Statistical averages of the ozone data were computed for each MSA to study trends and relationships on high-ozone days. It was found that the three MSAs were not significantly different from one another, indicating that they fall under the same synoptic weather patterns. Transport and local production of biogenic sources of VOCs and NOx appear to play an important role for high ozone downwind of RDU, while mobile sources of these precursor gases contribute to the high ozone downwind of CLT and GSO. A Δ(O3) analysis (the difference between the O3 measured at an upwind and a downwind site) suggested that long-range transport of the precursors was a significant contributor to ozone problems at the three MSAs. 15 refs., 2 figs., 2 tabs.

  6. Aronia melanocarpa Treatment and Antioxidant Status in Selected Tissues in Wistar Rats

    PubMed Central

    Krośniak, Mirosław; Sanocka, Ilona; Bartoń, Henryk; Hebda, Tomasz; Francik, Sławomir

    2014-01-01

    Aronia juice is considered to be a source of compounds with high antioxidative potential. We conducted a study on the impact of compounds in Aronia juice on oxidative stress in plasma and brain tissues. The influence of Aronia juice on oxidative stress parameters was tested with the use of a dietary model with a high content of fructose and unsaturated fats. Therefore, the activity of enzymatic (catalase, CAT, and paraoxonase, PON) and nonenzymatic (thiol groups, SH, and protein carbonyl groups, PCG) oxidative stress markers, which indicate changes in the carbohydrate and protein profiles, was measured in brain tissue homogenates. Adding Aronia caused a statistically significant increase in plasma CAT activity in all tested diets, while PON activity showed a statistically significant increase only in the case of the high-fat diet. In animals fed Aronia juice supplemented with carbohydrates or fat, a statistically significant increase in PON activity and a decrease in CAT activity were observed in brain tissue. In the case of the high-fat diet, an increase in the number of SH groups and a decrease in the number of PCG groups in brain tissue were observed. PMID:25057488

  7. Aronia melanocarpa treatment and antioxidant status in selected tissues in Wistar rats.

    PubMed

    Francik, Renata; Krośniak, Mirosław; Sanocka, Ilona; Bartoń, Henryk; Hebda, Tomasz; Francik, Sławomir

    2014-01-01

    Aronia juice is considered to be a source of compounds with high antioxidative potential. We conducted a study on the impact of compounds in Aronia juice on oxidative stress in plasma and brain tissues. The influence of Aronia juice on oxidative stress parameters was tested with the use of a dietary model with a high content of fructose and unsaturated fats. Therefore, the activity of enzymatic (catalase, CAT, and paraoxonase, PON) and nonenzymatic (thiol groups, SH, and protein carbonyl groups, PCG) oxidative stress markers, which indicate changes in the carbohydrate and protein profiles, was measured in brain tissue homogenates. Adding Aronia caused a statistically significant increase in plasma CAT activity in all tested diets, while PON activity showed a statistically significant increase only in the case of the high-fat diet. In animals fed Aronia juice supplemented with carbohydrates or fat, a statistically significant increase in PON activity and a decrease in CAT activity were observed in brain tissue. In the case of the high-fat diet, an increase in the number of SH groups and a decrease in the number of PCG groups in brain tissue were observed.

  8. Comparison of future and base precipitation anomalies by SimCLIM statistical projection through ensemble approach in Pakistan

    NASA Astrophysics Data System (ADS)

    Amin, Asad; Nasim, Wajid; Mubeen, Muhammad; Kazmi, Dildar Hussain; Lin, Zhaohui; Wahid, Abdul; Sultana, Syeda Refat; Gibbs, Jim; Fahad, Shah

    2017-09-01

    Unpredictable precipitation trends, largely influenced by climate change, have prolonged droughts and floods in South Asia. Statistical analyses of monthly, seasonal, and annual precipitation trends were carried out at different temporal (1996-2015 and 2041-2060) and spatial scales (39 meteorological stations) in Pakistan. The statistical downscaling model SimCLIM was used for future precipitation projection (2041-2060), with an ensemble approach combined with representative concentration pathways (RCPs) at the medium level. The magnitude and slope of trends were derived by applying the Mann-Kendall and Sen's slope statistical approaches, and geostatistical applications were used to generate precipitation trend maps. Comparison of base and projected precipitation is represented by maps and graphical visualizations that facilitate trend detection. Results show that the precipitation trend was increasing at more than 70% of weather stations for February, March, April, August, and September in the base years. In the projected years, the precipitation trend decreased from February to April but increased from July to October. The strongest decreasing trend was reported in January for the base years, and it also decreased in the projected years. The greatest variation in precipitation trends between projected and base years was reported from February to April; variations in the projected precipitation trend for Punjab and Baluchistan were most marked in March and April. Seasonal analysis shows large variation in winter, with an increasing trend at more than 30% of weather stations, rising to 40% for projected precipitation. High risk was reported in the base-year pre-monsoon season, where 90% of weather stations showed an increasing trend, but in the projected years this fell to 33%. Finally, the annual precipitation trend increased at more than 90% of meteorological stations in the base years (1996-2015) but decreased at up to 76% of stations in the projected years (2041-2060). These results reveal that the overall precipitation trend is decreasing in future years, which may prolong drought at 14% of the weather stations under study.
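    The Mann-Kendall statistic and Sen's slope used in this analysis are both built from pairwise comparisons of the time series; a minimal sketch:

```python
from itertools import combinations
from statistics import median

def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive for an uptrend, negative for a downtrend."""
    return sum(
        (x_j > x_i) - (x_j < x_i)  # sign of each later-minus-earlier pair
        for (i, x_i), (j, x_j) in combinations(enumerate(series), 2)
    )

def sens_slope(series):
    """Sen's slope estimator: the median of all pairwise slopes, robust to outliers."""
    return median(
        (x_j - x_i) / (j - i)
        for (i, x_i), (j, x_j) in combinations(enumerate(series), 2)
    )
```

    A full trend test would also compute the variance of S under the null hypothesis (with a tie correction) to obtain a p-value; that step is omitted here.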

  9. Development and validation of a risk calculator predicting exercise-induced ventricular arrhythmia in patients with cardiovascular disease.

    PubMed

    Hermes, Ilarraza-Lomelí; Marianna, García-Saldivia; Jessica, Rojano-Castillo; Carlos, Barrera-Ramírez; Rafael, Chávez-Domínguez; María Dolores, Rius-Suárez; Pedro, Iturralde

    2016-10-01

    Mortality due to cardiovascular disease is often associated with ventricular arrhythmias. Nowadays, patients with cardiovascular disease are increasingly encouraged to take part in physical training programs. Nevertheless, high-intensity exercise is associated with a higher risk of sudden death, even in apparently healthy people. During exercise testing (ET), health care professionals expose patients, in a controlled setting, to an intense physiological stimulus that could precipitate cardiac arrhythmia in high-risk individuals. There is still no clinical or statistical tool to predict this incidence. The aim of this study was to develop a statistical model to predict the incidence of exercise-induced, potentially life-threatening ventricular arrhythmia (PLVA) during high-intensity exercise. A total of 6415 patients underwent a symptom-limited ET with a Balke ramp protocol. A multivariate logistic regression model with PLVA as the primary outcome was fitted. The incidence of PLVA was 548 cases (8.5%). After bivariate analysis, thirty-one clinical or ergometric variables were statistically associated with PLVA and were included in the regression model. In the multivariate model, 13 of these variables were found to be statistically significant. A regression model (G) with a χ² of 283.987 (p < 0.001) was constructed. Significant variables included heart failure, antiarrhythmic drugs, myocardial lower-VD, age, and the use of digoxin and nitrates, among others. This study allows clinicians to identify patients at risk of ventricular tachycardia or couplets during exercise, and to take preventive measures or provide appropriate supervision. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
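    A fitted logistic risk calculator of this kind turns a linear predictor G into a probability via the logistic function. The feature names and weights below are purely illustrative; the published calculator's 13 coefficients are not reproduced here:

```python
import math

def arrhythmia_risk(features, coefficients, intercept):
    """Predicted probability from a fitted logistic model: 1 / (1 + exp(-G)).

    `features` and `coefficients` share keys; G is the intercept plus the
    weighted sum of feature values. All values below are hypothetical.
    """
    g = intercept + sum(coefficients[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-g))

# Hypothetical example: two binary predictors plus age in decades.
coeffs = {"heart_failure": 1.2, "antiarrhythmic_drugs": 0.8, "age_decades": 0.1}
risk = arrhythmia_risk(
    {"heart_failure": 1, "antiarrhythmic_drugs": 0, "age_decades": 6.5},
    coeffs,
    intercept=-3.0,
)
```

    In clinical use, a probability threshold on this output would flag patients who warrant closer supervision during testing.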

  10. Corpus-based Statistical Screening for Phrase Identification

    PubMed Central

    Kim, Won; Wilbur, W. John

    2000-01-01

    Purpose: The authors study the extraction of useful phrases from a natural language database by statistical methods. The aim is to leverage human effort by providing preprocessed phrase lists with a high percentage of useful material. Method: The approach is to develop six different scoring methods that are based on different aspects of phrase occurrence. The emphasis here is not on lexical information or syntactic structure but rather on the statistical properties of word pairs and triples that can be obtained from a large database. Measurements: The Unified Medical Language System (UMLS) incorporates a large list of humanly acceptable phrases in the medical field as a part of its structure. The authors use this list of phrases as a gold standard for validating their methods. A good method is one that ranks the UMLS phrases high among all phrases studied. Measurements are 11-point average precision values and precision-recall curves based on the rankings. Result: The authors find that each of the six different scoring methods proves effective in identifying UMLS-quality phrases in a large subset of MEDLINE. These methods are applicable both to word pairs and word triples. All six methods are optimally combined to produce composite scoring methods that are more effective than any single method. The quality of the composite methods appears sufficient to support the automatic placement of hyperlinks in text at the site of highly ranked phrases. Conclusion: Statistical scoring methods provide a promising approach to the extraction of useful phrases from a natural language database for the purpose of indexing or providing hyperlinks in text. PMID:10984469
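    Occurrence-based phrase scores of the kind studied here measure how much more often a word pair co-occurs than its unigram frequencies predict. Pointwise mutual information is one classic such score (a generic illustration, not necessarily one of the paper's six methods):

```python
import math
from collections import Counter

def pmi_scores(tokens):
    """Pointwise mutual information for adjacent word pairs in a token list.

    High-PMI bigrams co-occur far more often than their unigram frequencies
    predict, which makes them candidate phrases.
    """
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni, n_bi = len(tokens), len(tokens) - 1
    return {
        pair: math.log2(
            (c / n_bi) / ((unigrams[pair[0]] / n_uni) * (unigrams[pair[1]] / n_uni))
        )
        for pair, c in bigrams.items()
    }
```

    Ranking bigrams by such a score, then validating the ranking against a gold-standard phrase list, mirrors the evaluation design described in the abstract.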

  11. Aerosol, a health hazard during ultrasonic scaling: A clinico-microbiological study.

    PubMed

    Singh, Akanksha; Shiva Manjunath, R G; Singla, Deepak; Bhattacharya, Hirak S; Sarkar, Arijit; Chandra, Neeraj

    2016-01-01

    Ultrasonic scaling is a routinely used treatment to remove plaque and calculus from tooth surfaces. These scalers use water as a coolant, which is splattered during the vibration of the tip. The splatter, when mixed with the patient's saliva and plaque, renders the aerosol highly infectious and makes it a major risk factor for transmission of disease. Despite necessary protection, the operator may still become infected because of the infectious nature of the splatter. The aim was to evaluate the aerosol contamination produced during ultrasonic scaling with the help of microbiological analysis. This clinico-microbiological study consisted of twenty patients. Two agar plates were used for each patient; the first was kept at the center of the operatory room 20 min before the treatment, while the second agar plate was kept 40 cm away from the patient's chest during the treatment. Both agar plates were sent for microbiological analysis. The statistical analysis was done with the help of Stata statistical software (StataCorp, College Station, TX, USA), and P < 0.001 was considered statistically significant. The difference in bacterial counts before and during treatment was highly significant. Gram staining showed the presence of Staphylococcus and Streptococcus species in high numbers. The aerosols and splatters produced during dental procedures have the potential to spread infection to dental personnel. Therefore, proper precautions should be taken to minimize the risk of infection to the operator.

  12. The chi-square test of independence.

    PubMed

    McHugh, Mary L

    2013-01-01

    The Chi-square statistic is a non-parametric (distribution free) tool designed to analyze group differences when the dependent variable is measured at a nominal level. Like all non-parametric statistics, the Chi-square is robust with respect to the distribution of the data. Specifically, it does not require equality of variances among the study groups or homoscedasticity in the data. It permits evaluation both of dichotomous independent variables and of multiple group studies. Unlike many other non-parametric and some parametric statistics, the calculations needed to compute the Chi-square provide considerable information about how each of the groups performed in the study. This richness of detail allows the researcher to understand the results and thus to derive more detailed information from this statistic than from many others. The Chi-square is a significance statistic, and should be followed with a strength statistic. Cramér's V is the most common strength statistic used when a significant Chi-square result has been obtained. Advantages of the Chi-square include its robustness with respect to the distribution of the data, its ease of computation, the detailed information that can be derived from the test, its use in studies for which parametric assumptions cannot be met, and its flexibility in handling data from both two-group and multiple-group studies. Limitations include its sample size requirements, difficulty of interpretation when there are large numbers of categories (20 or more) in the independent or dependent variables, and the tendency of Cramér's V to produce relatively low correlation measures, even for highly significant results.
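    The Chi-square test of independence and its follow-up strength statistic, Cramér's V, can be computed directly from a contingency table; a minimal sketch with illustrative counts:

```python
import math

def chi_square_independence(table):
    """Chi-square test of independence plus Cramér's V for an r x c count table.

    Expected counts are row_total * col_total / n; Cramér's V rescales the
    Chi-square statistic to a 0-1 association measure.
    """
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    chi2 = sum(
        (table[i][j] - row_totals[i] * col_totals[j] / n) ** 2
        / (row_totals[i] * col_totals[j] / n)
        for i in range(len(table))
        for j in range(len(table[0]))
    )
    dof = (len(table) - 1) * (len(table[0]) - 1)
    cramers_v = math.sqrt(chi2 / (n * (min(len(table), len(table[0])) - 1)))
    return chi2, dof, cramers_v

# Illustrative 2x2 table: treatment group (rows) by outcome (columns).
chi2, dof, v = chi_square_independence([[30, 10], [20, 40]])
```

    The p-value would come from comparing chi2 against the Chi-square distribution with `dof` degrees of freedom (e.g., via scipy.stats.chi2.sf), a step omitted here to keep the sketch dependency-free.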

  13. [Road Extraction in Remote Sensing Images Based on Spectral and Edge Analysis].

    PubMed

    Zhao, Wen-zhi; Luo, Li-qun; Guo, Zhou; Yue, Jun; Yu, Xue-ying; Liu, Hui; Wei, Jing

    2015-10-01

    Roads are typically man-made objects in urban areas. Road extraction from high-resolution images has important applications for urban planning and transportation development. However, because of their confusable spectral characteristics, it is difficult to distinguish roads from other objects by merely using traditional classification methods that mainly depend on spectral information. Edges are an important feature for the identification of linear objects (e.g., roads), and the distribution patterns of edges vary greatly among different objects. It is therefore crucial to merge edge statistics with spectral information. In this study, a new method that combines spectral information and edge statistical features is proposed. First, edge detection is conducted by using a self-adaptive mean-shift algorithm on the panchromatic band, which greatly reduces pseudo-edges and noise effects. Then, edge statistical features are obtained from the edge statistical model, which measures the length and angle distribution of edges. Finally, by integrating the spectral and edge statistical features, an SVM algorithm is used to classify the image and roads are ultimately extracted. A series of experiments shows that the overall accuracy of the proposed method is 93%, compared with only 78% for the traditional method. The results demonstrate that the proposed method is efficient and valuable for road extraction, especially for high-resolution images.

  14. A Powerful Approach to Estimating Annotation-Stratified Genetic Covariance via GWAS Summary Statistics.

    PubMed

    Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu

    2017-12-07

    Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (N total ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  15. Impaired Statistical Learning in Developmental Dyslexia

    PubMed Central

    Thiessen, Erik D.; Holt, Lori L.

    2015-01-01

    Purpose Developmental dyslexia (DD) is commonly thought to arise from phonological impairments. However, an emerging perspective is that a more general procedural learning deficit, not specific to phonological processing, may underlie DD. The current study examined if individuals with DD are capable of extracting statistical regularities across sequences of passively experienced speech and nonspeech sounds. Such statistical learning is believed to be domain-general, to draw upon procedural learning systems, and to relate to language outcomes. Method DD and control groups were familiarized with a continuous stream of syllables or sine-wave tones, the ordering of which was defined by high or low transitional probabilities across adjacent stimulus pairs. Participants subsequently judged two 3-stimulus test items with either high or low statistical coherence as being the most similar to the sounds heard during familiarization. Results As with control participants, the DD group was sensitive to the transitional probability structure of the familiarization materials as evidenced by above-chance performance. However, the performance of participants with DD was significantly poorer than controls across linguistic and nonlinguistic stimuli. In addition, reading-related measures were significantly correlated with statistical learning performance of both speech and nonspeech material. Conclusion Results are discussed in light of procedural learning impairments among participants with DD. PMID:25860795
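    The transitional-probability structure that such statistical learning tasks manipulate can be illustrated with a short sketch; the syllable stream and "words" below are invented for illustration, not the study's actual stimuli:

```python
from collections import Counter

# Illustrative familiarization stream: three hypothetical tri-syllabic
# "words" (tu-pi-ro, go-la-bu, bi-da-ku) concatenated in varying order.
stream = ("tu pi ro go la bu bi da ku tu pi ro bi da ku go la bu "
          "bi da ku tu pi ro go la bu").split()

# Count adjacent syllable pairs and occurrences of the first element.
pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])

def transitional_probability(a, b):
    """P(b | a): how often syllable a is immediately followed by b."""
    return pair_counts[(a, b)] / first_counts[a]

# Within-word transitions are perfectly predictive in this toy stream
# (TP = 1.0), while transitions across word boundaries are lower.
print(transitional_probability("tu", "pi"))  # within-word
print(transitional_probability("ro", "go"))  # across a word boundary
```

Learners who are sensitive to this structure prefer high-coherence test items, which is the above-chance performance the study measures.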

  16. Low socioeconomic status is associated with worse survival in children with cancer: a systematic review.

    PubMed

    Gupta, Sumit; Wilejto, Marta; Pole, Jason D; Guttmann, Astrid; Sung, Lillian

    2014-01-01

    While low socioeconomic status (SES) has been associated with inferior cancer outcome among adults, its impact in pediatric oncology is unclear. Our objective was therefore to conduct a systematic review to determine the impact of SES upon outcome in children with cancer. We searched Ovid Medline, EMBASE and CINAHL from inception to December 2012. Studies for which survival-related outcomes were reported by socioeconomic subgroups were eligible for inclusion. Two reviewers independently assessed articles and extracted data. Given anticipated heterogeneity, no quantitative meta-analyses were planned a priori. Of 7,737 publications, 527 in ten languages met criteria for full review; 36 studies met final inclusion criteria. In low- and middle-income countries (LMIC), lower SES was uniformly associated with inferior survival, regardless of the measure chosen. The majority of associations were statistically significant. Of 52 associations between socioeconomic variables and outcome among high-income country (HIC) children, 38 (73.1%) found low SES to be associated with worse survival, 15 of which were statistically significant. Of the remaining 14 (no association or high SES associated with worse survival), only one was statistically significant. Both HIC studies examining the effect of insurance found uninsured status to be statistically associated with inferior survival. Socioeconomic gradients in which low SES is associated with inferior childhood cancer survival are ubiquitous in LMIC and common in HIC. Future studies should elucidate mechanisms underlying these gradients, allowing the design of interventions mediating socioeconomic effects. Targeting the effect of low SES will allow for further improvements in childhood cancer survival.

  17. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  18. An analysis of the relationship of flight hours and naval rotary wing aviation mishaps

    DTIC Science & Technology

    2017-03-01

    …estimates found enough evidence to support that indicators used for sequestration, high flight hours, night flight, and overwater flight had statistically significant effects on…

  19. The RS4939827 polymorphism in the SMAD7 GENE and its association with Mediterranean diet in colorectal carcinogenesis.

    PubMed

    Alonso-Molero, Jéssica; González-Donquiles, Carmen; Palazuelos, Camilo; Fernández-Villa, Tania; Ramos, Elena; Pollán, Marina; Aragonés, Nuria; Llorca, Javier; Henar Alonso, M; Tardón, Adonina; Amiano, Pilar; Moleon, José Juan Jiménez; Pérez, Rosana Peiró; Capelo, Rocío; Molina, Antonio J; Acebo, Inés Gómez; Guevara, Marcela; Perez-Gomez, Beatriz; Lope, Virginia; Huerta, José María; Castaño-Vinyals, Gemma; Kogevinas, Manolis; Moreno, Victor; Martín, Vicente

    2017-10-30

    The objective of our investigation is to study the relationship between the rs4939827 SNP in the SMAD7 gene, the Mediterranean diet pattern, and the risk of colorectal cancer. We examined 1087 cases of colorectal cancer and 2409 population controls with available DNA samples from the MCC-Spain study, 2008-2012. Descriptive statistical analyses and multivariate logistic mixed models were performed. The potential synergistic effect of rs4939827 and the Mediterranean diet pattern was evaluated with logistic regression in different strata of adherence to the Mediterranean diet and the genotype. High adherence to the Mediterranean diet was statistically significantly associated with colorectal cancer risk. A decreased risk for CRC was observed for the CC compared to the TT genotype (OR = 0.65 and 95% CI = 0.51-0.81) of the rs4939827 SNP. We could also show an association between the Mediterranean diet pattern (protective factor) and rs4939827. Although the decreased risk for the CC genotype was slightly more pronounced in subjects with high adherence to the Mediterranean diet, there was no statistically significant synergistic effect between the CC genotype and adherence to the Mediterranean dietary pattern. The SMAD7 gene, and specifically the C allele, could be protective for colorectal cancer. An independent protective association was also observed between high adherence to the Mediterranean diet pattern and CRC risk. Findings from this study indicate that high adherence to the Mediterranean diet pattern has a protective role in CRC, probably involving the Transforming Growth Factor-β pathway in this cancer.

  20. Dermatoglyphics--a marker for malocclusion?

    PubMed

    Tikare, S; Rajesh, G; Prasad, K W; Thippeswamy, V; Javali, S B

    2010-08-01

    Dermatoglyphics is the study of dermal ridge configurations on the palmar and plantar surfaces of the hands and feet. Dermal ridges and craniofacial structures are both formed during the 6th-7th weeks of intra-uterine life. It is believed that hereditary and environmental factors leading to malocclusion may also cause peculiarities in fingerprint patterns. To study and assess the relationship between fingerprints and malocclusion among a group of high school children aged 12-16 years in Dharwad, Karnataka, India, a total of 696 high school children aged 12-16 years were randomly selected. Their fingerprints were recorded using duplicating ink, and malocclusion status was clinically assessed using Angle's classification. Chi-square analysis revealed a statistically significant association between whorl patterns and Class 1 and Class 2 malocclusion (p < 0.05). However, no overall statistical association was observed between fingerprint patterns and malocclusion (p > 0.05). Dermatoglyphics might be an appropriate marker for malocclusion, and further studies are required to elucidate an association between fingerprint patterns and malocclusion.

  1. Interventions for reducing self-stigma in people with mental illnesses: a systematic review of randomized controlled trials

    PubMed Central

    Büchter, Roland Brian; Messer, Melanie

    2017-01-01

    Background: Self-stigma occurs when people with mental illnesses internalize negative stereotypes and prejudices about their condition. It can reduce help-seeking behaviour and treatment adherence. The effectiveness of interventions aimed at reducing self-stigma in people with mental illness is systematically reviewed. Results are discussed in the context of a logic model of the broader social context of mental illness stigma. Methods: Medline, Embase, PsycINFO, ERIC, and CENTRAL were searched for randomized controlled trials in November 2013. Studies were assessed with the Cochrane risk of bias tool. Results: Five trials were eligible for inclusion, four of which provided data for statistical analyses. Four studies had a high risk of bias. The quality of evidence was very low for each set of interventions and outcomes. The interventions studied included various group based anti-stigma interventions and an anti-stigma booklet. The intensity and fidelity of most interventions was high. Two studies were considered to be sufficiently homogeneous to be pooled for the outcome self-stigma. The meta-analysis did not find a statistically significant effect (SMD [95% CI] at 3 months: –0.26 [–0.64, 0.12], I2=0%, n=108). None of the individual studies found sustainable effects on other outcomes, including recovery, help-seeking behaviour and self-stigma. Conclusions: The effectiveness of interventions against self-stigma is uncertain. Previous studies lacked statistical power, used questionable outcome measures and had a high risk of bias. Future studies should be based on robust methods and consider practical implications regarding intervention development (relevance, implementability, and placement in routine services). PMID:28496396

  2. Plasma sheet density dependence on Interplanetary Magnetic Field and Solar Wind properties: statistical study using 9+ year of THEMIS data

    NASA Astrophysics Data System (ADS)

    Nykyri, K.; Chu, C.; Dimmock, A. P.

    2017-12-01

    Previous studies have shown that the plasma sheet is tenuous and hot during southward IMF, whereas northward IMF conditions are associated with cold, dense plasma. The cold, dense plasma sheet (CDPS) has a strong influence on magnetospheric dynamics. Closer to Earth, the CDPS could be formed via double high-latitude reconnection, while at increasing tailward distance reconnection, diffusion, and kinetic Alfven waves associated with the Kelvin-Helmholtz instability are suggested as the dominant sources of cold, dense plasma sheet formation. In this paper we present a statistical correlation study between solar wind, magnetosheath, and plasma sheet properties using 9+ years of THEMIS data in an aberrated GSM frame and in a normalized coordinate system that takes into account the changes of the magnetopause and bow shock locations with respect to changing solar wind conditions. We present statistical results on the dependence of plasma sheet density on IMF orientation and other solar wind properties.

  3. A Statistical Analysis of IrisCode and Its Security Implications.

    PubMed

    Kong, Adams Wai-Kin

    2015-03-01

    IrisCode has been used to gather iris data for 430 million people. Because of the huge impact of IrisCode, it is vital that it is completely understood. This paper first studies the relationship between bit probabilities and a mean of iris images (The mean of iris images is defined as the average of independent iris images.) and then uses the Chi-square statistic, the correlation coefficient and a resampling algorithm to detect statistical dependence between bits. The results show that the statistical dependence forms a graph with a sparse and structural adjacency matrix. A comparison of this graph with a graph whose edges are defined by the inner product of the Gabor filters that produce IrisCodes shows that partial statistical dependence is induced by the filters and propagates through the graph. Using this statistical information, the security risk associated with two patented template protection schemes that have been deployed in commercial systems for producing application-specific IrisCodes is analyzed. To retain high identification speed, they use the same key to lock all IrisCodes in a database. The belief has been that if the key is not compromised, the IrisCodes are secure. This study shows that even without the key, application-specific IrisCodes can be unlocked and that the key can be obtained through the statistical dependence detected.

  4. Radar error statistics for the space shuttle

    NASA Technical Reports Server (NTRS)

    Lear, W. M.

    1979-01-01

    C-band and S-band radar error statistics recommended for use with the ground-tracking programs that process space shuttle tracking data are presented. The statistics are divided into two parts: bias error statistics, using the subscript B, and high frequency error statistics, using the subscript q. Bias errors may be slowly varying to constant. High frequency random errors (noise) are rapidly varying and may or may not be correlated from sample to sample. Bias errors were mainly due to hardware defects and to errors in correction for atmospheric refraction effects. High frequency noise was mainly due to hardware and to atmospheric scintillation. Three types of atmospheric scintillation were identified: horizontal, vertical, and line of sight. This was the first time that horizontal and line-of-sight scintillations were identified.
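    The bias versus high-frequency-noise split described above can be sketched as follows; the residuals are invented stand-ins for real tracking data, with the bias estimated as the mean residual and the noise as the scatter about it:

```python
import math

# Hypothetical range residuals (meters) from a tracking pass: a slowly
# varying bias plus rapidly varying noise. Values are illustrative only.
residuals = [5.2, 4.8, 5.5, 4.9, 5.1, 5.3, 4.7, 5.0, 5.4, 4.6]

n = len(residuals)
# Bias error estimate: the mean residual (slowly varying to constant).
bias = sum(residuals) / n
# High-frequency noise estimate: sample standard deviation about the bias.
noise = math.sqrt(sum((r - bias) ** 2 for r in residuals) / (n - 1))

print(f"bias ~ {bias:.2f} m, high-frequency noise ~ {noise:.2f} m")
```

A real pipeline would also test sample-to-sample correlation of the noise (e.g. via lag-1 autocorrelation), since the abstract notes it may or may not be correlated.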

  5. Estimating Preferential Flow in Karstic Aquifers Using Statistical Mixed Models

    PubMed Central

    Anaya, Angel A.; Padilla, Ingrid; Macchiavelli, Raul; Vesper, Dorothy J.; Meeker, John D.; Alshawabkeh, Akram N.

    2013-01-01

    Karst aquifers are highly productive groundwater systems often associated with conduit flow. These systems can be highly vulnerable to contamination, resulting in a high potential for contaminant exposure to humans and ecosystems. This work develops statistical models to spatially characterize flow and transport patterns in karstified limestone and determines the effect of aquifer flow rates on these patterns. A laboratory-scale Geo-HydroBed model is used to simulate flow and transport processes in a karstic limestone unit. The model consists of stainless-steel tanks containing a karstified limestone block collected from a karst aquifer formation in northern Puerto Rico. Experimental work involves making a series of flow and tracer injections, while monitoring hydraulic and tracer response spatially and temporally. Statistical mixed models are applied to hydraulic data to determine likely pathways of preferential flow in the limestone units. The models indicate a highly heterogeneous system with dominant, flow-dependent preferential flow regions. Results indicate that regions of preferential flow tend to expand at higher groundwater flow rates, suggesting a greater volume of the system being flushed by flowing water at higher rates. Spatial and temporal distribution of tracer concentrations indicates the presence of conduit-like and diffuse flow transport in the system, supporting the notion of both combined transport mechanisms in the limestone unit. The temporal response of tracer concentrations at different locations in the model coincides with, and confirms, the preferential flow distribution generated with the statistical mixed models used in the study. PMID:23802921

  6. Restoration of MRI data for intensity non-uniformities using local high order intensity statistics

    PubMed Central

    Hadjidemetriou, Stathis; Studholme, Colin; Mueller, Susanne; Weiner, Michael; Schuff, Norbert

    2008-01-01

    MRI at high magnetic fields (>3.0 T) is complicated by strong inhomogeneous radio-frequency fields, sometimes termed the “bias field”. These lead to non-biological intensity non-uniformities across the image. They can complicate further image analysis such as registration and tissue segmentation. Existing methods for intensity uniformity restoration have been optimized for 1.5 T, but they are less effective for 3.0 T MRI, and not at all satisfactory for higher fields. Also, many of the existing restoration algorithms require a brain template or use a prior atlas, which can restrict their practicality. In this study an effective intensity uniformity restoration algorithm has been developed based on non-parametric statistics of high order local intensity co-occurrences. These statistics are restored with a non-stationary Wiener filter. The algorithm also assumes a smooth non-uniformity and is stable. It does not require a prior atlas and is robust to variations in anatomy. In geriatric brain imaging it is robust to variations such as enlarged ventricles and low contrast to noise ratio. The co-occurrence statistics improve robustness to whole head images with pronounced non-uniformities present in high field acquisitions. Its significantly improved performance and lower time requirements have been demonstrated by comparing it to the very commonly used N3 algorithm on BrainWeb MR simulator images as well as on real 4 T human head images. PMID:18621568

  7. Medical intelligence in Sweden. Vitamin B12: oral compared with parenteral?

    PubMed

    Nilsson, M; Norberg, B; Hultdin, J; Sandström, H; Westman, G; Lökk, J

    2005-03-01

    Sweden is the only country in which oral high dose vitamin B12 has gained widespread use in the treatment of deficiency states. The aim of the study was to describe prescribing patterns and sales statistics of vitamin B12 tablets and injections in Sweden 1990-2000. Design, setting, and sources: Official statistics of cobalamin prescriptions and sales were used. The use of vitamin B12 increased in Sweden 1990-2000, mainly because of an increase in the use of oral high dose vitamin B12 therapy. The experience, in statistical terms a "total investigation", comprised 1,000,000 patient years for tablets and 750,000 patient years for injections. During 2000, 13% of residents aged 70 and over were treated with vitamin B12, two of three with the tablet preparation. Most patients in Sweden requiring vitamin B12 therapy have transferred from parenteral to oral high dose vitamin B12 since 1964, when the oral preparation was introduced. The findings suggest that many patients in other post-industrial societies may also be suitable for oral vitamin B12 treatment.

  8. Long-Term Evolution of Email Networks: Statistical Regularities, Predictability and Stability of Social Behaviors.

    PubMed

    Godoy-Lorite, Antonia; Guimerà, Roger; Sales-Pardo, Marta

    2016-01-01

    In social networks, individuals constantly drop ties and replace them by new ones in a highly unpredictable fashion. This highly dynamical nature of social ties has important implications for processes such as the spread of information or of epidemics. Several studies have demonstrated the influence of a number of factors on the intricate microscopic process of tie replacement, but the macroscopic long-term effects of such changes remain largely unexplored. Here we investigate whether, despite the inherent randomness at the microscopic level, there are macroscopic statistical regularities in the long-term evolution of social networks. In particular, we analyze the email network of a large organization with over 1,000 individuals throughout four consecutive years. We find that, although the evolution of individual ties is highly unpredictable, the macro-evolution of social communication networks follows well-defined statistical patterns, characterized by exponentially decaying log-variations of the weight of social ties and of individuals' social strength. At the same time, we find that individuals have social signatures and communication strategies that are remarkably stable over the scale of several years.

  9. Min-By-Min Respiratory Exchange and Oxygen Uptake Kinetics During Steady-State Exercise in Subjects of High and Low Max VO2

    ERIC Educational Resources Information Center

    Weltman, Arthur; Katch, Victor

    1976-01-01

    No statistically meaningful differences in steady-state VO2 uptake between the high and low max VO2 groups were indicated in this study, but a clear tendency was observed for the high max VO2 group to reach steady state at a faster rate. (MB)

  10. Influence of Cultural Cognition, Social Aspect of Culture, and Personality on Trust

    DTIC Science & Technology

    2013-12-31

    …of individualists, whereby their satisfaction with business negotiation stems mainly from high economic gains or personal outcomes. Therefore, it…2007) conducted a study on attributions of trustworthiness to unfamiliar trustees among Japanese and Canadian undergraduate business students. No…A high mean score on the analytic-holism scale signifies a high level of holism. No statistically significant difference was found between Malays, Chinese

  11. Towards a High Quality High School Workforce: A Longitudinal, Demographic Analysis of U.S. Public School Physics Teachers

    ERIC Educational Resources Information Center

    Rushton, Gregory T.; Rosengrant, David; Dewar, Andrew; Shah, Lisa; Ray, Herman E.; Sheppard, Keith; Watanabe, Lynn

    2017-01-01

    Efforts to improve the number and quality of the high school physics teaching workforce have taken several forms, including those sponsored by professional organizations. Using a series of large-scale teacher demographic data sets from the National Center for Education Statistics (NCES), this study sought to investigate trends in teacher quality…

  12. Communication Dynamics of Blog Networks

    NASA Astrophysics Data System (ADS)

    Goldberg, Mark; Kelley, Stephen; Magdon-Ismail, Malik; Mertsalov, Konstantin; Wallace, William (Al)

    We study the communication dynamics of Blog networks, focusing on the Russian section of LiveJournal as a case study. Communication (blogger-to-blogger links) in such online communication networks is very dynamic: over 60% of the links in the network are new from one week to the next, though the set of bloggers remains approximately constant. Two fundamental questions are: (i) what models adequately describe such dynamic communication behavior; and (ii) how does one detect the phase transitions, i.e. the changes that go beyond the standard high-level dynamics? We approach these questions through the notion of stable statistics. We give strong experimental evidence to the fact that, despite the extreme amount of communication dynamics, several aggregate statistics are remarkably stable. We use stable statistics to test our models of communication dynamics postulating that any good model should produce values for these statistics which are both stable and close to the observed ones. Stable statistics can also be used to identify phase transitions, since any change in a normally stable statistic indicates a substantial change in the nature of the communication dynamics. We describe models of the communication dynamics in large social networks based on the principle of locality of communication: a node's communication energy is spent mostly within its own "social area," the locality of the node.
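    The notion of a stable aggregate statistic amid heavy link turnover can be illustrated with a toy example; the weekly link sets below are invented, not LiveJournal data:

```python
# Hypothetical weekly snapshots of blogger-to-blogger links. Individual
# links churn heavily from week to week, yet an aggregate statistic
# (here, the turnover fraction itself) stays stable.
weeks = [
    {("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"), ("d", "e")},
    {("a", "b"), ("b", "e"), ("c", "a"), ("d", "a"), ("e", "c")},
    {("b", "e"), ("a", "d"), ("c", "b"), ("e", "a"), ("d", "c")},
]

def turnover(prev, curr):
    """Fraction of the current week's links that did not exist last week."""
    return len(curr - prev) / len(curr)

turnovers = [turnover(weeks[i], weeks[i + 1]) for i in range(len(weeks) - 1)]
# Most links are new each week, yet the fraction itself is steady; a jump
# in such a normally stable statistic would flag a phase transition.
print(turnovers)
```

Any model of the communication dynamics should reproduce both the high per-link churn and the stability of statistics like this one.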

  13. An operational definition of a statistically meaningful trend.

    PubMed

    Bryhn, Andreas C; Dimberg, Peter H

    2011-04-28

    Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data points is large, a trend may be statistically significant even if the data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends, referred to as statistical meaningfulness, which is a stricter quality criterion than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions of the interval mean values on time. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
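    A minimal sketch of the statistical-meaningfulness test, assuming an illustrative ten-point yearly series divided into five intervals; only the r² ≥ 0.65 half of the criterion is shown, since the p ≤ 0.05 half needs a t-distribution CDF (e.g. from scipy.stats):

```python
# Illustrative yearly time series (values invented for this sketch).
series = [2.1, 2.3, 2.0, 2.6, 2.8, 2.5, 3.0, 3.2, 2.9, 3.4]
n_intervals = 5
size = len(series) // n_intervals

# Step 1: divide the series into intervals and compute interval means.
means = [sum(series[i * size:(i + 1) * size]) / size
         for i in range(n_intervals)]
times = list(range(n_intervals))

# Step 2: r^2 from the regression of interval means on interval index.
mt = sum(times) / n_intervals
mm = sum(means) / n_intervals
sxy = sum((t - mt) * (m - mm) for t, m in zip(times, means))
sxx = sum((t - mt) ** 2 for t in times)
syy = sum((m - mm) ** 2 for m in means)
r_squared = sxy ** 2 / (sxx * syy)

# Step 3: apply the r^2 >= 0.65 threshold from the criterion.
print(f"interval means: {means}, r^2 = {r_squared:.3f}")
print("statistically meaningful" if r_squared >= 0.65 else "not meaningful")
```

In the full test one would repeat this for several interval divisions and also require p ≤ 0.05 for the same regression.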

  14. Trend analysis of body weight parameters, mortality, and incidence of spontaneous tumors in Tg.rasH2 mice.

    PubMed

    Paranjpe, Madhav G; Denton, Melissa D; Vidmar, Tom; Elbekai, Reem H

    2014-01-01

    Carcinogenicity studies have been performed as conventional 2-year rodent studies for at least 3 decades, whereas short-term carcinogenicity studies in transgenic mice, such as Tg.rasH2, have only been performed over the last decade. In the 2-year conventional rodent studies, interlinked problems that complicate the interpretation of findings, such as increasing trends in initial body weights, increased body weight gains, a high incidence of spontaneous tumors, and low survival, are well established. However, these end points have not been evaluated in the short-term carcinogenicity studies involving Tg.rasH2 mice. In this article, we present a retrospective analysis of data obtained from control groups in 26-week carcinogenicity studies conducted in Tg.rasH2 mice since 2004. Our analysis showed statistically significant decreasing trends in initial body weights of both sexes. Although the terminal body weights did not show any significant trends, there was a statistically significant increasing trend in body weight gains, more so in males than in females, which correlated with increasing trends in food consumption. There were no statistically significant alterations in mortality trends. In addition, the incidence of all common spontaneous tumors remained fairly constant, with no statistically significant differences in trends. © The Author(s) 2014.

  15. Text mining and network analysis to find functional associations of genes in high altitude diseases.

    PubMed

    Bhasuran, Balu; Subramanian, Devika; Natarajan, Jeyakumar

    2018-05-02

    Travel to elevations above 2500 m is associated with the risk of developing one or more forms of acute altitude illness such as acute mountain sickness (AMS), high altitude cerebral edema (HACE) or high altitude pulmonary edema (HAPE). Our work aims to identify the functional associations of genes involved in high altitude diseases. We identified the gene networks responsible for high altitude diseases by using gene co-occurrence statistics from the literature together with network analysis. First, we mined literature data from PubMed on high-altitude diseases and extracted the co-occurring gene pairs. Next, gene pairs were ranked by their co-occurrence frequency. Finally, a gene association network was created using statistical measures to explore potential relationships. Network analysis revealed that EPO, ACE, IL6, and TNF are among the top five genes found to co-occur with 20 or more genes, while the association between the EPAS1 and EGLN1 genes is strongly substantiated. The network constructed in this study proposes a large number of genes that work in concert under high altitude conditions. Overall, the result provides a good reference for further study of the genetic relationships in high altitude diseases. Copyright © 2018 Elsevier Ltd. All rights reserved.
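    The co-occurrence counting and ranking step can be sketched on a toy corpus; the sentences and gene mentions below are invented for illustration, not the study's mined literature:

```python
from collections import Counter
from itertools import combinations

# Toy corpus of abstract-like sentences; the texts and co-occurrences
# are illustrative only.
abstracts = [
    "EPAS1 and EGLN1 variants are associated with adaptation to hypoxia",
    "EPO expression rises at altitude and interacts with EPAS1",
    "ACE polymorphisms and EPO levels were studied in HAPE patients",
    "EGLN1 regulates the hypoxia response together with EPAS1",
]
genes = {"EPAS1", "EGLN1", "EPO", "ACE"}

# Count gene pairs that co-occur within the same abstract.
pair_counts = Counter()
for text in abstracts:
    present = sorted(genes & set(text.split()))
    for pair in combinations(present, 2):
        pair_counts[pair] += 1

# Rank pairs by co-occurrence frequency; the ranked pairs would then
# become weighted edges of the gene association network.
for pair, count in pair_counts.most_common():
    print(pair, count)
```

A real pipeline would add named-entity recognition for gene mentions and statistical significance filtering before building the network.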

  16. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    PubMed

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

    A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, the modest sample sizes of individual cohorts and the restricted availability of individual-level genotype-phenotype data across cohorts limit the conduct of multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single study or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of the original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  17. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis

    PubMed Central

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J.; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T.; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-01-01

    Motivation: A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. Results: We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Availability and implementation: Code is available at https://github.com/aalto-ics-kepaco. Contacts: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153689

  18. Effects of ozone (O3) therapy on cisplatin-induced ototoxicity in rats.

    PubMed

    Koçak, Hasan Emre; Taşkın, Ümit; Aydın, Salih; Oktay, Mehmet Faruk; Altınay, Serdar; Çelik, Duygu Sultan; Yücebaş, Kadir; Altaş, Bengül

    2016-12-01

    The aim of this study is to investigate the effect of rectal ozone and intratympanic ozone therapy on cisplatin-induced ototoxicity in rats. Eighteen female Wistar albino rats were included in our study. External auditory canal and tympanic membrane examinations were normal in all rats. The rats were randomly divided into three groups. Initially, all the rats were tested with distortion product otoacoustic emissions (DPOAE), and emissions were measured normally. All rats were injected with 5-mg/kg/day cisplatin for 3 days intraperitoneally. Ototoxicity had developed in all rats, as confirmed with DPOAE after 1 week. Group 1 received both rectal and intratympanic ozone therapy. Group 2 received no treatment and served as the control group. The rats in Group 3 were treated with rectal ozone. All the rats were tested with DPOAE under general anesthesia, and all were sacrificed for pathological examination 1 week after ozone administration. Their cochleas were removed, and the outer hair cell damage and stria vascularis damage were examined. In the statistical analysis, a statistically significant difference between Group 1 and Group 2 was observed at all frequencies in the DPOAE test, as was a statistically significant difference between Group 2 and Group 3. However, no statistically significant difference was observed between Group 1 and Group 3 in the DPOAE test. According to histopathological scoring, the outer hair cell damage score was statistically significantly higher in Group 2 than in Group 1, and also statistically significantly higher in Group 2 than in Group 3. Outer hair cell damage scores were low in Groups 1 and 3, with no statistically significant difference between these groups. There was no statistically significant difference between the groups in terms of the stria vascularis damage score.
Systemic ozone gas therapy is effective in the treatment of cell damage in cisplatin-induced ototoxicity. The intratympanic administration of ozone gas does not have any additional advantage over the rectal administration.

  19. Estimation of Mouse Organ Locations Through Registration of a Statistical Mouse Atlas With Micro-CT Images

    PubMed Central

    Stout, David B.; Chatziioannou, Arion F.

    2012-01-01

    Micro-CT is widely used in preclinical studies of small animals. Due to the low soft-tissue contrast in typical studies, segmentation of soft-tissue organs from noncontrast-enhanced micro-CT images is a challenging problem. Here, we propose an atlas-based approach for estimating the major organs in mouse micro-CT images. A statistical atlas of major trunk organs was constructed based on 45 training subjects. The statistical shape model technique was used to include inter-subject anatomical variations. The shape correlations between different organs were described using a conditional Gaussian model. For registration, first the high-contrast organs in micro-CT images were registered by fitting the statistical shape model, while the low-contrast organs were subsequently estimated from the high-contrast organs using the conditional Gaussian model. The registration accuracy was validated based on 23 noncontrast-enhanced and 45 contrast-enhanced micro-CT images. Three different accuracy metrics (Dice coefficient, organ volume recovery coefficient, and surface distance) were used for evaluation. The Dice coefficients vary from 0.45 ± 0.18 for the spleen to 0.90 ± 0.02 for the lungs, the volume recovery coefficients vary from for the liver to 1.30 ± 0.75 for the spleen, and the surface distances vary from 0.18 ± 0.01 mm for the lungs to 0.72 ± 0.42 mm for the spleen. The registration accuracy of the statistical atlas was compared with two publicly available single-subject mouse atlases, i.e., the MOBY phantom and the DIGIMOUSE atlas, and the results showed that the statistical atlas is more accurate than the single-subject atlases. To evaluate the influence of the number of training subjects, different numbers of training subjects were used for atlas construction and registration. The results showed an improvement in registration accuracy when more training subjects were used for the atlas construction. 
The statistical atlas-based registration was also compared with the thin-plate spline based deformable registration, commonly used in mouse atlas registration. The results revealed that the statistical atlas has the advantage of improving the estimation of low-contrast organs. PMID:21859613
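The Dice coefficient used as an accuracy metric in the record above is a standard overlap measure between two segmentations, 2|A∩B| / (|A| + |B|). A minimal sketch on binary organ masks (hypothetical helper, not the authors' code):

```python
import numpy as np

def dice_coefficient(seg_a, seg_b):
    """Dice overlap between two boolean segmentation masks:
    2|A ∩ B| / (|A| + |B|). Equals 1.0 for identical masks and
    0.0 for disjoint non-empty masks."""
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom
```

In atlas validation, each estimated organ mask is compared voxel-wise against a manually segmented reference; low-contrast organs such as the spleen typically score lower, consistent with the 0.45 vs 0.90 range reported above.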

  20. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to interpret gear fault signatures, which ordinary users usually cannot easily provide. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally showed that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance prediction accuracies of existing K-nearest neighbors based methods and extend identification of 3 different gear crack levels to identification of 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. 
Based on the new significant statistical features, some other popular statistical models including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
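The K-nearest neighbors classifier at the core of the method above can be sketched generically as follows. This is an illustration only (names and data are hypothetical); the developed method additionally builds 620 wavelet-packet statistical features and reduces their dimensionality before classification, steps omitted here:

```python
import numpy as np

def knn_predict(train_X, train_y, test_X, k=3):
    """Plain K-nearest neighbors: Euclidean distance, majority vote
    among the k closest training samples. Features should be
    standardized first so that no single statistical feature
    dominates the distance."""
    preds = []
    for x in test_X:
        d = np.linalg.norm(train_X - x, axis=1)   # distances to all training points
        nearest = train_y[np.argsort(d)[:k]]      # labels of the k nearest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])     # majority vote
    return np.array(preds)
```

In a crack-level setting, `train_y` would hold the 5 crack-level labels and each row of the feature matrices would be the reduced statistical features of one vibration record; `k` is one of the parameters the communication reports tuning.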

  1. A Predictive Statistical Model of Navy Career Enlisted Retention Behavior Utilizing Economic Variables.

    DTIC Science & Technology

    1980-12-01

    career retention rates, and to predict future career retention rates in the Navy. The statistical model utilizes economic variables as predictors... The model developed has a high correlation with Navy career retention rates. The problem of Navy career retention has not been adequately studied... findings indicate Navy policymakers must be cognizant of the relationships of economic factors to Navy career retention rates.

  2. A national streamflow network gap analysis

    USGS Publications Warehouse

    Kiang, Julie E.; Stewart, David W.; Archfield, Stacey A.; Osborne, Emily B.; Eng, Ken

    2013-01-01

    The U.S. Geological Survey (USGS) conducted a gap analysis to evaluate how well the USGS streamgage network meets a variety of needs, focusing on the ability to calculate various statistics at locations that have streamgages (gaged) and that do not have streamgages (ungaged). This report presents the results of analysis to determine where there are gaps in the network of gaged locations, how accurately desired statistics can be calculated with a given length of record, and whether the current network allows for estimation of these statistics at ungaged locations. The analysis indicated that there is variability across the Nation’s streamflow data-collection network in terms of the spatial and temporal coverage of streamgages. In general, the Eastern United States has better coverage than the Western United States. The arid Southwestern United States, Alaska, and Hawaii were observed to have the poorest spatial coverage, using the dataset assembled for this study. Except in Hawaii, these areas also tended to have short streamflow records. Differences in hydrology lead to differences in the uncertainty of statistics calculated in different regions of the country. Arid and semiarid areas of the Central and Southwestern United States generally exhibited the highest levels of interannual variability in flow, leading to larger uncertainty in flow statistics. At ungaged locations, information can be transferred from nearby streamgages if there is sufficient similarity between the gaged watersheds and the ungaged watersheds of interest. Areas where streamgages exhibit high correlation are most likely to be suitable for this type of information transfer. The areas with the most highly correlated streamgages appear to coincide with mountainous areas of the United States. Lower correlations are found in the Central United States and coastal areas of the Southeastern United States. 
Information transfer from gaged basins to ungaged basins is also most likely to be successful when basin attributes show high similarity. At the scale of the analysis completed in this study, the attributes of basins upstream of USGS streamgages cover the full range of basin attributes observed at potential locations of interest fairly well. Some exceptions included very high or very low elevation areas and very arid areas.

  3. The Use of Cronbach's Alpha When Developing and Reporting Research Instruments in Science Education

    NASA Astrophysics Data System (ADS)

    Taber, Keith S.

    2017-06-01

    Cronbach's alpha is a statistic commonly quoted by authors to demonstrate that tests and scales that have been constructed or adopted for research projects are fit for purpose. Cronbach's alpha is regularly adopted in studies in science education: it was referred to in 69 different papers published in 4 leading science education journals in a single year (2015)—usually as a measure of reliability. This article explores how this statistic is used in reporting science education research and what it represents. Authors often cite alpha values with little commentary to explain why they feel this statistic is relevant and seldom interpret the result for readers beyond citing an arbitrary threshold for an acceptable value. Those authors who do offer readers qualitative descriptors interpreting alpha values adopt a diverse and seemingly arbitrary terminology. More seriously, illustrative examples from the science education literature demonstrate that alpha may be acceptable even when there are recognised problems with the scales concerned. Alpha is also sometimes inappropriately used to claim an instrument is unidimensional. It is argued that a high value of alpha offers limited evidence of the reliability of a research instrument, and that indeed a very high value may actually be undesirable when developing a test of scientific knowledge or understanding. Guidance is offered to authors reporting, and readers evaluating, studies that present Cronbach's alpha statistic as evidence of instrument quality.
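For readers evaluating reported alpha values, the statistic itself is simple to compute for k items: alpha = k/(k-1) × (1 − Σ item variances / variance of total score). A minimal sketch (hypothetical helper):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix of shape
    (n_respondents, n_items):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of each respondent's total
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Note, in line with the article's argument: a high value here only reflects inter-item covariance relative to total-score variance. It does not establish unidimensionality, and on a knowledge test a very high alpha may simply signal redundant items.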

  4. High levels of cynical distrust partly predict premature mortality in middle-aged to ageing men.

    PubMed

    Šmigelskas, Kastytis; Joffė, Roza; Jonynienė, Jolita; Julkunen, Juhani; Kauhanen, Jussi

    2017-08-01

    The aim of this study was to evaluate the effect of cynical distrust on mortality in middle-aged and aging men. The analysis is based on the Kuopio Ischemic Heart Disease study, with follow-up from 1984 to 2011. The sample consisted of 2682 men, aged 42-61 years at baseline. Data on mortality were provided by the National Death Registry, and causes of death were classified by the National Center of Statistics of Finland. Cynical distrust was measured at baseline using the Cynical Distrust Scale. Survival analyses were conducted using Cox regression models. In crude estimates after 28 years of follow-up, high cynical distrust was associated with 1.5-1.7 times higher hazards of earlier death compared to low cynical distrust. After adjustment for conventional risk factors, high cynical distrust remained significantly associated with CVD mortality among CVD-free men, while the association with non-CVD mortality in the study sample was consistent but not statistically significant. The risk effects were more pronounced after 12-20 years of follow-up rather than earlier or later. To conclude, high cynical distrust is associated with increased risk of CVD mortality in CVD-free men. The associations with non-CVD mortality are weaker and do not reach statistical significance.

  5. A comparative study of hematological parameters of α and β thalassemias in a high prevalence zone: Saudi Arabia

    PubMed Central

    Mehdi, Syed Riaz; Al Dahmash, Badr Abdullah

    2011-01-01

    BACKGROUND AND AIMS: Saudi Arabia falls in the high-prevalence zone of α and β thalassemias. Early screening for the type of thalassemia is essential for further investigations and management. The study was carried out to differentiate the type of thalassemia based on red cell indices and other hematological parameters. MATERIALS AND METHODS: The study was carried out on 991 clinically suspected cases of thalassemia in Riyadh, Saudi Arabia. The hematological parameters were studied on a Coulter STKS. Cellulose acetate hemoglobin electrophoresis and high-performance liquid chromatography (HPLC) were performed on all the blood samples. Gene deletion studies were carried out by the restriction fragment length polymorphism (RFLP) technique using the restriction endonuclease BamHI. STATISTICAL ANALYSIS: Statistical analysis was performed on SPSS version 11.5. RESULTS: The hemoglobin electrophoresis and gene studies revealed 406 (40.96%) cases of β thalassemia trait and 59 (5.95%) cases of β thalassemia major, including adults and children. A total of 426 cases of various deletion forms of α thalassemia were seen. Microcytosis was a common feature in β thalassemia trait and in the (-α/-α) and (--/αα) types of α thalassemia. MCH was a more significant distinguishing feature among thalassemias. β thalassemia major and α thalassemia (-α/αα) had almost normal hematological parameters. CONCLUSION: MCV and RBC counts are not statistically significant features for discriminating between α and β thalassemias. There is a need to develop a discrimination index to differentiate between α and β thalassemia traits, along the lines of the discriminant indices available for distinguishing β thalassemia trait from iron deficiency anemia. PMID:22345994

  6. Emotional personality/proximity versus emotional authenticity in patient-physician communication in healthy study participants, and in patients with benign breast disease, and breast cancer: a prospective case-control study in Finland.

    PubMed

    Eskelinen, Matti; Korhonen, Riika; Selander, Tuomas; Ollonen, Paula

    2015-03-01

    The associations between emotional personality, proximity and authenticity in patient-physician communication during breast cancer (BC) consultations are rarely considered together in a prospective study. We therefore investigated emotional personality/proximity versus authenticity in patient-physician communication in healthy study subjects (HSS) and in patients with benign breast disease (BBD) and breast cancer (BC). In the Kuopio Breast Cancer Study, 115 women with breast symptoms were evaluated regarding emotional personality, proximity and authenticity in their patient-physician communication before any diagnostic procedures were carried out. Emotional personality and emotional proximity in patient-physician communication were highly significantly positively correlated in the BBD group, and the kappa values for emotional personality versus emotional proximity in the HSS, BBD and BC groups were statistically significant. There was also a highly significant positive correlation between emotional personality and emotional authenticity in the HSS, BBD and BC groups, and the corresponding kappa values were statistically significant. There was a highly significant positive correlation between emotional proximity and emotional authenticity in the BBD group, and the weighted kappa values in the BBD group were statistically significant. The results of the present study support a powerful link between emotional personality/proximity and emotional authenticity, and provide new information on patient-physician communication in the HSS, BBD and BC groups. This finding is of clinical importance, since during breast disease consultation, barriers to patient-physician communication may be associated with difficulties in early BC diagnosis in the breast cancer diagnostic unit. Copyright© 2015 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.
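The record above reports kappa values as agreement statistics. For reference, unweighted Cohen's kappa, the chance-corrected agreement between two categorical ratings, can be sketched as follows (hypothetical helper, not the study's code; the study also used a weighted variant, which additionally penalizes larger disagreements):

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Unweighted Cohen's kappa between two raters' categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)                         # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c)   # agreement expected by chance
             for c in cats)
    return (po - pe) / (1 - pe)
```

Kappa of 1 means perfect agreement, 0 means agreement no better than chance, which is why it is preferred over raw percent agreement for ratings such as the emotionality scores here.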

  7. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting.

    PubMed

    Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F

    2010-07-19

    A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. 
The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic.
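The permutation tests discussed above follow a generic recipe: recompute the test statistic after randomly shuffling case/control labels, which breaks any spatial association by construction, and rank the observed statistic against that null distribution. A minimal, generic sketch (the function and the example statistic are hypothetical illustrations, not the GAM-specific test):

```python
import numpy as np

def permutation_p_value(stat_fn, labels, data, n_perm=999, seed=0):
    """Generic label-permutation test. stat_fn(labels, data) returns a
    scalar statistic where larger values mean stronger association;
    the p-value is the rank of the observed statistic among permuted
    ones, counting the observed arrangement itself."""
    rng = np.random.default_rng(seed)
    observed = stat_fn(labels, data)
    count = 1  # include the observed arrangement
    for _ in range(n_perm):
        if stat_fn(rng.permutation(labels), data) >= observed:
            count += 1
    return count / (n_perm + 1)
```

Including the observed arrangement in the count keeps the p-value strictly positive and the test valid; in the spatial GAM setting, `stat_fn` would measure the contribution of the bivariate LOESS smoothing term rather than the simple mean difference used in the test below.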

  8. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    PubMed Central

    2010-01-01

    Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. 
Conclusions The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic. PMID:20642827

  9. A quadratically regularized functional canonical correlation analysis for identifying the global structure of pleiotropy with NGS data

    PubMed Central

    Zhu, Yun; Fan, Ruzong; Xiong, Momiao

    2017-01-01

    Investigating the pleiotropic effects of genetic variants can increase statistical power, provide important information to achieve deep understanding of the complex genetic structures of disease, and offer powerful tools for designing effective treatments with fewer side effects. However, the current multiple phenotype association analysis paradigm lacks breadth (number of phenotypes and genetic variants jointly analyzed at the same time) and depth (hierarchical structure of phenotype and genotypes). A key issue for high dimensional pleiotropic analysis is to effectively extract informative internal representation and features from high dimensional genotype and phenotype data. To explore correlation information of genetic variants, effectively reduce data dimensions, and overcome critical barriers in advancing the development of novel statistical methods and computational algorithms for genetic pleiotropic analysis, we propose a new statistical method, referred to as quadratically regularized functional CCA (QRFCCA), for association analysis which combines three approaches: (1) quadratically regularized matrix factorization, (2) functional data analysis and (3) canonical correlation analysis (CCA). Large-scale simulations show that the QRFCCA has a much higher power than that of the ten competing statistics while retaining the appropriate type I errors. To further evaluate performance, the QRFCCA and ten other statistics are applied to the whole genome sequencing dataset from the TwinsUK study. We identify a total of 79 genes with rare variants and 67 genes with common variants significantly associated with the 46 traits using QRFCCA. The results show that the QRFCCA substantially outperforms the ten other statistics. PMID:29040274

  10. Preservice Teachers' Memories of Their Secondary Science Education Experiences

    NASA Astrophysics Data System (ADS)

    Hudson, Peter; Usak, Muhammet; Fančovičová, Jana; Erdoğan, Mehmet; Prokop, Pavol

    2010-12-01

    Understanding preservice teachers' memories of their education may aid towards articulating high-impact teaching practices. This study describes 246 preservice teachers' perceptions of their secondary science education experiences through a questionnaire and a 28-item survey. ANOVA indicated statistically significant differences in participants' memories of science on 15 of the 28 survey items. Descriptive statistics in SPSS further showed that a teacher's enthusiastic nature (87%) and positive attitude towards science (87%) were regarded as highly memorable. In addition, explaining abstract concepts well (79%) and guiding the students' conceptual development with practical science activities (73%) may be considered memorable secondary science teaching strategies. Implementing science lessons with one or more of these memorable science teaching practices may "make a difference" towards influencing high school students' positive long-term memories about science and their science education. Further research in other key learning areas may provide a clearer picture of high-impact teaching and a way to enhance pedagogical practices.

  11. High throughput single cell counting in droplet-based microfluidics.

    PubMed

    Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie

    2017-05-02

    Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at a high-throughput, used to characterize cell encapsulation and cell viability during incubation in droplets.

  12. A Novel Candidate Molecule in Pathological Grading Of Gliomas: ELABELA.

    PubMed

    Artas, Gokhan; Ozturk, Sait; Kuloglu, Tuncay; Dagli, Adile Ferda; Gonen, Murat; Artas, Hakan; Aydin, Suleyman; Erol, Fatih Serhat

    2018-04-06

    This study aimed to investigate the possible role of ELABELA (ELA) in the histopathological grading of gliomas. We retrospectively assessed pathological specimens of patients who underwent surgery for intracranial space-occupying lesions. Only primary glioma specimens were included in this study. We enrolled 11 patients histologically diagnosed with low-grade glioma and 22 patients with high-grade glioma. The ELA antibody was applied to 4-6-µm-thick sections obtained from paraffin blocks. Histoscores were calculated using the distribution and intensity of staining immunoreactivity. An independent-samples t-test was used for two-group comparisons, whereas one-way analysis of variance was used for the other assessments. P < 0.05 was considered statistically significant. The histoscores of the control brain, low-grade glioma, and high-grade glioma tissues were found to be 0.08, 0.37, and 0.92, respectively. The difference in ELA immunoreactivity between the control brain tissue and glioma tissue was statistically significant (p < 0.05). In addition, a statistically significant increase was observed in ELA immunoreactivity in high-grade glioma tissues compared with that in low-grade glioma tissues (p < 0.05). ELA has an angiogenetic role in the progression of glial tumors. ELA, which is an endogenous ligand of the apelin receptor, activates the apelinergic system and causes the progression of glial tumors. Further studies with a large number of patients are necessary to investigate the angiogenetic role of ELA in glial tumors.

  13. Social Studies: Appendix for Elementary, Middle, and High School Guides for Teaching about Human Rights.

    ERIC Educational Resources Information Center

    Detroit Public Schools, MI. Dept. of Curriculum Development Services.

    Seventy documents including primary source materials, simulations, mock trials, short stories, vignettes, and statistical data are provided for the implementation of the elementary, middle, and high school human rights curriculum. Original documents include: (1) the Universal Declaration of Human Rights; (2) the Declaration of the Rights of the…

  14. An Analysis of Mathematics Course Sequences for Low Achieving Students at a Comprehensive Technical High School

    ERIC Educational Resources Information Center

    Edge, D. Michael

    2011-01-01

    This non-experimental study attempted to determine how the different prescribed mathematic tracks offered at a comprehensive technical high school influenced the mathematics performance of low-achieving students on standardized assessments of mathematics achievement. The goal was to provide an analysis of any statistically significant differences…

  15. The Problem of Attendance: Research Findings and Solutions.

    ERIC Educational Resources Information Center

    Levanto, Joseph

    This paper examines the growing problem of high school absenteeism and presents data gathered in a study of student attendance in a large Connecticut high school. Included are graphs displaying schoolwide patterns of absenteeism and a number of statistical tables containing attendance data related to such factors as student age, class, sex, race,…

  16. Establishing Differences between Diversity Requirements and Other Courses with Varying Degrees of Diversity Inclusivity

    ERIC Educational Resources Information Center

    Nelson Laird, Thomas F.; Engberg, Mark E.

    2011-01-01

    This study examines how diversity requirements differ from courses that are highly inclusive or less inclusive of diversity. Results suggest that instructor characteristics are statistically different and that highly inclusive and less inclusive diversity courses score highest and lowest, respectively, on measures of effective teaching compared…

  17. Science Achievement and Students' Self-Confidence and Interest in Science: A Taiwanese Representative Sample Study

    ERIC Educational Resources Information Center

    Chang, Chun-Yen; Cheng, Wei-Ying

    2008-01-01

    The interrelationship between senior high school students' science achievement (SA) and their self-confidence and interest in science (SCIS) was explored with a representative sample of 1,044 11th-grade students from 30 classes attending four high schools throughout Taiwan. Statistical analyses indicated that a statistically…

  18. Texas Public School Attrition Study, 2011-12. IDRA Report

    ERIC Educational Resources Information Center

    Johnson, Roy L.; Montes, Felix

    2012-01-01

    This document contains 3 statistical reports. The first report, "Attrition Rate Decline Seems Promising--Though High Schools are Still Losing One in Four Students" (by Roy L. Johnson), presents results of long-term trend assessments of attrition data in Texas public high schools. The second report, "Slow Declining Pace Keeps Zero…

  19. Comparison of anti-plaque efficacy between a low and high cost dentifrice: A short term randomized double-blind trial

    PubMed Central

    Ganavadiya, Rahul; Shekar, B. R. Chandra; Goel, Pankaj; Hongal, Sudheer G.; Jain, Manish; Gupta, Ruchika

    2014-01-01

    Objective: The aim of this study was to compare the anti-plaque efficacy of a low and a high cost commercially available toothpaste among adolescents aged 13-20 years in a Residential Home, Bhopal, India. Materials and Methods: The study was a randomized double-blind parallel clinical trial conducted in a Residential Home, Bhopal, India. A total of 65 patients with established dental plaque and gingivitis were randomly assigned to either the low cost or the high cost dentifrice group for 4 weeks. The plaque and gingival scores at baseline and post-intervention were assessed and compared. Statistical analysis was performed using the paired t-test and the independent sample t-test. Statistical significance was set at 0.05. Results: Results indicated a significant reduction in plaque and gingival scores in both groups post-intervention compared with baseline. The difference between the groups was not significant. No adverse events were reported and both dentifrices were well-tolerated. Conclusion: The low cost dentifrice is as effective as the high cost dentifrice in reducing plaque and gingival inflammation. PMID:25202220
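
    The paired t-test used in trials like this one compares each subject's baseline score with the same subject's post-intervention score. A minimal sketch of the computation, using hypothetical plaque-index values rather than the trial's data:

    ```python
    import math

    def paired_t(baseline, followup):
        """Paired t statistic for pre/post scores measured on the same subjects."""
        diffs = [b - f for b, f in zip(baseline, followup)]
        n = len(diffs)
        mean = sum(diffs) / n
        var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
        return mean / math.sqrt(var / n)  # mean difference over its standard error

    # Hypothetical plaque scores for 6 subjects at baseline and after 4 weeks
    baseline = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2]
    week4 = [1.2, 1.0, 1.5, 1.1, 1.3, 1.4]
    t = paired_t(baseline, week4)  # compare against the t critical value, df = n - 1
    ```

    With 5 degrees of freedom the two-sided 0.05 critical value is about 2.571, so any |t| above that clears the significance threshold the study uses.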

  20. A Study of Particle Beam Spin Dynamics for High Precision Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiedler, Andrew J.

    In the search for physics beyond the Standard Model, high precision experiments to measure fundamental properties of particles are an important frontier. One group of such measurements involves magnetic dipole moment (MDM) values as well as searching for an electric dipole moment (EDM), both of which could provide insights about how particles interact with their environment at the quantum level and if there are undiscovered new particles. For these types of high precision experiments, minimizing statistical uncertainties in the measurements plays a critical role. This work leverages computer simulations to quantify the effects of statistical uncertainty for experiments investigating spin dynamics. In it, analysis of beam properties and lattice design effects on the polarization of the beam is performed. As a case study, the beam lines that will provide polarized muon beams to the Fermilab Muon g-2 experiment are analyzed to determine the effects of correlations between the phase space variables and the overall polarization of the muon beam.

  1. Conditions, interventions, and outcomes in nursing research: a comparative analysis of North American and European/International journals. (1981-1990).

    PubMed

    Abraham, I L; Chalifoux, Z L; Evers, G C; De Geest, S

    1995-04-01

    This study compared the conceptual foci and methodological characteristics of research projects testing the effects of nursing interventions, published in four general nursing research journals with predominantly North American, and two with predominantly European/International, authorship and readership. Dimensions and variables of comparison included: nature of subjects, design issues, statistical methodology, statistical power, and types of interventions and outcomes. Although some differences emerged, the most striking and consistent finding was the absence of statistically significant differences in the content foci and methodological parameters of the intervention studies published in the two groups of journals. We conclude that European/International and North American nursing intervention studies, as reported in major general nursing research journals, are highly similar in the parameters studied, yet in need of overall improvement. Certainly, there is no empirical support for the common (explicit or implicit) ethnocentric American bias that leadership in nursing intervention research resides with and in the United States of America.

  2. Infectious bursal disease: seroprevalence and associated risk factors in major poultry rearing areas of Ethiopia.

    PubMed

    Jenbreie, Shiferaw; Ayelet, Gelagay; Gelaye, Esayas; Kebede, Fekadu; Lynch, Stacey E; Negussie, Haileleul

    2013-01-01

    The study was conducted in eight districts of Ethiopia with the objectives of determining the seroprevalence and associated risk factors of infectious bursal disease (IBD). Of the 2,597 chicken serum samples examined using ELISA, 83.1 % were found positive. The highest seroprevalence was found at Mekele (90.3 %) and the lowest at Gondar district (69.8 %); these differences among the study areas were statistically significant (p < 0.05). The highest seroprevalence was found in crossbred chickens (91.4 %) and the lowest in indigenous chickens (81.4 %); this difference among the three breeds was statistically significant (p < 0.05), whereas sex was not (p > 0.05). Seroprevalence was highest in the young (≤ 8 weeks) age group (86.6 %) and lowest in adults (>8 weeks) (72 %), a difference that was also statistically significant (p < 0.05). Across production systems, seroprevalence was higher in the intensive production system (85.9 %) than in the extensive production system (81.6 %); this difference was also statistically significant (p < 0.05).

  3. Combining optical remote sensing, agricultural statistics and field observations for culture recognition over a peri-urban region

    NASA Astrophysics Data System (ADS)

    Delbart, Nicolas; Emmanuelle, Vaudour; Fabienne, Maignan; Catherine, Ottlé; Jean-Marc, Gilliot

    2017-04-01

    This study explores the potential of multi-temporal optical remote sensing, with high revisit frequency, to derive missing information on the agricultural calendar and crop types over the agricultural lands of the Versailles plain in the western Paris suburbs. It complements past and ongoing studies on the use of radar and high spatial resolution optical remote sensing to monitor agricultural practices in this study area (e.g. Vaudour et al. 2014). Agricultural statistics, such as the Land Parcel Identification System (LPIS) for France, make it possible to identify the annual crop for each digitized declared field of this land parcel registry. However, within each declared field several cropped plots and a diversity of practices may exist, marked by agricultural rotations that vary both spatially and temporally within the field and differ from one year to the next. Even though the new LPIS to be released in 2016 is expected to describe individual plots within declared fields, its attributes may not make it possible to discriminate between winter and spring crops. Here we evaluate the potential of high observation frequency remote sensing to differentiate seasonal crops based essentially on the seasonality of their spectral properties. In particular, we use Landsat data to spatially disaggregate the LPIS statistical data, on the basis of the remote sensing spectral seasonality measured on a number of selected ground-observed fields. This work is carried out in the framework of the CNES TOSCA-PLEIADES-CO of the French Space Agency.

  4. Superposed epoch analysis of physiological fluctuations: possible space weather connections

    NASA Astrophysics Data System (ADS)

    Wanliss, James; Cornélissen, Germaine; Halberg, Franz; Brown, Denzel; Washington, Brien

    2018-03-01

    There is a strong connection between space weather and fluctuations in technological systems. Some studies also suggest a statistical connection between space weather and subsequent fluctuations in the physiology of living creatures. This connection, however, has remained controversial and difficult to demonstrate. Here we present support for a response of human physiology to forcing from the explosive onset of the largest of space weather events—space storms. We consider a case study with over 16 years of high temporal resolution measurements of human blood pressure (systolic, diastolic) and heart rate variability to search for associations with space weather. We find no statistically significant change in human blood pressure but a statistically significant drop in heart rate during the main phase of space storms. Our empirical findings shed light on how human physiology may respond to exogenous space weather forcing.

  5. Superposed epoch analysis of physiological fluctuations: possible space weather connections.

    PubMed

    Wanliss, James; Cornélissen, Germaine; Halberg, Franz; Brown, Denzel; Washington, Brien

    2018-03-01

    There is a strong connection between space weather and fluctuations in technological systems. Some studies also suggest a statistical connection between space weather and subsequent fluctuations in the physiology of living creatures. This connection, however, has remained controversial and difficult to demonstrate. Here we present support for a response of human physiology to forcing from the explosive onset of the largest of space weather events-space storms. We consider a case study with over 16 years of high temporal resolution measurements of human blood pressure (systolic, diastolic) and heart rate variability to search for associations with space weather. We find no statistically significant change in human blood pressure but a statistically significant drop in heart rate during the main phase of space storms. Our empirical findings shed light on how human physiology may respond to exogenous space weather forcing.

  6. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    PubMed

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-02

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
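
    The multiplicity arithmetic at issue can be sketched in a few lines. The numbers below reproduce Gaus's naive expected-count calculation and contrast it with a simple Bonferroni threshold; this is an illustration only, as the NTP's actual decision rules and corrections are more involved:

    ```python
    # Naive multiple-comparisons arithmetic behind the critique.
    alpha, n_tests = 0.05, 4800

    # Expected false positives if all 4800 null hypotheses were true and the
    # tests were independent, each run uncorrected at alpha = 0.05:
    expected_false_positives = alpha * n_tests        # Gaus's figure of 240

    # Family-wise chance of at least one false positive without any correction:
    fwer_uncorrected = 1 - (1 - alpha) ** n_tests     # essentially 1

    # A simple (conservative) Bonferroni-corrected per-test threshold:
    bonferroni_alpha = alpha / n_tests                # roughly 1e-5
    ```

    Even against this very conservative per-test threshold, a tumor-effect p-value on the order of 10^-13 remains significant by many orders of magnitude, which is the authors' point about the mouse liver tumor findings.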

  7. Proper interpretation of chronic toxicity studies and their statistics: A critique of “Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example”

    PubMed Central

    Kissling, Grace E.; Haseman, Joseph K.; Zeiger, Errol

    2014-01-01

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP’s statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800 × 0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP’s decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus’ conclusion that such obvious responses merely “generate a hypothesis” rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. PMID:25261588

  8. Is fertility falling in Zimbabwe?

    PubMed

    Udjo, E O

    1996-01-01

    With a contraceptive prevalence rate of 43% among currently married women, unequalled in sub-Saharan Africa, Zimbabwe's Central Statistical Office (1989) observed that fertility has declined sharply in recent years. Using data from several surveys on Zimbabwe, especially the birth histories of the Zimbabwe Demographic and Health Survey, this study examines fertility trends in Zimbabwe. The results show that the fertility decline in Zimbabwe is modest and that the decline is concentrated among high order births. Multivariate analysis did not show a statistically significant effect of contraception on fertility, partly because a high proportion of Zimbabwean women in the reproductive age group never use contraception due to prevailing pronatalist attitudes in the country.

  9. A mathematical model for HIV and hepatitis C co-infection and its assessment from a statistical perspective.

    PubMed

    Castro Sanchez, Amparo Yovanna; Aerts, Marc; Shkedy, Ziv; Vickerman, Peter; Faggiano, Fabrizio; Salamina, Guiseppe; Hens, Niel

    2013-03-01

    The hepatitis C virus (HCV) and the human immunodeficiency virus (HIV) are a clear threat to public health, with high prevalences especially in high risk groups such as injecting drug users. People with HIV infection who are also infected by HCV suffer from a more rapid progression to HCV-related liver disease and have an increased risk for cirrhosis and liver cancer. Quantifying the impact of HIV and HCV co-infection is therefore of great importance. We propose a new joint mathematical model accounting for co-infection with the two viruses in the context of injecting drug users (IDUs). Statistical concepts and methods are used to assess the model from a statistical perspective, in order to gain further insight into: (i) the comparison and selection of optional model components, (ii) the unknown values of the numerous model parameters, (iii) the parameters to which the model is most 'sensitive' and (iv) the combinations or patterns of values in the high-dimensional parameter space which are most supported by the data. Data from a longitudinal study of heroin users in Italy are used to illustrate the application of the proposed joint model and its statistical assessment. The parameters associated with contact rates (sharing syringes) and the transmission rates per syringe-sharing event are shown to play a major role. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. A multi-analyte serum test for the detection of non-small cell lung cancer

    PubMed Central

    Farlow, E C; Vercillo, M S; Coon, J S; Basu, S; Kim, A W; Faber, L P; Warren, W H; Bonomi, P; Liptay, M J; Borgia, J A

    2010-01-01

    Background: In this study, we appraised a wide assortment of biomarkers previously shown to have diagnostic or prognostic value for non-small cell lung cancer (NSCLC) with the intent of establishing a multi-analyte serum test capable of identifying patients with lung cancer. Methods: Circulating levels of 47 biomarkers were evaluated against patient cohorts consisting of 90 NSCLC and 43 non-cancer controls using commercial immunoassays. Multivariate statistical methods were used on all biomarkers achieving statistical relevance to define an optimised panel of diagnostic biomarkers for NSCLC. The resulting biomarkers were fashioned into a classification algorithm and validated against serum from a second patient cohort. Results: A total of 14 analytes achieved statistical relevance upon evaluation. Multivariate statistical methods then identified a panel of six biomarkers (tumour necrosis factor-α, CYFRA 21-1, interleukin-1ra, matrix metalloproteinase-2, monocyte chemotactic protein-1 and sE-selectin) as being the most efficacious for diagnosing early stage NSCLC. When tested against a second patient cohort, the panel successfully classified 75 of 88 patients. Conclusions: Here, we report the development of a serum algorithm with high specificity for classifying patients with NSCLC against cohorts of various ‘high-risk’ individuals. A high rate of false positives was observed within the cohort in which patients had non-neoplastic lung nodules, possibly as a consequence of the inflammatory nature of these conditions. PMID:20859284

  11. Educational games in geriatric medicine education: a systematic review

    PubMed Central

    2010-01-01

    Objective To systematically review the medical literature to assess the effect of geriatric educational games on the satisfaction, knowledge, beliefs, attitudes and behaviors of health care professionals. Methods We conducted a systematic review following the Cochrane Collaboration methodology, including an electronic search of 10 electronic databases. We included randomized controlled trials (RCT) and controlled clinical trials (CCT) and excluded single arm studies. Populations of interest included members (practitioners or students) of the health care professions. Outcomes of interest were participants' satisfaction, knowledge, beliefs, attitudes, and behaviors. Results We included 8 studies evaluating 5 geriatric role playing games, all conducted in the United States. All studies suffered from one or more methodological limitations but the overall quality of evidence was acceptable. None of the studies assessed the effects of the games on beliefs or behaviors. None of the 8 studies reported a statistically significant difference between the 2 groups in terms of change in attitude. One study assessed the impact on knowledge and found a non-statistically significant difference between the 2 groups. Two studies found levels of satisfaction among participants to be high. We did not conduct a planned meta-analysis because the included studies either reported no statistical data or reported different summary statistics. Conclusion The available evidence does not support the use of role playing interventions in geriatric medical education with the aim of improving attitudes towards the elderly. PMID:20416055

  12. The PRAXIS I Math Study Guide Questions and the PRAXIS I Math Skills Test Questions: A Statistical Study

    ERIC Educational Resources Information Center

    Wilkins, M. Elaine

    2012-01-01

    In 2001, No Child Left Behind introduced the highly qualified status for K-12 teachers, which mandated successful scores on a series of high-stakes tests; within this series is the Pre-Professional Skills Test (PPST) or PRAXIS I. The PPST measures basic K-12 skills in reading, writing, and mathematics. The mathematics sub-test is a national…

  13. Inventory of Personal Skills for Achievement: Validity and Reliability Study of an Instrument for Identifying Educationally At-Risk Junior [High] School Students.

    ERIC Educational Resources Information Center

    Leaseburg, Melinda G.; And Others

    This paper describes the development and testing of an early-warning instrument for identifying at-risk students aged 10-15. A statistically sound test to identify at-risk high school students existed in the Personal Skills Map--Adolescent version (PSM-A). This study used a modified version of the PSM-A, which was renamed Personal Skills for…

  14. Interpretation of correlations in clinical research.

    PubMed

    Hung, Man; Bounsanga, Jerry; Voss, Maren Wright

    2017-11-01

    Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
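
    The sample-size caution above is easy to demonstrate: the t statistic for testing whether a correlation differs from zero grows with the square root of n, so a clinically trivial association becomes "highly significant" in a large enough sample. A small illustration with hypothetical values, not drawn from any study reviewed:

    ```python
    import math

    def correlation_t(r, n):
        """t statistic (df = n - 2) for testing H0: population correlation = 0."""
        return r * math.sqrt((n - 2) / (1 - r * r))

    r = 0.10                            # a trivially small association (r^2 = 1% of variance)
    t_small = correlation_t(r, 100)     # stays below the ~1.96 cutoff: "not significant"
    t_large = correlation_t(r, 10_000)  # far above the cutoff: "significant", same tiny effect
    ```

    The effect size is identical in both cases; only the sample size moved. This is why significance alone, without effect size and clinical meaningfulness, can mislead.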

  15. A Multidisciplinary Approach for Teaching Statistics and Probability

    ERIC Educational Resources Information Center

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  16. A quantitative analysis of factors influencing the professional longevity of high school science teachers in Florida

    NASA Astrophysics Data System (ADS)

    Ridgley, James Alexander, Jr.

    This dissertation is an exploratory quantitative analysis of various independent variables to determine their effect on the professional longevity (years of service) of high school science teachers in the state of Florida for the academic years 2011-2012 to 2013-2014. Data are collected from the Florida Department of Education, National Center for Education Statistics, and the National Assessment of Educational Progress databases. The following research hypotheses are examined: H1 - There are statistically significant differences in Level 1 (teacher variables) that influence the professional longevity of a high school science teacher in Florida. H2 - There are statistically significant differences in Level 2 (school variables) that influence the professional longevity of a high school science teacher in Florida. H3 - There are statistically significant differences in Level 3 (district variables) that influence the professional longevity of a high school science teacher in Florida. H4 - When tested in a hierarchical multiple regression, there are statistically significant differences in Level 1, Level 2, or Level 3 that influence the professional longevity of a high school science teacher in Florida. The professional longevity of a Floridian high school science teacher is the dependent variable. The independent variables are: (Level 1) a teacher's sex, age, ethnicity, earned degree, salary, number of schools taught in, migration count, and various years of service in different areas of education; (Level 2) a school's geographic location, residential population density, average class size, charter status, and SES; and (Level 3) a school district's average SES and average spending per pupil. Statistical analyses using exploratory multiple linear regressions (MLRs) and a hierarchical multiple regression (HMR) are used to test the research hypotheses.
The final results of the HMR analysis show a teacher's age, salary, earned degree (unknown, associate, and doctorate), and ethnicity (Hispanic and Native Hawaiian/Pacific Islander); a school's charter status; and a school district's average SES are all significant predictors of a Florida high school science teacher's professional longevity. Although statistically significant in the initial exploratory MLR analyses, a teacher's ethnicity (Asian and Black), a school's geographic location (city and rural), and a school's SES are not statistically significant in the final HMR model.
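
    The hierarchical-entry logic described above (enter one block of predictors first, then add the next block and examine the change in explained variance) can be sketched with synthetic data. The variable names below echo the study's levels, but the data, coefficients, and block assignments are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    # Invented predictors standing in for the study's levels:
    age, salary, charter = rng.normal(size=(3, n))  # two Level 1 vars, one Level 2 var
    # Synthetic outcome ("years of service") with known structure plus noise:
    years = 2.0 * age + 1.0 * salary + 0.5 * charter + rng.normal(size=n)

    def r_squared(y, *predictors):
        """R^2 from an ordinary least squares fit with an intercept."""
        X = np.column_stack([np.ones(len(y)), *predictors])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid.var() / y.var()

    r2_block1 = r_squared(years, age, salary)           # Level 1 predictors only
    r2_block2 = r_squared(years, age, salary, charter)  # add the Level 2 predictor
    delta_r2 = r2_block2 - r2_block1  # incremental variance explained by block 2
    ```

    The hierarchical question is whether `delta_r2` is large enough (by an F test in practice) to justify keeping the added block; in-sample R^2 can never decrease when predictors are added, which is why the increment, not the raw R^2, is the quantity of interest.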

  17. Middle school science teachers' teaching self-efficacy and students' science self-efficacy

    NASA Astrophysics Data System (ADS)

    Pisa, Danielle

    Project 2061, initiated by the American Association for the Advancement of Science (AAAS), developed recommendations for what is essential in education to produce scientifically literate citizens. Furthermore, they suggest that teachers teach effectively. There is an abundance of literature that focuses on the effects of a teacher's science teaching self-efficacy and a student's science self-efficacy. However, there is no literature on the relationship between the two self-efficacies. This study investigated whether there is a differential change in students' science self-efficacy over an academic term after instruction from a teacher with high science teaching self-efficacy. Quantitative analysis of STEBI scores for teachers showed that mean STEBI scores did not change over one academic term. A t test indicated that there was no statistically significant difference in mean SMTSL scores for students' science self-efficacy over the course of one academic term for a) the entire sample, b) each science class, and c) each grade level. In addition, ANOVA indicated that there was no statistically significant difference in mean gain factor of students rated as low, medium, and high on science self-efficacy as measured by the SMTSL, when students received instruction from a teacher with a high science teaching self-efficacy value as measured by the STEBI. Finally, there was no statistically significant association between the pre- and post-instructional rankings of SMTSL by grade level when students received instruction from a teacher with a high science teaching self-efficacy value as measured by the STEBI. This is the first study of its kind. Earlier work indicated that teaching strategies typically practiced by teachers with high science teaching self-efficacy were beneficial to students' physics self-efficacy (Fencl & Scheel, 2005).
Although the study was unable to determine whether a teacher with high science teaching self-efficacy has a differential effect on students' science self-efficacy, it is worth repeating with a more diverse sample of teachers and students over a longer period of time.

  18. Customizing national models for a medical center's population to rapidly identify patients at high risk of 30-day all-cause hospital readmission following a heart failure hospitalization.

    PubMed

    Cox, Zachary L; Lai, Pikki; Lewis, Connie M; Lindenfeld, JoAnn; Collins, Sean P; Lenihan, Daniel J

    2018-05-28

    Nationally derived models predicting 30-day readmissions following heart failure (HF) hospitalizations yield insufficient discrimination for institutional use. The objective was to develop a customized readmission risk model from the risk factors employed in Medicare models together with institutionally customized factors, and to compare its performance against national models in a single medical center. Medicare patients aged ≥ 65 years hospitalized for HF (n = 1,454) were studied in a derivation cohort and in a separate validation cohort (n = 243). All 30-day hospital readmissions were documented. The primary outcome was risk discrimination (c-statistic) compared to national models. A customized model demonstrated improved discrimination (c-statistic 0.72; 95% CI 0.69 - 0.74) compared to national models (c-statistics of 0.60 and 0.61), with a c-statistic of 0.63 in the validation cohort. Compared to national models, the customized model demonstrated superior readmission risk profiling by distinguishing a high-risk (38.3%) from a low-risk (9.4%) quartile. A customized model improved readmission risk discrimination for HF hospitalizations compared to national models. Copyright © 2018 Elsevier Inc. All rights reserved.
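
    The c-statistic reported in studies like this one is the probability that the model assigns a higher risk score to a randomly chosen readmitted patient than to a randomly chosen non-readmitted one (equivalently, the area under the ROC curve). A minimal rank-based computation on hypothetical scores and outcomes, not the study's data:

    ```python
    def c_statistic(scores, events):
        """Rank-based c-statistic (ROC AUC): P(score of an event case exceeds
        the score of a non-event case), counting ties as half."""
        pos = [s for s, e in zip(scores, events) if e == 1]
        neg = [s for s, e in zip(scores, events) if e == 0]
        wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
                   for p in pos for q in neg)
        return wins / (len(pos) * len(neg))

    # Hypothetical predicted risks and 30-day readmission outcomes (1 = readmitted)
    scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
    readmitted = [1, 1, 0, 1, 0, 0, 1, 0]
    auc = c_statistic(scores, readmitted)  # 0.75 here; 0.5 would be chance
    ```

    On this scale, the national models' 0.60-0.61 is only modestly better than chance, which is why the customized model's 0.72 represents a meaningful gain in discrimination.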

  19. Statistical Engineering in Air Traffic Management Research

    NASA Technical Reports Server (NTRS)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  20. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency.

    PubMed

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    Quality of reporting for Randomized Clinical Trials (RCTs) in oncology was analyzed in several systematic reviews, but, in this setting, there is a paucity of data on outcome definitions and the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess quality of reporting for the description of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between the Methods and Results sections. 826 studies were included in the review, and 698 were OBS. Variables were described in the Methods section for all OBS studies, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement for the reported statistical test between the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the aOR (adjusted Odds Ratio) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and primary endpoints in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always observed. Therefore, we encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies.
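A note on the adjusted odds ratio reported above: in a multivariable logistic regression, the aOR for a covariate is the exponential of its fitted coefficient, since the model is linear in log-odds. A minimal sketch with an illustrative coefficient (not the review's fitted model):

```python
import math

# Logistic regression models log-odds as a linear function of covariates,
# so the (adjusted) odds ratio for a covariate is exp(coefficient).
beta_group3 = -0.654  # hypothetical coefficient for study-size group 3 vs group 1
aor = math.exp(beta_group3)
print(round(aor, 2))  # → 0.52, matching the direction and magnitude reported above
```

An aOR below 1 here means larger studies (group 3) had lower odds of test consistency than the reference group, holding the other covariates fixed.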

  1. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency

    PubMed Central

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    Background Quality of reporting for Randomized Clinical Trials (RCTs) in oncology was analyzed in several systematic reviews, but, in this setting, there is paucity of data for the outcomes definitions and consistency of reporting for statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects, for OBS and RCTs in oncology. Methods From a list of 19 medical journals, three were retained for analysis, after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. Primary outcome was to assess quality of reporting for description of primary outcome measure in RCTs and of variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between Methods and Results parts. Results 826 studies were included in the review, and 698 were OBS. Variables were described in Methods section for all OBS studies and primary endpoint was clearly detailed in Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement for reported statistical test between Methods and Results parts. In multivariable analysis, variable "number of included patients in study" was associated with test consistency: aOR (adjusted Odds Ratio) for third group compared to first group was equal to: aOR Grp3 = 0.52 [0.31–0.89] (P value = 0.009). Conclusion Variables in OBS and primary endpoint in RCTs are reported and described with a high frequency. However, statistical tests consistency between methods and Results sections of OBS is not always noted. 
Therefore, we encourage authors and peer reviewers to verify consistency of statistical tests in oncology studies. PMID:27716793

  2. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being χ² (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  3. From QCD-based hard-scattering to nonextensive statistical mechanical descriptions of transverse momentum spectra in high-energy p p and p p ¯ collisions

    DOE PAGES

    Wong, Cheuk-Yin; Wilk, Grzegorz; Cirto, Leonardo J. L.; ...

    2015-06-22

    Transverse spectra of both jets and hadrons obtained in high-energy $pp$ and $p\bar{p}$ collisions at central rapidity exhibit power-law behavior of $1/p_T^n$ at high $p_T$. The power index $n$ is 4-5 for jet production and is slightly greater for hadron production. Furthermore, the hadron spectra spanning over 14 orders of magnitude down to the lowest $p_T$ region in $pp$ collisions at LHC can be adequately described by a single nonextensive statistical mechanical distribution that is widely used in other branches of science. This suggests indirectly the dominance of the hard-scattering process over essentially the whole $p_T$ region at central rapidity in $pp$ collisions at LHC. We show here direct evidence of such a dominance of the hard-scattering process by investigating the power index of UA1 jet spectra over an extended $p_T$ region and the two-particle correlation data of the STAR and PHENIX Collaborations in high-energy $pp$ and $p\bar{p}$ collisions at central rapidity. We then study how the showering of the hard-scattering product partons alters the power index of the hadron spectra and leads to a hadron distribution that can be cast into a single-particle non-extensive statistical mechanical distribution. Lastly, because of such a connection, the non-extensive statistical mechanical distribution can be considered as a lowest-order approximation of the hard scattering of partons followed by the subsequent process of parton showering that turns the jets into hadrons in high-energy $pp$ and $p\bar{p}$ collisions.
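The nonextensive statistical mechanical distribution referred to above is commonly written as $dN/dp_T \propto (1 + p_T/(nT))^{-n}$, which interpolates between an exponential at low $p_T$ and the power law $1/p_T^n$ at high $p_T$. A numerical check of that high-$p_T$ limit, with illustrative parameters rather than fitted values from the paper:

```python
import math

def tsallis(pT, A=1.0, T=0.15, n=6.6):
    """Nonextensive (Tsallis-like) spectrum: A * (1 + pT/(n*T))**(-n).
    Exponential-like for pT << n*T, power law 1/pT**n for pT >> n*T."""
    return A * (1.0 + pT / (n * T)) ** (-n)

# Local log-log slope d(log f)/d(log pT) at high pT approaches -n.
pT = 500.0
slope = (math.log(tsallis(pT * 1.01)) - math.log(tsallis(pT))) / math.log(1.01)
print(round(slope, 1))  # → -6.6
```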

  4. Spatial and temporal patterns of locally-acquired dengue transmission in northern Queensland, Australia, 1993-2012.

    PubMed

    Naish, Suchithra; Dale, Pat; Mackenzie, John S; McBride, John; Mengersen, Kerrie; Tong, Shilu

    2014-01-01

    Dengue has been a major public health concern in Australia since it re-emerged in Queensland in 1992-1993. We explored spatio-temporal characteristics of locally-acquired dengue cases in northern tropical Queensland, Australia during the period 1993-2012. Locally-acquired notified cases of dengue were collected for northern tropical Queensland from 1993 to 2012. Descriptive spatial and temporal analyses were conducted using geographic information system tools and geostatistical techniques. 2,398 locally-acquired dengue cases were recorded in northern tropical Queensland during the study period. The areas affected by the dengue cases exhibited spatial and temporal variation over the study period. Notified cases of dengue occurred more frequently in autumn. Mapping of dengue by statistical local areas (census units) reveals the presence of substantial spatio-temporal variation over time and place. Statistically significant differences were found in dengue incidence rates between males and females, with more cases in females (χ² = 15.17, d.f. = 1, p<0.01). Differences were observed among age groups, but these were not statistically significant. There was a significant positive spatial autocorrelation of dengue incidence for the four sub-periods, with the Moran's I statistic ranging from 0.011 to 0.463 (p<0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across northern Queensland. Tropical areas are potential high-risk areas for mosquito-borne diseases such as dengue. This study demonstrated that the locally-acquired dengue cases have exhibited a spatial and temporal variation over the past twenty years in northern tropical Queensland, Australia. Therefore, this study provides an impetus for further investigation of clusters and risk factors in these high-risk areas.
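The Moran's I statistic used above measures spatial autocorrelation: values near +1 indicate that similar incidence rates cluster in neighboring areas, and values near 0 indicate spatial randomness. A minimal sketch with an illustrative four-area toy map, not the study's data:

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation.
    values: one incidence rate per area.
    weights: square matrix w[i][j] of spatial weights (zero diagonal)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Four areas in a line with binary contiguity weights; rates illustrative.
vals = [10.0, 12.0, 3.0, 2.0]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(round(morans_i(vals, w), 3))  # → 0.271, positive spatial autocorrelation
```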

  5. Spatial and Temporal Patterns of Locally-Acquired Dengue Transmission in Northern Queensland, Australia, 1993–2012

    PubMed Central

    Naish, Suchithra; Dale, Pat; Mackenzie, John S.; McBride, John; Mengersen, Kerrie; Tong, Shilu

    2014-01-01

    Background Dengue has been a major public health concern in Australia since it re-emerged in Queensland in 1992–1993. We explored spatio-temporal characteristics of locally-acquired dengue cases in northern tropical Queensland, Australia during the period 1993–2012. Methods Locally-acquired notified cases of dengue were collected for northern tropical Queensland from 1993 to 2012. Descriptive spatial and temporal analyses were conducted using geographic information system tools and geostatistical techniques. Results 2,398 locally-acquired dengue cases were recorded in northern tropical Queensland during the study period. The areas affected by the dengue cases exhibited spatial and temporal variation over the study period. Notified cases of dengue occurred more frequently in autumn. Mapping of dengue by statistical local areas (census units) reveals the presence of substantial spatio-temporal variation over time and place. Statistically significant differences were found in dengue incidence rates between males and females, with more cases in females (χ² = 15.17, d.f. = 1, p<0.01). Differences were observed among age groups, but these were not statistically significant. There was a significant positive spatial autocorrelation of dengue incidence for the four sub-periods, with the Moran's I statistic ranging from 0.011 to 0.463 (p<0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across northern Queensland. Conclusions Tropical areas are potential high-risk areas for mosquito-borne diseases such as dengue. This study demonstrated that the locally-acquired dengue cases have exhibited a spatial and temporal variation over the past twenty years in northern tropical Queensland, Australia. Therefore, this study provides an impetus for further investigation of clusters and risk factors in these high-risk areas. PMID:24691549

  6. Evaluation of changes to foot shape in females 5 years after mastectomy: a case-control study.

    PubMed

    Głowacka-Mrotek, Iwona; Sowa, Magdalena; Siedlecki, Zygmunt; Nowikiewicz, Tomasz; Hagner, Wojciech; Zegarski, Wojciech

    2017-06-01

    The aim of this study was to evaluate changes in foot shape of women 5 years after undergoing breast amputation. Evaluation of foot shape was performed using a non-invasive device for computer analysis of the plantar surface of the foot. Obtained results were compared between feet on the healthy breast side (F1) and on the amputated breast side (F2). 128 women aged 63.60 ± 8.83 years, 5-6 years after breast amputation, were enrolled in this case-control study. Weight bearing on the lower extremity on the amputated breast side (F2) compared with the healthy breast side (F1) showed statistically significant differences (p < 0.01). Patients put more weight onto the healthy breast side. No statistically significant difference was found with regard to F1 and F2 foot length (p = 0.4239), or the BETA (p = 0.4470) and GAMMA (p = 0.4566) angles. Highly statistically significant differences were noted with respect to foot width, ALPHA angle, and the Sztriter-Godunov index; higher values were observed on the healthy breast side (p < 0.001). Highly statistically significant differences were also noted when comparing Clark's angles, with higher values observed on the operated breast side (p < 0.001). Differences in foot shape between the healthy breast side and the amputated breast side constitute a long-term negative consequence of mastectomy, and may be caused by unbalanced weight placed on the foot on the healthy breast side compared to the amputated breast side.

  7. Statistical properties of Chinese phonemic networks

    NASA Astrophysics Data System (ADS)

    Yu, Shuiyuan; Liu, Haitao; Xu, Chunshan

    2011-04-01

    The study of properties of speech sound systems is of great significance in understanding the human cognitive mechanism and the working principles of speech sound systems. Some properties of speech sound systems, such as the listener-oriented feature and the talker-oriented feature, have been unveiled with the statistical study of phonemes in human languages and the research of the interrelations between human articulatory gestures and the corresponding acoustic parameters. With all the phonemes of speech sound systems treated as a coherent whole, our research, which focuses on the dynamic properties of speech sound systems in operation, investigates some statistical parameters of Chinese phoneme networks based on real text and dictionaries. The findings are as follows: phonemic networks have high connectivity degrees and short average distances; the degrees obey normal distribution and the weighted degrees obey power law distribution; vowels enjoy higher priority than consonants in the actual operation of speech sound systems; the phonemic networks have high robustness against targeted attacks and random errors. In addition, for investigating the structural properties of a speech sound system, a statistical study of dictionaries is conducted, which shows the higher frequency of shorter words and syllables and the tendency that the longer a word is, the shorter the syllables composing it are. From these structural properties and dynamic properties one can derive the following conclusion: the static structure of a speech sound system tends to promote communication efficiency and save articulation effort while the dynamic operation of this system gives preference to reliable transmission and easy recognition. In short, a speech sound system is an effective, efficient and reliable communication system optimized in many aspects.
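The network measures mentioned above (connectivity degree and average distance) can be computed for any graph with breadth-first search. This toy example uses a few hypothetical phoneme nodes and edges, not the Chinese phoneme data:

```python
from collections import deque

# Toy undirected "phoneme co-occurrence" network; edges illustrative only.
edges = [("a", "b"), ("a", "k"), ("b", "k"), ("k", "s"), ("s", "t")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Degree of each node = number of neighbors.
degree = {node: len(nbrs) for node, nbrs in adj.items()}

def bfs_dists(src):
    """Shortest-path distances from src by breadth-first search."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Average shortest-path distance over all node pairs.
nodes = list(adj)
total = pairs = 0
for i, u in enumerate(nodes):
    d = bfs_dists(u)
    for v in nodes[i + 1:]:
        total += d[v]
        pairs += 1
print(degree["k"], round(total / pairs, 2))  # → 3 1.7
```

High degrees together with short average distances are the "small-world" signature the abstract reports for phonemic networks.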

  8. Selection bias of Internet panel surveys: a comparison with a paper-based survey and national governmental statistics in Japan.

    PubMed

    Tsuboi, Satoshi; Yoshida, Honami; Ae, Ryusuke; Kojo, Takao; Nakamura, Yosikazu; Kitamura, Kunio

    2015-03-01

    To investigate the selection bias of an Internet panel survey organized by a commercial company. A descriptive study was conducted. The authors compared the characteristics of the Internet panel survey with a national paper-based survey and with national governmental statistics in Japan. The participants in the Internet panel survey were composed of more women, were older, and resided in large cities. Regardless of age and sex, the prevalence of highly educated people in the Internet panel survey was higher than in the paper-based survey and the national statistics. In men, the prevalence of heavy drinkers among the 30- to 49-year-old population and of habitual smokers among the 20- to 49-year-old population in the Internet panel survey was lower than what was found in the national statistics. The estimated characteristics of commercial Internet panel surveys were quite different from the national statistical data. In a commercial Internet panel survey, selection bias should not be underestimated. © 2012 APJPH.

  9. Child posture and shoulder belt fit during extended night-time traveling: an in-transit observational study.

    PubMed

    Forman, Jason L; Segui-Gomez, Maria; Ash, Joseph H; Lopez-Valdes, Francisco J

    2011-01-01

    Understanding pediatric occupant postures can help researchers identify injury risk factors and provide information for prospective injury prediction. This study sought to observe lateral head positions and shoulder belt fit among older child automobile occupants during a scenario likely to result in sleeping - extended travel during the night. An observational, volunteer, in-transit study was performed with 30 pediatric rear-seat passengers, ages 7 to 14. Each was restrained by a three-point seatbelt and was driven for seventy-five minutes at night. Ten subjects used a high-back booster seat, ten used a low-back booster seat, and ten used none (based on the subject height and weight). The subjects were recorded with a low-light video camera, and one frame was analyzed per minute of video. The high-back booster group exhibited a statistically significant (p<0.05) decrease in the mean frequency of poor shoulder belt fit compared to the no-booster and low-back booster groups. The high-back booster group also exhibited statistically significant decreases in the 90th percentile of the absolute value of the relative lateral motion of the head. The low-back booster group did not show statistically significant decreases in poor shoulder belt fit or lateral head motion compared to the no-booster group. These results are consistent with the presence of the large lateral supports of the high-back booster, which supported the head while sleeping, reducing voluntary lateral occupant motion and improving shoulder belt fit. Future work includes examining lap belt fit in-transit and examining the effects of these observations on predicted injury risk.

  10. Child Posture and Shoulder Belt Fit During Extended Night-Time Traveling: An In-Transit Observational Study.

    PubMed Central

    Forman, Jason L.; Segui-Gomez, Maria; Ash, Joseph H.; Lopez-Valdes, Francisco J.

    2011-01-01

    Understanding pediatric occupant postures can help researchers identify injury risk factors and provide information for prospective injury prediction. This study sought to observe lateral head positions and shoulder belt fit among older child automobile occupants during a scenario likely to result in sleeping - extended travel during the night. An observational, volunteer, in-transit study was performed with 30 pediatric rear-seat passengers, ages 7 to 14. Each was restrained by a three-point seatbelt and was driven for seventy-five minutes at night. Ten subjects used a high-back booster seat, ten used a low-back booster seat, and ten used none (based on the subject height and weight). The subjects were recorded with a low-light video camera, and one frame was analyzed per minute of video. The high-back booster group exhibited a statistically significant (p<0.05) decrease in the mean frequency of poor shoulder belt fit compared to the no-booster and low-back booster groups. The high-back booster group also exhibited statistically significant decreases in the 90th percentile of the absolute value of the relative lateral motion of the head. The low-back booster group did not show statistically significant decreases in poor shoulder belt fit or lateral head motion compared to the no-booster group. These results are consistent with the presence of the large lateral supports of the high-back booster, which supported the head while sleeping, reducing voluntary lateral occupant motion and improving shoulder belt fit. Future work includes examining lap belt fit in-transit and examining the effects of these observations on predicted injury risk. PMID:22105378

  11. From regular text to artistic writing and artworks: Fourier statistics of images with low and high aesthetic appeal

    PubMed Central

    Melmer, Tamara; Amirshahi, Seyed A.; Koch, Michael; Denzler, Joachim; Redies, Christoph

    2013-01-01

    The spatial characteristics of letters and their influence on readability and letter identification have been intensely studied during the last decades. There have been few studies, however, on statistical image properties that reflect more global aspects of text, for example, properties that may relate to its aesthetic appeal. It has been shown that natural scenes and a large variety of visual artworks possess a scale-invariant Fourier power spectrum that falls off linearly with increasing frequency in log-log plots. We asked whether images of text share this property. As expected, the Fourier spectrum of images of regular typed or handwritten text is highly anisotropic, i.e., the spectral image properties in vertical, horizontal, and oblique orientations differ. Moreover, the spatial frequency spectra of text images are not scale-invariant in any direction. The decline is shallower in the low-frequency part of the spectrum for text than for aesthetic artworks, whereas, in the high-frequency part, it is steeper. These results indicate that, in general, images of regular text contain less global structure (low spatial frequencies) relative to fine detail (high spatial frequencies) than images of aesthetic artworks. Moreover, we studied images of text with artistic claim (ornate print and calligraphy) and ornamental art. For some measures, these images assume average values intermediate between regular text and aesthetic artworks. Finally, to answer the question of whether the statistical properties measured by us are universal amongst humans or are subject to intercultural differences, we compared images from three different cultural backgrounds (Western, East Asian, and Arabic). Results for different categories (regular text, aesthetic writing, ornamental art, and fine art) were similar across cultures. PMID:23554592
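The scale-invariance referred to above means the power spectrum follows P(f) ∝ 1/f^a, which is a straight line of slope -a in log-log coordinates. Fitting that slope by least squares can be sketched as follows, using a synthetic spectrum rather than measured image data:

```python
import math

def loglog_slope(freqs, power):
    """Least-squares slope of log(power) against log(frequency)."""
    xs = [math.log(f) for f in freqs]
    ys = [math.log(p) for p in power]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic radially averaged spectrum with P(f) = f**-2 (illustrative):
freqs = [1, 2, 4, 8, 16, 32]
power = [f ** -2.0 for f in freqs]
print(round(loglog_slope(freqs, power), 2))  # → -2.0
```

For real images, the spectrum would first be obtained from a 2-D FFT and radially averaged; departures of the fitted slope from linearity in different orientations are what distinguish text from artworks in the study.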

  12. Effect of Non-Surgical Periodontal Treatment on Clinical and Biochemical Risk Markers of Cardiovascular Disease: A Randomized Trial.

    PubMed

    Hada, Divya Singh; Garg, Subhash; Ramteke, Girish B; Ratre, Madhu Singh

    2015-11-01

    Various studies have shown periodontal disease is one of the risk factors for coronary heart disease (CHD), and periodontal treatment of patients with CHD has also been correlated with reduction in systemic markers of CHD. The aim of this study is to evaluate the effect of non-surgical periodontal treatment (NSPT) on the cardiovascular clinical and biochemical status of patients with CHD. Seventy known patients with CHD were allocated randomly to either a control group (C; no periodontal therapy) (n = 35) or an experimental group (E; NSPT in the form of scaling and root planing [SRP]) (n = 35). Cardiovascular status was assessed using clinical parameters such as pulse, respiratory rate, blood pressure (BP), and biochemical parameters, such as high-sensitivity C-reactive protein (hsCRP), lipid profile, and white blood cell (WBC) count, at baseline and 1, 3, and 6 months. Intergroup and intragroup comparisons were performed using Student t test, and P <0.05 was considered statistically significant. The complete data at the end of the study were provided by only 55 patients (group C, n = 25; group E, n = 30). Highly statistically significant reduction was observed in systolic BP (7.1 mm Hg) and very-low-density lipoproteins (VLDLs; 5.16 mg/dL) in group E. Changes were also observed in other cardiovascular biochemical and clinical parameters but were not statistically significant. NSPT (in the form of SRP) positively affects limited cardiovascular (clinical and biochemical) status of patients with CHD. Reduction in triglyceride, VLDL, total WBC, lymphocyte, and neutrophil counts and increase in hsCRP, total cholesterol, high-density lipoprotein, and low-density lipoprotein levels were observed. Highly significant reduction in VLDL cholesterol levels and systolic BP was observed among the various parameters measured.

  13. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs . 
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  14. Corpus Approaches to Language Ideology

    ERIC Educational Resources Information Center

    Vessey, Rachelle

    2017-01-01

    This paper outlines how corpus linguistics--and more specifically the corpus-assisted discourse studies approach--can add useful dimensions to studies of language ideology. First, it is argued that the identification of words of high, low, and statistically significant frequency can help in the identification and exploration of language ideologies…

  15. Statewide traffic safety study phase II : identification of major traffic safety problem areas in Louisiana.

    DOT National Transportation Integrated Search

    2012-04-01

    This report summarizes a study that seeks to identify the factors leading to the high crash rate experienced on Louisiana highways. Factors were identified by comparing statistics from the Louisiana Crash Database with those from peer states using th...

  16. A retrospective survey of research design and statistical analyses in selected Chinese medical journals in 1998 and 2008.

    PubMed

    Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia

    2010-05-25

    High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China for the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion of study design also decreased (χ² = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ² = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted. 
Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.

  17. Statistical downscaling of precipitation using long short-term memory recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Misra, Saptarshi; Sarkar, Sudeshna; Mitra, Pabitra

    2017-11-01

Hydrological impacts of global climate change on regional scale are generally assessed by downscaling large-scale climatic variables, simulated by General Circulation Models (GCMs), to regional, small-scale hydrometeorological variables like precipitation, temperature, etc. In this study, we propose a new statistical downscaling model based on a recurrent neural network with long short-term memory (LSTM), which captures the spatio-temporal dependencies in local rainfall. Previous studies have used several other methods such as linear regression, quantile regression, kernel regression, beta regression, and artificial neural networks. Deep neural networks and recurrent neural networks have been shown to be highly promising in modeling complex and highly non-linear relationships between input and output variables in different domains, and hence we investigated their performance in the task of statistical downscaling. We have tested this model on two datasets: one on precipitation in the Mahanadi basin in India and the other on precipitation in the Campbell River basin in Canada. Our autoencoder-coupled LSTM recurrent neural network model performs best among the existing methods on both datasets with respect to temporal cross-correlation, mean squared error, and capturing the extremes.
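The LSTM gating equations at the core of such a downscaling model can be sketched in plain Python (a generic single-cell step with scalar states and illustrative weights, not the authors' autoencoder-coupled architecture):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    # One LSTM time step for scalar input/state (illustrative only).
    # w holds per-gate weights: (input weight, recurrent weight, bias).
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate
    c = f * c_prev + i * g   # new cell state: gated memory update
    h = o * math.tanh(c)     # new hidden state
    return h, c

# Run a short synthetic "rainfall" sequence through the cell.
weights = {k: (0.5, 0.3, 0.1) for k in ("f", "i", "o", "g")}
h, c = 0.0, 0.0
for x in [0.2, 0.8, 0.1, 0.9]:
    h, c = lstm_step(x, h, c, weights)
```

The cell state `c` is what lets the model carry information across time steps, which is the property exploited for temporal dependencies in rainfall series.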

  18. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    PubMed

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
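As an illustration of the pattern-matching idea with a rejection threshold (hypothetical fingerprints, not the study's libraries), a nearest-neighbour source classifier might look like:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(isolate, library, threshold):
    """Assign an isolate to the source of its nearest library fingerprint.

    Isolates whose best match exceeds `threshold` are left unclassified,
    mirroring the threshold criteria examined in the study.
    """
    source, dist = min(
        ((src, euclidean(isolate, fp)) for src, fps in library.items() for fp in fps),
        key=lambda t: t[1],
    )
    return source if dist <= threshold else "unclassified"

# Hypothetical two-source library of 3-band fingerprints.
library = {
    "human": [(0.9, 0.1, 0.4), (0.8, 0.2, 0.5)],
    "gull":  [(0.1, 0.9, 0.3), (0.2, 0.8, 0.2)],
}
```

For example, `classify((0.85, 0.15, 0.45), library, 0.3)` returns `"human"`, while an isolate far from every fingerprint is rejected as `"unclassified"`, trading correct classifications for fewer false positives.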

  19. GPA in Research Studies: An Invaluable but Neglected Opportunity

    ERIC Educational Resources Information Center

    Bacon, Donald R.; Bean, Beth

    2006-01-01

    Grade point average (GPA) often correlates highly with variables of interest to educational researchers and thus offers the potential to greatly increase the statistical power of their research studies. Yet this variable is often underused in marketing education research studies. The reliability and validity of the GPA are closely examined here in…

  20. 78 FR 51597 - Modernizing the E-Rate Program for Schools and Libraries

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-20

    ... small a school they attend or how far they live from experts in their field of study. High-capacity... study by Austan Goolsbee and Jonathan Guryan found that E-rate support substantially increased the... statistically significant effect on student test scores. Have more recent studies suggested otherwise? We also...

  1. Clinical competence of Guatemalan and Mexican physicians for family dysfunction management.

    PubMed

    Cabrera-Pivaral, Carlos Enrique; Orozco-Valerio, María de Jesús; Celis-de la Rosa, Alfredo; Covarrubias-Bermúdez, María de Los Ángeles; Zavala-González, Marco Antonio

    2017-01-01

    To evaluate the clinical competence of Mexican and Guatemalan physicians to management the family dysfunction. Cross comparative study in four care units first in Guadalajara, Mexico, and four in Guatemala, Guatemala, based on a purposeful sampling, involving 117 and 100 physicians, respectively. Clinical competence evaluated by validated instrument integrated for 187 items. Non-parametric descriptive and inferential statistical analysis was performed. The percentage of Mexican physicians with high clinical competence was 13.7%, medium 53%, low 24.8% and defined by random 8.5%. For the Guatemalan physicians'14% was high, average 63%, and 23% defined by random. There were no statistically significant differences between healthcare country units, but between the medium of Mexicans (0.55) and Guatemalans (0.55) (p = 0.02). The proportion of the high clinical competency of Mexican physicians' was as Guatemalans.

  2. Effects of lattice parameters on piezoelectric constants in wurtzite materials: A theoretical study using first-principles and statistical-learning methods

    NASA Astrophysics Data System (ADS)

    Momida, Hiroyoshi; Oguchi, Tamio

    2018-04-01

Longitudinal piezoelectric constant (e33) values of wurtzite materials listed in a structure database are calculated and analyzed by using first-principles and statistical-learning methods. It is theoretically shown that wurtzite materials with high e33 generally have small lattice constant ratios (c/a), almost independently of the constituent elements, and that e33 is approximately expressed as e33 ∝ c/a − (c/a)0, where (c/a)0 is the ideal lattice constant ratio. This relation also holds for highly piezoelectric ternary materials such as ScxAl1-xN. We conducted a search for high-piezoelectric wurtzite materials by identifying materials with smaller c/a values. It is proposed that the piezoelectricity of ZnO can be significantly enhanced by substituting Zn with Ca.
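The reported linear trend e33 ∝ c/a − (c/a)0 can be illustrated with an ordinary least-squares fit; the data, slope, and intercept below are synthetic stand-ins, not values from the paper:

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = k*x + b, solved in closed form.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - k * mx
    return k, b

# Synthetic (c/a, e33) pairs following e33 = k*(c/a - (c/a)0)
# with an assumed k = -20 and ideal ratio (c/a)0 = 1.63.
ratios = [1.58, 1.60, 1.62, 1.63]
e33 = [-20.0 * (r - 1.63) for r in ratios]

k, b = fit_line(ratios, e33)
ca0 = -b / k   # recovered ideal ratio (c/a)0
```

The negative slope expresses the paper's observation that smaller c/a goes with larger e33, and the x-intercept of the fit recovers (c/a)0.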

  3. Analysis of statistical properties of laser speckles, forming in skin and mucous of colon: potential application in laser surgery

    NASA Astrophysics Data System (ADS)

    Rubtsov, Vladimir; Kapralov, Sergey; Chalyk, Iuri; Ulianova, Onega; Ulyanov, Sergey

    2013-02-01

The statistical properties of laser speckles formed in the skin and mucosa of the colon have been analyzed and compared. It has been demonstrated that the first- and second-order statistics of "skin" speckles and "mucosal" speckles are quite different, and that speckles formed in the mucosa are not Gaussian. The layered structure of the colon mucosa causes the formation of speckled speckles (biospeckles). First- and second-order statistics of speckled speckles are reviewed in this paper, and the statistical properties of Fresnel and Fraunhofer doubly scattered and cascade speckles are described. The non-Gaussian statistics of biospeckles may lead to high localization of coherent light intensity in human tissue during laser surgery. A way of suppressing highly localized non-Gaussian speckles is suggested.

  4. Joint Adaptive Mean-Variance Regularization and Variance Stabilization of High Dimensional Data.

    PubMed

    Dazard, Jean-Eudes; Rao, J Sunil

    2012-07-01

The paper addresses a common problem in the analysis of high-dimensional high-throughput "omics" data: parameter estimation across multiple variables in a set of data where the number of variables is much larger than the sample size. Among the problems posed by this type of data are that variable-specific estimators of variances are not reliable and variable-wise test statistics have low power, both due to a lack of degrees of freedom. In addition, the variance has been observed to increase as a function of the mean in this type of data. We introduce a non-parametric adaptive regularization procedure that is innovative in that: (i) it employs a novel "similarity statistic"-based clustering technique to generate local-pooled or regularized shrinkage estimators of population parameters; (ii) the regularization is done jointly on population moments, benefiting from C. Stein's result on inadmissibility, which implies that the usual sample variance estimator is improved by a shrinkage estimator using information contained in the sample mean. From these joint regularized shrinkage estimators, we derive regularized t-like statistics and show in simulation studies that they offer more statistical power in hypothesis testing than their standard sample counterparts, than regular common-value shrinkage estimators, and than statistics that simply ignore the information contained in the sample mean. Finally, we show that these estimators feature interesting variance stabilization and normalization properties that can be used for preprocessing high-dimensional multivariate data. The method is available as an R package called 'MVR' ('Mean-Variance Regularization'), downloadable from the CRAN website.
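The intuition behind pulling unreliable variable-specific variance estimates toward a pooled value can be shown in a few lines. This is a generic common-value shrinkage sketch (one of the baselines the paper compares against), not the MVR package's clustering-based joint procedure:

```python
def shrink_variances(variances, weight):
    """Pull each variable-specific variance toward the pooled mean.

    `weight` in [0, 1] controls the amount of shrinkage: 0 keeps the raw
    estimates, 1 replaces them all with the pooled value.
    """
    pooled = sum(variances) / len(variances)
    return [(1 - weight) * v + weight * pooled for v in variances]

# Noisy per-gene variance estimates (made-up numbers), shrunk halfway.
raw = [0.2, 5.0, 0.9, 3.1]
shrunk = shrink_variances(raw, 0.5)
```

Shrinkage compresses the spread of the estimates, which is why test statistics built on them gain power when per-variable degrees of freedom are scarce.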

  5. Morphometric evaluation of the Afşin-Elbistan lignite basin using kernel density estimation and Getis-Ord's statistics of DEM derived indices, SE Turkey

    NASA Astrophysics Data System (ADS)

    Sarp, Gulcan; Duzgun, Sebnem

    2015-11-01

Morphometric analysis of river networks, basins and relief using geomorphic indices, together with geostatistical analyses of a Digital Elevation Model (DEM), is a useful tool for discussing the morphometric evolution of a basin area. In this study, three different indices, the valley floor width to height ratio (Vf), the stream gradient (SL), and stream sinuosity, were applied to the Afşin-Elbistan lignite basin to test for imprints of tectonic activity. Perturbations of these indices are usually indicative of differences in the resistance of outcropping lithological units to erosion and of active faulting. To map clusters of high and low index values, kernel density estimation (K) and the Getis-Ord Gi∗ statistic were applied to the DEM-derived indices. By highlighting hot spots and cold spots in the SL index, stream sinuosity, and Vf index values, the K method and the Gi∗ statistic helped to identify the relative tectonic activity of the basin area. The results indicated that estimation by K (with percent volume contours 50 and 95 categorized as high and low, respectively) and by Gi∗ under three conceptualizations of spatial relationships (CSR) yielded similar results in regions of high and of low tectonic activity. According to the K and Getis-Ord Gi∗ statistics, the northern, northwestern and southern parts of the basin indicate high tectonic activity, whereas the low-elevation plain in the central part of the basin shows relatively low tectonic activity.
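The Getis-Ord Gi∗ statistic compares the weighted sum of values around a location with the global mean, returning a z-score that flags hot spots (positive) and cold spots (negative). A minimal sketch for a 1-D transect with binary neighbourhood weights (not the study's DEM grid or its CSR variants):

```python
import math

def getis_ord_gi_star(values, i, radius=1):
    """Getis-Ord Gi* z-score for position i on a 1-D transect.

    Binary weights: w_ij = 1 for cells within `radius` of i (including i).
    """
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar ** 2)
    w = [1.0 if abs(j - i) <= radius else 0.0 for j in range(n)]
    sw = sum(w)  # sum of weights
    num = sum(wj * xj for wj, xj in zip(w, values)) - xbar * sw
    den = s * math.sqrt((n * sum(wj ** 2 for wj in w) - sw ** 2) / (n - 1))
    return num / den

# A transect with one cluster of high index values in the middle.
x = [1, 1, 1, 9, 9, 9, 1, 1, 1]
```

Here `getis_ord_gi_star(x, 4)` lands well above the usual z ≈ 1.96 hot-spot cutoff, while positions on the low flanks score negative.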

  6. Multinomial logistic regression analysis for differentiating 3 treatment outcome trajectory groups for headache-associated disability.

    PubMed

    Lewis, Kristin Nicole; Heckman, Bernadette Davantes; Himawan, Lina

    2011-08-01

    Growth mixture modeling (GMM) identified latent groups based on treatment outcome trajectories of headache disability measures in patients in headache subspecialty treatment clinics. Using a longitudinal design, 219 patients in headache subspecialty clinics in 4 large cities throughout Ohio provided data on their headache disability at pretreatment and 3 follow-up assessments. GMM identified 3 treatment outcome trajectory groups: (1) patients who initiated treatment with elevated disability levels and who reported statistically significant reductions in headache disability (high-disability improvers; 11%); (2) patients who initiated treatment with elevated disability but who reported no reductions in disability (high-disability nonimprovers; 34%); and (3) patients who initiated treatment with moderate disability and who reported statistically significant reductions in headache disability (moderate-disability improvers; 55%). Based on the final multinomial logistic regression model, a dichotomized treatment appointment attendance variable was a statistically significant predictor for differentiating high-disability improvers from high-disability nonimprovers. Three-fourths of patients who initiated treatment with elevated disability levels did not report reductions in disability after 5 months of treatment with new preventive pharmacotherapies. Preventive headache agents may be most efficacious for patients with moderate levels of disability and for patients with high disability levels who attend all treatment appointments. Copyright © 2011 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  7. Caustics and Rogue Waves in an Optical Sea.

    PubMed

    Mathis, Amaury; Froehly, Luc; Toenger, Shanti; Dias, Frédéric; Genty, Goëry; Dudley, John M

    2015-08-06

    There are many examples in physics of systems showing rogue wave behaviour, the generation of high amplitude events at low probability. Although initially studied in oceanography, rogue waves have now been seen in many other domains, with particular recent interest in optics. Although most studies in optics have focussed on how nonlinearity can drive rogue wave emergence, purely linear effects have also been shown to induce extreme wave amplitudes. In this paper, we report a detailed experimental study of linear rogue waves in an optical system, using a spatial light modulator to impose random phase structure on a coherent optical field. After free space propagation, different random intensity patterns are generated, including partially-developed speckle, a broadband caustic network, and an intermediate pattern with characteristics of both speckle and caustic structures. Intensity peaks satisfying statistical criteria for rogue waves are seen especially in the case of the caustic network, and are associated with broader spatial spectra. In addition, the electric field statistics of the intermediate pattern shows properties of an "optical sea" with near-Gaussian statistics in elevation amplitude, and trough-to-crest statistics that are near-Rayleigh distributed but with an extended tail where a number of rogue wave events are observed.
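A common statistical criterion for a rogue event, used in oceanography and carried over to optics, is an amplitude exceeding twice the significant wave height (the mean of the highest third of events). A minimal check against a list of arbitrary made-up intensity peaks, not the paper's data:

```python
def rogue_events(heights):
    """Return events exceeding twice the significant wave height,
    i.e. twice the mean of the highest third of all events."""
    ordered = sorted(heights, reverse=True)
    top_third = ordered[: max(1, len(heights) // 3)]
    h_significant = sum(top_third) / len(top_third)
    return [h for h in heights if h > 2.0 * h_significant]

# Hypothetical peak intensities with one extreme outlier.
peaks = [1.0, 1.2, 0.8, 1.1, 0.9, 5.1, 1.0, 1.3, 0.7]
```

Only the extreme outlier survives the criterion; a uniform series yields no rogue events at all, since every value then equals half the threshold.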

  8. Caustics and Rogue Waves in an Optical Sea

    PubMed Central

    Mathis, Amaury; Froehly, Luc; Toenger, Shanti; Dias, Frédéric; Genty, Goëry; Dudley, John M.

    2015-01-01

    There are many examples in physics of systems showing rogue wave behaviour, the generation of high amplitude events at low probability. Although initially studied in oceanography, rogue waves have now been seen in many other domains, with particular recent interest in optics. Although most studies in optics have focussed on how nonlinearity can drive rogue wave emergence, purely linear effects have also been shown to induce extreme wave amplitudes. In this paper, we report a detailed experimental study of linear rogue waves in an optical system, using a spatial light modulator to impose random phase structure on a coherent optical field. After free space propagation, different random intensity patterns are generated, including partially-developed speckle, a broadband caustic network, and an intermediate pattern with characteristics of both speckle and caustic structures. Intensity peaks satisfying statistical criteria for rogue waves are seen especially in the case of the caustic network, and are associated with broader spatial spectra. In addition, the electric field statistics of the intermediate pattern shows properties of an “optical sea” with near-Gaussian statistics in elevation amplitude, and trough-to-crest statistics that are near-Rayleigh distributed but with an extended tail where a number of rogue wave events are observed. PMID:26245864

  9. Statistical properties of radiation from VUV and X-ray free electron laser

    NASA Astrophysics Data System (ADS)

    Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    1998-03-01

The paper presents a comprehensive analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free electron laser operating in the linear and nonlinear regimes. The investigation has been performed in a one-dimensional approximation, assuming the electron pulse length to be much larger than the coherence length of the radiation. The following statistical properties of the SASE FEL radiation have been studied in detail: time and spectral field correlations, the distribution of fluctuations of the instantaneous radiation power, the distribution of the energy in the electron bunch, the distribution of the radiation energy after a monochromator installed at the FEL amplifier exit, and the radiation spectrum. The linear high-gain limit is studied analytically. It is shown that the radiation from a SASE FEL operating in the linear regime possesses all the features of completely chaotic polarized radiation. A detailed study of the statistical properties of the radiation from a SASE FEL operating in the linear and nonlinear regimes has been performed by means of time-dependent simulation codes. All numerical results presented in the paper have been calculated for the 70 nm SASE FEL at the TESLA Test Facility, under construction at DESY.
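The hallmark of completely chaotic polarized radiation is negative-exponential intensity statistics with unit contrast (standard deviation equal to the mean). This can be checked with a toy random-phasor simulation in plain Python, not the paper's time-dependent FEL codes; the mode and sample counts are arbitrary:

```python
import math
import random

def chaotic_intensities(n_modes=50, n_samples=10000, seed=7):
    """Sum many unit phasors with random phases (a toy model of radiation
    built up from noise) and return normalized instantaneous intensities."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_samples):
        re = im = 0.0
        for _ in range(n_modes):
            phi = rng.uniform(0.0, 2.0 * math.pi)
            re += math.cos(phi)
            im += math.sin(phi)
        out.append((re * re + im * im) / n_modes)  # intensity, normalized to mean 1
    return out

intensities = chaotic_intensities()
mean = sum(intensities) / len(intensities)
var = sum((x - mean) ** 2 for x in intensities) / len(intensities)
contrast = math.sqrt(var) / mean  # approaches 1 for completely chaotic light
```

A contrast near unity is exactly the signature the paper attributes to the linear SASE regime.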

  10. Needs of the Learning Effect on Instructional Website for Vocational High School Students

    ERIC Educational Resources Information Center

    Lo, Hung-Jen; Fu, Gwo-Liang; Chuang, Kuei-Chih

    2013-01-01

    The purpose of study was to understand the correlation between the needs of the learning effect on instructional website for the vocational high school students. Our research applied the statistic methods of product-moment correlation, stepwise regression, and structural equation method to analyze the questionnaire with the sample size of 377…

  11. Modifying Open-Campus Lunch Policy to Reduce Discipline Violations: An Action Research Study

    ERIC Educational Resources Information Center

    Wilkes, James S., III

    2016-01-01

    An intervention was implemented to address the high number of discipline violations due to an unconditional open-campus lunch policy at a senior high school. The intent of the intervention was to statistically measure discipline violations among voluntary participants and to determine whether or not a significant change occurred. The research…

  12. 1975 Status Report and Resource Guide on Aviation and Space Related High School Courses.

    ERIC Educational Resources Information Center

    General Aviation Manufacturers Association, Washington, DC.

    This study contains a statistical consolidation of information reflecting many of the trends and patterns becoming evident in high school aviation courses conducted across the country. For purposes of this report the term aviation relates to all courses including both aviation and space. The information reported is considered to be of value for…

  13. Accountability and Pennsylvania High Schools: Using a Value-Added Model to Identify, Quantify, and Track School Improvement

    ERIC Educational Resources Information Center

    Davies, Todd Matthew

    2012-01-01

    This dissertation investigates the prevailing No Child Left Behind (NCLB) mandate as an effective platform to improve schools. The data compiled for use in this study represented 426 high schools in Pennsylvania and were retrieved from publicly accessible, state-sponsored sources. The statistical methodologies from the Pennsylvania Value-Added…

  14. Discipline and Order in American High Schools. Contractor Report.

    ERIC Educational Resources Information Center

    DiPrete, Thomas A.; And Others

    Discipline and misbehavior in American high schools are the focus of this analysis of data from the first wave (1980) of a longitudinal study of over 30,000 sophomores and over 28,000 seniors. A summary of the findings shows that differences between urban and other schools are usually statistically insignificant when other school and student…

  15. Development of polytoxicomania in function of defence from psychoticism.

    PubMed

    Nenadović, Milutin M; Sapić, Rosa

    2011-01-01

The proportion of polytoxicomania in youth subpopulations has been growing steadily in recent decades, and this trend is pan-continental. Psychoticism is a psychological construct comprising basic dimensions of personality disintegration and of cognitive function. Psychoticism may, in general, underlie pathological functioning in youth and influence the patterns of thought, feeling and action that cause dysfunction. The aim of this study was to determine the distribution of the basic dimensions of psychoticism in young people's commitment to psychoactive substance (PAS) abuse as a means of reducing disturbing intrapsychic experiences or the manifestation of psychotic symptoms. For the purpose of this study, two groups of respondents were formed, balanced by age, gender and family structure of origin (at least one parent alive). The study applied the DELTA-9 instrument for the assessment of cognitive disintegration, in order to establish psychoticism and operationalize it. The results were analyzed statistically: from the parameters of descriptive statistics, the arithmetic mean was calculated along with measures of dispersion, and a cross-tabular analysis of the tested variables was performed, with statistical significance assessed by Pearson's χ²-test and analysis of variance. Age structure and gender were comparably represented in the polytoxicomaniac and control groups; testing did not confirm a statistically significant difference (p > 0.5). The polytoxicomaniac group differed significantly from the control group on most psychoticism variables; testing confirmed a high statistical significance of these differences (p < 0.001 to p < 0.01). A statistically significantly greater representation of the psychoticism dimension in the polytoxicomaniac group was established. The presence of factors related to common executive dysfunction was also emphasized.

  16. Does transport time help explain the high trauma mortality rates in rural areas? New and traditional predictors assessed by new and traditional statistical methods

    PubMed Central

    Røislien, Jo; Lossius, Hans Morten; Kristiansen, Thomas

    2015-01-01

Background: Trauma is a leading global cause of death. Trauma mortality rates are higher in rural areas, constituting a challenge for quality and equality in trauma care. The aim of the study was to explore population density and transport time to hospital care as possible predictors of geographical differences in mortality rates, and to what extent choice of statistical method might affect the analytical results and accompanying clinical conclusions. Methods: Using data from the Norwegian Cause of Death registry, deaths from external causes 1998–2007 were analysed. Norway consists of 434 municipalities, and municipality population density and travel time to hospital care were entered as predictors of municipality mortality rates in univariate and multiple regression models of increasing model complexity. We fitted linear regression models with continuous and categorised predictors, as well as piecewise linear and generalised additive models (GAMs). Models were compared using Akaike's information criterion (AIC). Results: Population density was an independent predictor of trauma mortality rates, while the contribution of transport time to hospital care was highly dependent on choice of statistical model. A multiple GAM or piecewise linear model was superior, and similar, in terms of AIC. However, while transport time was statistically significant in multiple models with piecewise linear or categorised predictors, it was not in GAM or standard linear regression. Conclusions: Population density is an independent predictor of trauma mortality rates. The added explanatory value of transport time to hospital care is marginal and model-dependent, highlighting the importance of exploring several statistical models when studying complex associations in observational data. PMID:25972600
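Comparing models of increasing complexity by AIC, as done in the study, can be sketched for a simple case. The data below are synthetic and the Gaussian least-squares AIC is generic; none of this reproduces the registry analysis:

```python
import math

def aic_least_squares(ys, fitted, k):
    # AIC for a Gaussian least-squares fit: n*ln(RSS/n) + 2k,
    # where k counts estimated parameters (including the error variance).
    n = len(ys)
    rss = sum((y - f) ** 2 for y, f in zip(ys, fitted))
    return n * math.log(rss / n) + 2 * k

# Toy "mortality vs transport time" data: flat up to x=4, then rising,
# with a small deterministic perturbation standing in for noise.
xs = list(range(10))
ys = [1.0 + 0.05 * (-1) ** x if x <= 4
      else 1.0 + 0.5 * (x - 4) + 0.05 * (-1) ** x
      for x in xs]

flat = [sum(ys) / len(ys)] * len(ys)                              # intercept-only model
piecewise = [1.0 if x <= 4 else 1.0 + 0.5 * (x - 4) for x in xs]  # breakpoint model

aic_flat = aic_least_squares(ys, flat, 2)
aic_piecewise = aic_least_squares(ys, piecewise, 4)
```

The piecewise model earns a much lower AIC despite its extra parameters, mirroring the study's finding that piecewise linear and GAM fits captured this kind of nonlinear association better than simpler models.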

  17. The Development of Statistical Models for Predicting Surgical Site Infections in Japan: Toward a Statistical Model-Based Standardized Infection Ratio.

    PubMed

    Fukuda, Haruhisa; Kuroki, Manabu

    2016-03-01

    To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
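The C-index used here to compare predictive performance is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case. A direct pairwise computation, with hypothetical predicted risks rather than surveillance data:

```python
def c_index(pred_cases, pred_controls):
    """Concordance: fraction of (case, control) pairs in which the case's
    predicted risk is higher; ties count as half concordant."""
    concordant = 0.0
    for pc in pred_cases:
        for pn in pred_controls:
            if pc > pn:
                concordant += 1.0
            elif pc == pn:
                concordant += 0.5
    return concordant / (len(pred_cases) * len(pred_controls))

# Hypothetical predicted SSI risks for infected vs uninfected patients.
cases = [0.30, 0.22, 0.15]
controls = [0.10, 0.05, 0.22, 0.02]
cidx = c_index(cases, controls)
```

A value of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, which is the scale on which the new models beat the conventional risk index models.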

  18. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

The research was prompted by the need for a study assessing the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science programs (e.g., software engineering, computer science, and information technology), techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process-performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study provides a detailed gap analysis of the process improvement and quantitative analysis techniques taught in U.S. systems engineering and computing science degree programs, of gaps that exist in the literature, and of the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
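A minimal Monte Carlo sketch of the baseline-and-predict idea follows; the driver names, weights, and distributions are made up for illustration and are not the dissertation's ACSI model:

```python
import random

def simulate_satisfaction(n_trials=10000, seed=42):
    """Propagate uncertainty in satisfaction drivers through a simple
    weighted-sum score and return the mean and 5th/95th percentiles."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_trials):
        quality = rng.gauss(80, 5)       # perceived quality, 0-100 scale
        expectations = rng.gauss(75, 5)  # customer expectations
        value = rng.gauss(70, 5)         # perceived value
        scores.append(0.5 * quality + 0.2 * expectations + 0.3 * value)
    scores.sort()
    return (sum(scores) / n_trials,
            scores[int(0.05 * n_trials)],
            scores[int(0.95 * n_trials)])

mean, lo, hi = simulate_satisfaction()
```

The percentile band around the simulated mean is what a dashboard would report as the predicted range of future index scores, and varying one driver's distribution at a time gives a simple sensitivity analysis.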

  19. Assessing Clinical Faculty Understanding of Statistical Terms Used to Measure Treatment Effects and Their Application to Teaching.

    PubMed

    Hazelton, Lara; Allen, Michael; MacLeod, Tanya; LeBlanc, Constance; Boudreau, Michelle

    2016-01-01

Understanding of statistical terms used to measure treatment effect is important for evidence-informed medical teaching and practice. We explored knowledge of these terms among clinical faculty who instruct and mentor a continuum of medical learners, in order to inform medical faculty learning needs. This was a mixed-methods study that used a questionnaire to measure health professionals' understanding of measures of treatment effect and a focus group to explore perspectives on learning, applying, and teaching these terms. We analyzed questionnaire data using descriptive statistics and focus group data using thematic analysis. We analyzed responses from clinical faculty who were physicians and completed all sections of the questionnaire (n = 137). Overall, approximately 55% were highly confident in their understanding of statistical terms; self-reported understanding was highest for number needed to treat (77%). Only 26% of respondents correctly answered all comprehension questions; however, 80% correctly answered at least one of them. There was a significant association between self-reported understanding and the ability to correctly calculate the terms. A focus group with clinical/medical faculty (n = 4) revealed themes of mentorship, support and resources, and beliefs about the value of statistical literacy. We found that half of clinical faculty members are highly confident in their understanding of relative and absolute terms. Despite the limitations of self-assessment data, our study provides some evidence that self-assessment can be reliable. Recognizing that faculty development is not mandatory for clinical faculty in many centers, and that faculty may benefit from mentorship in critical appraisal topics, it may be appropriate to first engage and support influential clinical faculty rather than pursuing a broad strategy of universal statistical literacy. 
Second, senior leadership in medical education should support continuous learning by providing paid, protected time for faculty to incorporate evidence in their teaching.
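    As an illustration of the measures of treatment effect discussed above, a minimal sketch (the event rates are invented, not the study's data):

```python
def treatment_effect_measures(control_event_rate, treated_event_rate):
    arr = control_event_rate - treated_event_rate   # absolute risk reduction
    rrr = arr / control_event_rate                  # relative risk reduction
    nnt = 1.0 / arr                                 # number needed to treat
    return arr, rrr, nnt

arr, rrr, nnt = treatment_effect_measures(0.20, 0.15)
print(arr, rrr, nnt)  # approximately 0.05, 0.25, and 20
```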

  20. Statistical approach for selection of biologically informative genes.

    PubMed

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Many of the gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been judged by post-selection classification accuracy, computed with a classifier that uses the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, namely Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g., subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes that are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that, under the multiple criteria decision-making setup, the proposed technique outperforms the available alternatives for informative gene selection. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to selecting statistical techniques for identifying informative genes in high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
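    The relevance-versus-redundancy trade-off underlying MRMR-style selection can be sketched as follows; this uses absolute Pearson correlation as a stand-in for the mutual-information measures of typical MRMR implementations (it is not the authors' Boot-MRMR code), and the data are simulated:

```python
import numpy as np

def mrmr_select(X, y, k):
    # greedy pick maximizing |corr with y| minus mean |corr with selected|
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(X.shape[1])])
    selected, candidates = [], set(range(X.shape[1]))
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in candidates:
            redundancy = (np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                   for s in selected]) if selected else 0.0)
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
        candidates.discard(best)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 30))                # 60 samples, 30 "genes"
y = X[:, 0] + X[:, 1] + rng.normal(size=60)  # phenotype driven by genes 0, 1
print(mrmr_select(X, y, 3))  # genes 0 and 1 should be among the picks
```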

  1. Power of tests for comparing trend curves with application to national immunization survey (NIS).

    PubMed

    Zhao, Zhen

    2011-02-28

    Three statistical tests were proposed for comparing trend curves of study outcomes between two socio-demographic strata across consecutive time points, and their statistical power was compared for different trend-curve data. For large sample sizes, with independence and normality assumed among strata and across consecutive time points, Z and chi-square test statistics were developed; these are functions of the outcome estimates and standard errors at each study time point for the two strata. For small sample sizes under the same assumptions, an F-test statistic was derived as a function of the sample sizes of the two strata and the parameters estimated across the study period. If the two trend curves are approximately parallel, the power of the Z-test is consistently higher than that of both the chi-square and F-tests. If the two trend curves cross with low interaction, the power of the Z-test is higher than or equal to that of the chi-square and F-tests; at high interaction, however, the chi-square and F-tests are more powerful than the Z-test. A measure of the interaction of two trend curves was defined. These tests were applied to comparing trend curves of vaccination coverage estimates for standard vaccine series using National Immunization Survey (NIS) 2000-2007 data. Copyright © 2011 John Wiley & Sons, Ltd.
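    One plausible construction of a Z statistic of this kind, built from the stratum estimates and standard errors at each time point, can be sketched as follows; this is not necessarily the paper's exact formulation, and the coverage values are invented:

```python
import math

def trend_z(est1, se1, est2, se2):
    # pooled difference across time points over its pooled standard error
    diff = sum(a - b for a, b in zip(est1, est2))
    var = sum(s ** 2 + t ** 2 for s, t in zip(se1, se2))
    return diff / math.sqrt(var)

# invented coverage estimates (%) over four survey years, two strata
est1, se1 = [72.0, 74.5, 76.0, 78.0], [1.2, 1.1, 1.0, 1.0]
est2, se2 = [70.0, 71.0, 72.5, 73.0], [1.3, 1.2, 1.1, 1.1]
z = trend_z(est1, se1, est2, se2)
print(round(z, 2))  # |z| > 1.96 suggests the two curves differ
```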

  2. Fabrication of Hyperbranched Block-Statistical Copolymer-Based Prodrug with Dual Sensitivities for Controlled Release.

    PubMed

    Zheng, Luping; Wang, Yunfei; Zhang, Xianshuo; Ma, Liwei; Wang, Baoyan; Ji, Xiangling; Wei, Hua

    2018-01-17

    Dendrimers, with their hyperbranched structure and multivalent surface, are regarded as among the most promising candidates for ideal drug delivery systems, but their clinical translation and scale-up production have been hampered significantly by synthetic difficulties. There is therefore considerable scope for the development of novel hyperbranched polymers that not only address the drawbacks of dendrimers but also maintain their advantages. The reversible addition-fragmentation chain transfer self-condensing vinyl polymerization (RAFT-SCVP) technique has enabled facile preparation of segmented hyperbranched polymers (SHPs) using chain transfer monomer (CTM)-based double-head agents during the past decade. Meanwhile, our recent studies have shown the design of block-statistical copolymers to be a simple yet effective way to resolve the dilemma between extracellular stability and high intracellular delivery efficacy. To integrate the advantages of both hyperbranched and block-statistical structures, we herein report the fabrication of a hyperbranched block-statistical copolymer-based prodrug with pH and reduction dual sensitivities using RAFT-SCVP and post-polymerization click coupling. The outer homo-oligo(ethylene glycol methyl ether methacrylate) (OEGMA) block provides sufficient extracellular colloidal stability for the nanocarriers by steric hindrance, while the interior OEGMA units, incorporated by statistical copolymerization, promote intracellular drug release by facilitating the permeation of GSH and H+ for cleavage of the reduction-responsive disulfide bond and the pH-labile carbonate link, as well as by weakening the hydrophobic encapsulation of drug molecules. The delivery efficacy of the target hyperbranched block-statistical copolymer-based prodrug was evaluated by in vitro drug release and cytotoxicity studies, which confirmed both acidic pH- and reduction-triggered drug release inhibiting the proliferation of HeLa cells. Interestingly, the simultaneous application of the acidic pH and GSH triggers promoted the cleavage and release of CPT significantly compared with either single trigger. This study thus develops a facile approach toward hyperbranched polymer-based prodrugs with high therapeutic efficacy for anticancer drug delivery.

  3. Using a higher criticism statistic to detect modest effects in a genome-wide study of rheumatoid arthritis

    PubMed Central

    2009-01-01

    In high-dimensional studies such as genome-wide association studies, the correction for multiple testing applied to control total type I error results in decreased power to detect modest effects. We present a new analytical approach based on the higher criticism statistic that allows identification of the presence of modest effects. We apply our method to the genome-wide study of rheumatoid arthritis provided in the Genetic Analysis Workshop 16 Problem 1 data set. There is evidence for unknown bias in this study that could be explained by the presence of undetected modest effects. We compared the asymptotic and empirical thresholds for the higher criticism statistic. Using the asymptotic threshold, we detected the presence of modest effects genome-wide. We also detected modest effects using the 90th percentile of the empirical null distribution as a threshold; however, there was no such evidence when the 95th and 99th percentiles were used. While the higher criticism method suggests that there is some evidence for modest effects, interpreting individual single-nucleotide polymorphisms with significant higher criticism statistics is of undetermined value. The goal of higher criticism is to alert the researcher that genetic effects remain to be discovered and to promote the use of more targeted and powerful studies to detect the remaining effects. PMID:20018032
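    A minimal sketch of the higher criticism statistic in the Donoho-Jin form, computed over a vector of p-values; the workshop analysis details may differ, and the p-values here are simulated:

```python
import math, random

def higher_criticism(pvals, alpha0=0.1):
    # Donoho-Jin HC: max standardized exceedance over the smallest p-values
    p = sorted(pvals)
    n = len(p)
    hc = float("-inf")
    for i, pi in enumerate(p[: max(1, int(alpha0 * n))], start=1):
        denom = math.sqrt(pi * (1.0 - pi))
        if denom > 0.0:
            hc = max(hc, math.sqrt(n) * (i / n - pi) / denom)
    return hc

random.seed(1)
null_p = [random.random() for _ in range(1000)]            # pure noise
spiked = null_p[:]
spiked[:20] = [random.random() * 1e-4 for _ in range(20)]  # a few modest signals
print(higher_criticism(null_p), higher_criticism(spiked))
```

Under the null the statistic stays small; a handful of modest signals pushes it far above the threshold.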

  4. Potential Mediators in Parenting and Family Intervention: Quality of Mediation Analyses

    PubMed Central

    Patel, Chandni C.; Fairchild, Amanda J.; Prinz, Ronald J.

    2017-01-01

    Parenting and family interventions have repeatedly shown effectiveness in preventing and treating a range of youth outcomes. Accordingly, investigators in this area have conducted a number of studies using statistical mediation to examine some of the potential mechanisms of action by which these interventions work. This review examined, from a methodological perspective, in what ways and how well family-based intervention studies tested statistical mediation. A systematic search identified 73 published outcome studies that tested mediation for family-based interventions across a wide range of child and adolescent outcomes (i.e., externalizing, internalizing, and substance-abuse problems; high-risk sexual activity; and academic achievement), with putative mediators pertaining to positive and negative parenting, family functioning, youth beliefs and coping skills, and peer relationships. Taken as a whole, the studies used designs that adequately addressed temporal precedence. The majority of studies used the product-of-coefficients approach to mediation, which is preferred and less limiting than the causal-steps approach. Statistical significance testing did not always make use of the most recently developed approaches, which would better accommodate small sample sizes and more complex functions. Specific recommendations are offered for future mediation studies in this area with respect to fully longitudinal design, mediation approach, significance testing method, documentation and reporting of statistics, testing of multiple mediators, and control of Type I error. PMID:28028654
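    The product-of-coefficients approach with a percentile-bootstrap significance test can be sketched as follows; the data are simulated, and the b path is estimated with a single-predictor slope rather than the usual two-predictor regression, as a simplification:

```python
import random

def slope(x, y):
    # ordinary least-squares slope of y on x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

random.seed(0)
n = 200
x = [random.gauss(0, 1) for _ in range(n)]       # intervention exposure
m = [0.5 * xi + random.gauss(0, 1) for xi in x]  # mediator: a path = 0.5
y = [0.4 * mi + random.gauss(0, 1) for mi in m]  # outcome:  b path = 0.4

ab_boot = []
for _ in range(500):
    s = [random.randrange(n) for _ in range(n)]  # resample with replacement
    xs, ms, ys = [x[i] for i in s], [m[i] for i in s], [y[i] for i in s]
    ab_boot.append(slope(xs, ms) * slope(ms, ys))  # product of coefficients
ab_boot.sort()
ci = (ab_boot[12], ab_boot[487])                 # ~95% percentile interval
print(ci)  # an interval excluding zero supports mediation
```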

  5. Earth system feedback statistically extracted from the Indian Ocean deep-sea sediments recording Eocene hyperthermals.

    PubMed

    Yasukawa, Kazutaka; Nakamura, Kentaro; Fujinaga, Koichiro; Ikehara, Minoru; Kato, Yasuhiro

    2017-09-12

    Multiple transient global warming events occurred during the early Palaeogene. Although these events, called hyperthermals, have been reported from around the globe, geologic records for the Indian Ocean are limited. In addition, the recovery processes from relatively modest hyperthermals are less constrained than those from the most severe and best-studied hothouse, the Palaeocene-Eocene Thermal Maximum. In this study, we constructed a new, high-resolution geochemical dataset of deep-sea sediments clearly recording multiple Eocene hyperthermals in the Indian Ocean. We then statistically analysed the high-dimensional data matrix and extracted independent components corresponding to the biogeochemical responses to the hyperthermals. The productivity feedback commonly controls and efficiently sequesters the excess carbon in the recovery phases of the hyperthermals via an enhanced biological pump, regardless of the magnitude of the events. Meanwhile, this negative feedback is independent of the nannoplankton assemblage changes generally recognised in relatively large environmental perturbations.

  6. Statistical gamma-ray decay studies at iThemba LABS

    NASA Astrophysics Data System (ADS)

    Wiedeking, M.; Bernstein, L. A.; Bleuel, D. L.; Brits, C. P.; Sowazi, K.; Görgen, A.; Goldblum, B. L.; Guttormsen, M.; Kheswa, B. V.; Larsen, A. C.; Majola, S. N. T.; Malatji, K. L.; Negi, D.; Nogwanya, T.; Siem, S.; Zikhali, B. R.

    2017-09-01

    A program to study the γ-ray decay from the region of high level density has been established at iThemba LABS, where a high-resolution gamma-ray detector array is used in conjunction with silicon particle telescopes. Results from two recent projects are presented: 1) The 74Ge(α,α'γ) reaction was used to investigate the Pygmy Dipole Resonance. The results were compared to (γ,γ') data and indicate that the dipole states split into mixed-isospin and relatively pure isovector excitations. 2) Data from the 95Mo(d,p) reaction were used to develop a novel method for determining the spins of low-lying discrete levels using statistical γ-ray decay in the vicinity of the neutron separation energy. These results provide insight into the competition between (γ,n) and (γ,γ') reactions and highlight the need to correct for angular-momentum barrier effects.

  7. Randomized Clinical Trial Comparing Low Density versus High Density Meshes in Patients with Bilateral Inguinal Hernia.

    PubMed

    Carro, Jose Luis Porrero; Riu, Sol Villar; Lojo, Beatriz Ramos; Latorre, Lucia; Garcia, Maria Teresa Alonso; Pardo, Benito Alcaide; Naranjo, Oscar Bonachia; Herrero, Alberto Marcos; Cabezudo, Carlos Sanchez; Herreras, Esther Quiros

    2017-12-01

    We present a randomized clinical trial comparing postoperative pain, complications, feeling of a foreign body, and recurrence between heavyweight and lightweight meshes in patients with bilateral groin hernia. Sixty-seven patients with bilateral hernia were included in our study. In each patient, the side receiving the lightweight mesh was decided using a random number table. Pain was measured on a visual analogue scale on the 1st, 3rd, 5th, and 7th postoperative days and one year after surgery. There were no statistically significant differences between the two meshes in postoperative complications. Regarding mean pain scores, statistically significant differences were found only on the 1st postoperative day (P < 0.01) and the 7th postoperative day (P < 0.05). At the one-year review, there were no statistically significant differences in any parameter. In our study, we did not find statistically significant differences between lightweight and heavyweight meshes in postoperative pain, complications, feeling of a foreign body, or recurrence, except for pain on the 1st and 7th postoperative days.

  8. The influence of depression and anxiety in the development of heart failure after coronary angioplasty.

    PubMed

    Gegenava, T; Gegenava, M; Kavtaradze, G

    2009-03-01

    The aim of our study was to investigate the association between a history of depressive episodes and anxiety and complications in patients 6 months after coronary artery angioplasty. The research was conducted on 70 patients in whom a grade of coronary occlusion that would not respond to therapeutic treatment and required coronary angioplasty had been established. Complications were assessed in 60 patients 6 months after coronary angioplasty. Depression was evaluated with the Beck depression scale; anxiety was assessed with the Spielberger State-Trait Anxiety Inventory. Statistical analysis of the data was performed using methods of variation statistics with Student's criterion and the STATISTICA 5.0 software. Complications were found in 36 (60%) patients; 24 (40%) patients had no complications. No statistically significant differences in the degree of depression and anxiety were revealed between the time of coronary angioplasty and 6 months afterwards. Our study demonstrated that complications occurred in patients who had a high degree of depression and anxiety.

  9. Critical discussion of evaluation parameters for inter-observer variability in target definition for radiation therapy.

    PubMed

    Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D

    2012-02-01

    Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, data from the literature do not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods to determine variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters and scatter of the results was observed. A combination of descriptive statistics, overlap measure, and statistical measure of agreement or reliability analysis is required to fully report the interrater variability in delineation.
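    Two of the overlap measures named above, the Jaccard coefficient and a pairwise form of the generalized conformity index, can be sketched for contours represented as sets of voxels (the contours below are toy examples, not clinical data):

```python
def jaccard(a, b):
    # volume overlap: |A ∩ B| / |A ∪ B|
    return len(a & b) / len(a | b)

def generalized_conformity(volumes):
    # sum of pairwise intersections over sum of pairwise unions
    pairs = [(v1, v2) for i, v1 in enumerate(volumes)
             for v2 in volumes[i + 1:]]
    return (sum(len(a & b) for a, b in pairs) /
            sum(len(a | b) for a, b in pairs))

# toy 2-D "contours" from three observers, as sets of voxel coordinates
obs1 = {(x, y) for x in range(10) for y in range(10)}
obs2 = {(x, y) for x in range(1, 11) for y in range(10)}  # shifted in x
obs3 = {(x, y) for x in range(10) for y in range(1, 11)}  # shifted in y
print(jaccard(obs1, obs2), generalized_conformity([obs1, obs2, obs3]))
```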

  10. Testing homogeneity of proportion ratios for stratified correlated bilateral data in two-arm randomized clinical trials.

    PubMed

    Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai

    2014-11-10

    Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data under an equal-correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on the score statistic generally performs well and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least-squares estimate and on the logarithmic transformation with the Mantel-Haenszel estimate are recommended, as they do not require computing maximum likelihood estimates via iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.
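    A generic percentile-bootstrap check of proportion-ratio homogeneity across two strata can be sketched as follows; it ignores the bilateral correlation structure that the paper's test statistics account for, and the counts are invented:

```python
import math, random

random.seed(2)

def resample(k, n):
    # parametric bootstrap draw: Binomial(n, k/n)
    p = k / n
    return sum(1 for _ in range(n) if random.random() < p)

# observed (events, n) per arm within each stratum
strata = {"A": ((30, 100), (45, 100)), "B": ((20, 100), (30, 100))}

def log_ratio_diff(data):
    # difference of log proportion ratios (arm2/arm1) between strata
    (a1, n1), (a2, n2) = data["A"]
    (b1, m1), (b2, m2) = data["B"]
    return math.log((a2 / n2) / (a1 / n1)) - math.log((b2 / m2) / (b1 / m1))

diffs = []
for _ in range(1000):
    boot = {s: tuple((max(resample(k, n), 1), n) for k, n in arms)
            for s, arms in strata.items()}
    diffs.append(log_ratio_diff(boot))
diffs.sort()
ci = (diffs[25], diffs[974])  # ~95% percentile interval
print(ci)  # an interval covering 0 is consistent with homogeneous ratios
```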

  11. Subjective Ratings of Beauty and Aesthetics: Correlations With Statistical Image Properties in Western Oil Paintings

    PubMed Central

    Lehmann, Thomas; Redies, Christoph

    2017-01-01

    For centuries, oil paintings have been a major segment of the visual arts. The JenAesthetics data set consists of a large number of high-quality images of oil paintings of Western provenance from different art periods. With this database, we studied the relationship between objective image measures and subjective evaluations of the images, especially evaluations on aesthetics (defined as artistic value) and beauty (defined as individual liking). The objective measures represented low-level statistical image properties that have been associated with aesthetic value in previous research. Subjective rating scores on aesthetics and beauty correlated not only with each other but also with different combinations of the objective measures. Furthermore, we found that paintings from different art periods vary with regard to the objective measures, that is, they exhibit specific patterns of statistical image properties. In addition, clusters of participants preferred different combinations of these properties. In conclusion, the results of the present study provide evidence that statistical image properties vary between art periods and subject matters and, in addition, they correlate with the subjective evaluation of paintings by the participants. PMID:28694958

  12. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    PubMed

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.

  13. Substantial increase in concurrent droughts and heatwaves in the United States

    PubMed Central

    Mazdiyasni, Omid; AghaKouchak, Amir

    2015-01-01

    A combination of climate events (e.g., low precipitation and high temperatures) may cause a significant impact on the ecosystem and society, although individual events involved may not be severe extremes themselves. Analyzing historical changes in concurrent climate extremes is critical to preparing for and mitigating the negative effects of climatic change and variability. This study focuses on the changes in concurrences of heatwaves and meteorological droughts from 1960 to 2010. Despite an apparent hiatus in rising temperature and no significant trend in droughts, we show a substantial increase in concurrent droughts and heatwaves across most parts of the United States, and a statistically significant shift in the distribution of concurrent extremes. Although commonly used trend analysis methods do not show any trend in concurrent droughts and heatwaves, a unique statistical approach discussed in this study exhibits a statistically significant change in the distribution of the data. PMID:26324927
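    One standard way to test for a change in a full distribution, rather than a trend, is the two-sample Kolmogorov-Smirnov statistic; the paper's exact statistical approach may differ, and the samples below are synthetic:

```python
import bisect

def ks_statistic(sample1, sample2):
    # maximum gap between the two empirical CDFs over all observed values
    s1, s2 = sorted(sample1), sorted(sample2)
    n1, n2 = len(s1), len(s2)
    d = 0.0
    for x in sorted(set(sample1) | set(sample2)):
        f1 = bisect.bisect_right(s1, x) / n1
        f2 = bisect.bisect_right(s2, x) / n2
        d = max(d, abs(f1 - f2))
    return d

early = list(range(50))          # stand-in counts, first half of the record
late = [x + 15 for x in early]   # same shape, shifted upward
print(ks_statistic(early, late))
```

A shift in location with no change in trend still yields a large statistic, which is the kind of distributional change the study reports.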

  15. An issue of literacy on pediatric arterial hypertension

    NASA Astrophysics Data System (ADS)

    Teodoro, M. Filomena; Romana, Andreia; Simão, Carla

    2017-11-01

    Arterial hypertension in pediatric age is a public health problem whose prevalence has increased significantly over time. Pediatric arterial hypertension (PAH) is under-diagnosed in most cases; it is a highly prevalent disease that appears without warning and has multiple consequences for children's health and for the adults they will become. Children's caregivers and close family must know that PAH exists, the negative consequences associated with it, and its risk factors, and, finally, must practice prevention. A statistical analysis of data collected with a simpler questionnaire, introduced in [4] as part of a preliminary study of PAH caregivers' awareness, can be found in [12, 13]; a continuation of that analysis is detailed in [14]. An extended version of the questionnaire was built, applied to a distinct population, and filled in online. That statistical approach is partially reproduced in the present work. Several statistical models were estimated using different approaches, namely multivariate analysis (factor analysis), as well as methods adequate for the kind of data under study.

  16. An elevated level of physical activity is associated with normal lipoprotein(a) levels in individuals from Maracaibo, Venezuela.

    PubMed

    Bermúdez, Valmore; Aparicio, Daniel; Rojas, Edward; Peñaranda, Lianny; Finol, Freddy; Acosta, Luis; Mengual, Edgardo; Rojas, Joselyn; Arráiz, Nailet; Toledo, Alexandra; Colmenares, Carlos; Urribarí, Jesica; Sanchez, Wireynis; Pineda, Carlos; Rodriguez, Dalia; Faria, Judith; Añez, Roberto; Cano, Raquel; Cano, Clímaco; Sorell, Luis; Velasco, Manuel

    2010-01-01

    Coronary artery disease is the main cause of death worldwide. Lipoprotein(a) [Lp(a)] is an independent risk factor for coronary artery disease whose concentration is genetically regulated. Contradictory results have been published about the influence of physical activity on Lp(a) concentration. This research aimed to determine associations between different physical activity levels and Lp(a) concentration. A descriptive, cross-sectional study was made in 1340 randomly selected subjects (males = 598; females = 712), for whom a complete clinical history, the International Physical Activity Questionnaire, and an Lp(a) level determination were obtained. Statistical analysis assessed relationships between qualitative variables by the chi-square test and differences between means by one-way analysis of variance, considering a P value <0.05 as statistically significant. Results are shown as absolute frequencies, percentages, and mean +/- standard deviation, as appropriate. Physical activity levels were classified ordinally as follows: low activity, 24.3% (n = 318); moderate activity, 35.0% (n = 458); and high physical activity, 40.8% (n = 534). Lp(a) concentration in the studied sample was 26.28 +/- 12.64 (CI: 25.59-26.96) mg/dL. Lp(a) concentrations for the low, moderate, and high physical activity levels were 29.22 +/- 13.74, 26.27 +/- 12.91, and 24.53 +/- 11.35 mg/dL, respectively, with statistically significant differences between the low and moderate levels (P = 0.004) and between the low and high levels (P < 0.001). A strong association (chi-square = 9.771; P = 0.002) was observed between a high physical activity level and a normal concentration of Lp(a) (less than 30 mg/dL). A lifestyle characterized by high physical activity is associated with normal Lp(a) levels.
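    The chi-square test of association used above can be sketched for a 2x2 table; the counts below are illustrative, not the study's data:

```python
def chi_square_2x2(a, b, c, d):
    # Pearson chi-square: sum of (observed - expected)^2 / expected
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [[row1 * col1 / n, row1 * col2 / n],
                [row2 * col1 / n, row2 * col2 / n]]
    observed = [[a, b], [c, d]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))

# rows: high vs low physical activity; cols: normal vs elevated Lp(a)
stat = chi_square_2x2(400, 134, 180, 138)
print(round(stat, 2))  # well above the chi-square(1) critical value of 3.84
```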

  17. Reconsidering the autohypnotic model of the dissociative disorders.

    PubMed

    Dell, Paul F

    2018-03-22

    The dissociative disorders field and the hypnosis field currently reject the autohypnotic model of the dissociative disorders, largely because many correlational studies have shown hypnotizability and dissociation to be minimally related (r = .12). Curiously, it is also widely accepted that dissociative patients are highly hypnotizable. If dissociative patients are highly hypnotizable because only highly hypnotizable individuals can develop a dissociative disorder - as the author proposes - then the methodology of correlational studies of hypnotizability and dissociation in random clinical and community samples would necessarily be constitutively unable to detect, and statistically unable to reflect, that fact. That is, the autohypnotic, dissociative distancing of that small subset of highly hypnotizable individuals who repeatedly encountered intolerable circumstances is statistically lost among the data of (1) the highly hypnotizable subjects who do not dissociate and (2) subjects (of all levels of hypnotizability) who manifest other kinds of dissociation. The author proposes that, when highly hypnotizable individuals repeatedly engage in autohypnotic distancing from intolerable circumstances, they develop an overlearned, highly-motivated, automatized pattern of dissociative self-protection (i.e., a dissociative disorder). The author urges that theorists of hypnosis and the dissociative disorders explicitly include in their theories (a) the trait of high hypnotizability, (b) the phenomena of autohypnosis, and (c) the manifestations of systematized, autohypnotic pathology. Said differently, the author is suggesting that autohypnosis and autohypnotic pathology are unacknowledged nodes in the nomothetic networks of both hypnosis and dissociation.

  18. Coping strategies and self-esteem in the high-risk offspring of bipolar parents.

    PubMed

    Goodday, Sarah M; Bentall, Richard; Jones, Steven; Weir, Arielle; Duffy, Anne

    2018-03-01

    This study investigated whether there were differences in coping strategies and self-esteem between offspring of parents with bipolar disorder (high-risk) and offspring of unaffected parents (control), and whether these psychological factors predicted the onset and recurrence of mood episodes. High-risk and control offspring were followed longitudinally as part of the Flourish Canadian high-risk bipolar offspring cohort study. Offspring were clinically assessed annually by a psychiatrist using semi-structured interviews and completed a measure of coping strategies and self-esteem. In high-risk offspring, avoidant coping strategies significantly increased the hazard of a new-onset Diagnostic and Statistical Manual of Mental Disorders, 4th Edition twice revised mood episode or recurrence (hazard ratio: 1.89, p = 0.04), while higher self-esteem significantly decreased this hazard (hazard ratio: 2.50, p < 0.01). Self-esteem and avoidant coping significantly interacted with one another (p < 0.05), where the risk of a Diagnostic and Statistical Manual of Mental Disorders, 4th Edition twice revised new-onset mood episode or recurrence was only significantly increased among high-risk offspring with both high avoidant coping and low self-esteem. A reduction of avoidant coping strategies in response to stress and improvement of self-esteem may be useful intervention targets for preventing the new onset or recurrence of a clinically significant mood disorder among individuals at high familial risk.

  19. Non-linear learning in online tutorial to enhance students’ knowledge on normal distribution application topic

    NASA Astrophysics Data System (ADS)

    Kartono; Suryadi, D.; Herman, T.

    2018-01-01

    This study aimed to analyze the enhancement from non-linear learning (NLL) in online tutorial (OT) content of students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the normal distribution application topic in the course Education Statistics. The analysis used a quasi-experimental design. Subjects were divided into an experimental class, given OT content in the NLL model, and a control class, given OT content in a conventional learning (CL) model. The data were the results of online objective tests measuring students' statistical prior knowledge (SPK) and pre- and post-test KONDA scores. Statistical analysis of KONDA gain scores showed that, for students with low and moderate SPK scores, the KONDA of students learning OT content with the NLL model was better than that of students learning with the CL model. For students with high SPK scores, the gain scores of the two models were similar. Based on these findings, it can be concluded that the NLL model applied to OT content can enhance KONDA for students at low and moderate SPK levels. Extra, more challenging didactical situations are needed for students at a high SPK level to achieve a significant gain score.
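    A common way to quantify pre/post improvement is the normalized (Hake) gain score; the study does not state its exact gain formula, so this is an assumption:

```python
def normalized_gain(pre, post, max_score=100):
    # (actual gain) / (maximum possible gain): the Hake normalized gain
    return (post - pre) / (max_score - pre)

g = normalized_gain(40, 70)
print(g)  # 0.5: half of the possible improvement was realized
```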

  20. Statistical Primer on Biosimilar Clinical Development.

    PubMed

    Isakov, Leah; Jin, Bo; Jacobs, Ira Allen

    A biosimilar is highly similar to a licensed biological product, with no clinically meaningful differences from the reference (originator) product in terms of safety, purity, and potency, and is approved under specific regulatory approval processes. Because both the originator and the potential biosimilar are large and structurally complex proteins, biosimilars are not generic equivalents of the originator. Thus, the regulatory approach for a small-molecule generic is not appropriate for a potential biosimilar. As a result, different study designs and statistical approaches are used in the assessment of a potential biosimilar. This review covers concepts and terminology used in statistical analyses in the clinical development of biosimilars so that clinicians can understand how similarity is evaluated. This should allow the clinician to understand the statistical considerations in biosimilar clinical trials and make informed prescribing decisions when an approved biosimilar is available.

  1. From Statistics to Meaning: Infants’ Acquisition of Lexical Categories

    PubMed Central

    Lany, Jill; Saffran, Jenny R.

    2013-01-01

    Infants are highly sensitive to statistical patterns in their auditory language input that mark word categories (e.g., noun and verb). However, it is unknown whether experience with these cues facilitates the acquisition of semantic properties of word categories. In a study testing this hypothesis, infants first listened to an artificial language in which word categories were reliably distinguished by statistical cues (experimental group) or in which these properties did not cue category membership (control group). Both groups were then trained on identical pairings between the words and pictures from two categories (animals and vehicles). Only infants in the experimental group learned the trained associations between specific words and pictures. Moreover, these infants generalized the pattern to include novel pairings. These results suggest that experience with statistical cues marking lexical categories sets the stage for learning the meanings of individual words and for generalizing meanings to new category members. PMID:20424058

  2. The invariant statistical rule of aerosol scattering pulse signal modulated by random noise

    NASA Astrophysics Data System (ADS)

    Yan, Zhen-gang; Bian, Bao-Min; Yang, Juan; Peng, Gang; Li, Zhen-hua

    2010-11-01

    A model of random background noise acting on particle signals is established to study the impact of the background noise of the photoelectric sensor in a laser airborne particle counter on the statistical character of the aerosol scattering pulse signals. The results show that the noise broadens the statistical distribution of the particle measurements. Further numerical research shows that the output signal amplitude retains the same distribution when airborne particles with a lognormal distribution are modulated by random noise that also has a lognormal distribution; that is, it follows a statistical law of invariance. Based on this model, the background noise of the photoelectric sensor and the counting distributions of the random aerosol scattering pulse signals are obtained and analyzed using a high-speed data acquisition card (PCI-9812). The experimental results and the simulation results are found to be in good agreement.
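
    The invariance the abstract describes follows from a basic property: the product of two independent lognormal variables is itself lognormal, because their logarithms add. A short numpy sketch (with illustrative parameters, not the paper's) makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Lognormal scattering-pulse amplitudes and independent lognormal noise
signal = rng.lognormal(mean=1.0, sigma=0.3, size=n)
noise = rng.lognormal(mean=0.0, sigma=0.2, size=n)

# Multiplicative modulation adds the logs, so the product stays lognormal
log_out = np.log(signal * noise)
print(abs(log_out.mean() - 1.0) < 0.01)                # log-means add
print(abs(log_out.std() - np.hypot(0.3, 0.2)) < 0.01)  # log-variances add
```

    The modulated amplitudes therefore pass any lognormality check with the summed log-scale parameters, which is the "statistics law of invariance" the paper reports.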

  3. Bayesian demography 250 years after Bayes

    PubMed Central

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
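
    As a minimal illustration of the Bayesian machinery described above — combining prior information with a data sample and describing uncertainty with a probability distribution — here is a conjugate beta-binomial update for a proportion (all numbers invented for illustration):

```python
# Conjugate beta-binomial update: a Beta(a, b) prior combined with k events
# in n trials gives a Beta(a + k, b + n - k) posterior. Numbers invented.
a, b = 2.0, 18.0     # prior: proportion believed to be around 2/20 = 0.10
k, n = 12, 60        # observed sample: 12 events in 60 trials (0.20)

a_post, b_post = a + k, b + (n - k)
post_mean = a_post / (a_post + b_post)

# The posterior mean (0.175) sits between the prior mean and the sample
# proportion, weighted by their relative information content
print(post_mean)
```

    The full posterior distribution, not just its mean, is available, which is what allows the coherent uncertainty statements the paper emphasizes.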

  4. Assessing the statistical significance of the achieved classification error of classifiers constructed using serum peptide profiles, and a prescription for random sampling repeated studies for massive high-throughput genomic and proteomic studies.

    PubMed

    Lyons-Weiler, James; Pelikan, Richard; Zeh, Herbert J; Whitcomb, David C; Malehorn, David E; Bigbee, William L; Hauskrecht, Milos

    2005-01-01

    Peptide profiles generated using SELDI/MALDI time of flight mass spectrometry provide a promising source of patient-specific information with high potential impact on the early detection and classification of cancer and other diseases. The new profiling technology comes, however, with numerous challenges and concerns. Particularly important are concerns of reproducibility of classification results and their significance. In this work we describe a computational validation framework, called PACE (Permutation-Achieved Classification Error), that lets us assess, for a given classification model, the significance of the Achieved Classification Error (ACE) on the profile data. The framework compares the performance statistic of the classifier on true data samples and checks if these are consistent with the behavior of the classifier on the same data with randomly reassigned class labels. A statistically significant ACE increases our belief that a discriminative signal was found in the data. The advantage of PACE analysis is that it can be easily combined with any classification model and is relatively easy to interpret. PACE analysis does not protect researchers against confounding in the experimental design, or other sources of systematic or random error. We use PACE analysis to assess significance of classification results we have achieved on a number of published data sets. The results show that many of these datasets indeed possess a signal that leads to a statistically significant ACE.
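
    The PACE idea — compare the classifier's achieved error on the true labels with its error distribution after random label reassignment — can be sketched generically; the nearest-centroid classifier and synthetic data below are illustrative stand-ins, since PACE combines with any classification model:

```python
import numpy as np

rng = np.random.default_rng(1)

def centroid_error(X, y):
    """Training error of a simple nearest-centroid classifier."""
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    pred = (np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1))
    return (pred.astype(int) != y).mean()

# Synthetic 'profiles': two classes separated by a real signal
X = np.vstack([rng.normal(0.0, 1, (40, 20)), rng.normal(0.8, 1, (40, 20))])
y = np.array([0] * 40 + [1] * 40)
ace = centroid_error(X, y)  # achieved classification error

# Permutation null: the same classifier on randomly reassigned labels
perm_errors = [centroid_error(X, rng.permutation(y)) for _ in range(500)]
p_value = float(np.mean([e <= ace for e in perm_errors]))

print(ace < 0.3, p_value < 0.05)  # real signal: low ACE, significant PACE
```

    On pure-noise data the permuted errors would bracket the achieved error and the p-value would be large, which is exactly the reproducibility warning PACE is designed to give.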

  5. Statistical process control using optimized neural networks: a case study.

    PubMed

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart indicates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as an efficient characterization of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network, and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system based on the cuckoo optimization algorithm (COA) is introduced to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
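
    The abstract does not enumerate its shape features, but as a hypothetical example of the feature-extraction step, even a window's least-squares slope already separates an upward-trend pattern from in-control noise:

```python
import numpy as np

def ccp_features(window):
    """Least-squares slope and residual spread of a control-chart window."""
    t = np.arange(len(window))
    slope, intercept = np.polyfit(t, window, 1)
    residual_std = np.std(window - (slope * t + intercept))
    return slope, residual_std

rng = np.random.default_rng(2)
normal_pattern = rng.normal(0, 1, 60)                       # in-control noise
trend_pattern = 0.1 * np.arange(60) + rng.normal(0, 1, 60)  # upward-trend CCP

s_norm, _ = ccp_features(normal_pattern)
s_trend, _ = ccp_features(trend_pattern)
print(abs(s_norm) < 0.05, s_trend > 0.05)  # slope alone separates the two
```

    In the paper's setup, features like these would feed the classifier module (e.g., a multilayer perceptron) rather than a hand-set threshold.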

  6. Derivation and Applicability of Asymptotic Results for Multiple Subtests Person-Fit Statistics

    PubMed Central

    Albers, Casper J.; Meijer, Rob R.; Tendeiro, Jorge N.

    2016-01-01

    In high-stakes testing, it is important to check the validity of individual test scores. Although a test may, in general, yield valid scores for most test takers, for some test takers the scores may not provide a good description of their proficiency level. Person-fit statistics have been proposed to check the validity of individual test scores. In this study, the theoretical asymptotic sampling distribution of two person-fit statistics that can be used for tests consisting of multiple subtests is first discussed. Second, a simulation study was conducted to investigate the applicability of this asymptotic theory for tests of finite length, in which the correlation between subtests and the number of items in the subtests were varied. The authors showed that these distributions provide reasonable approximations, even for tests consisting of subtests of only 10 items each. These results have practical value because researchers do not have to rely on extensive simulation studies to simulate sampling distributions. PMID:29881053

  7. Effects of forcing time scale on the simulated turbulent flows and turbulent collision statistics of inertial particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosa, B., E-mail: bogdan.rosa@imgw.pl; Parishani, H.; Department of Earth System Science, University of California, Irvine, California 92697-3100

    2015-01-15

    In this paper, we study systematically the effects of forcing time scale in the large-scale stochastic forcing scheme of Eswaran and Pope [“An examination of forcing in direct numerical simulations of turbulence,” Comput. Fluids 16, 257 (1988)] on the simulated flow structures and statistics of forced turbulence. Using direct numerical simulations, we find that the forcing time scale affects the flow dissipation rate and flow Reynolds number. Other flow statistics can be predicted using the altered flow dissipation rate and flow Reynolds number, except when the forcing time scale is made unrealistically large to yield a Taylor microscale flow Reynolds number of 30 and less. We then study the effects of forcing time scale on the kinematic collision statistics of inertial particles. We show that the radial distribution function and the radial relative velocity may depend on the forcing time scale when it becomes comparable to the eddy turnover time. This dependence, however, can be largely explained in terms of altered flow Reynolds number and the changing range of flow length scales present in the turbulent flow. We argue that removing this dependence is important when studying the Reynolds number dependence of the turbulent collision statistics. The results are also compared to those based on a deterministic forcing scheme to better understand the role of large-scale forcing, relative to that of the small-scale turbulence, on turbulent collision of inertial particles. To further elucidate the correlation between the altered flow structures and dynamics of inertial particles, a conditional analysis has been performed, showing that the regions of higher collision rate of inertial particles are well correlated with the regions of lower vorticity. Regions of higher concentration of pairs at contact are found to be highly correlated with the region of high energy dissipation rate.

  8. Multiple Category-Lot Quality Assurance Sampling: A New Classification System with Application to Schistosomiasis Control

    PubMed Central

    Olives, Casey; Valadez, Joseph J.; Brooker, Simon J.; Pagano, Marcello

    2012-01-01

    Background Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10 and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. Methodology We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n = 15 and n = 25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Principal Findings Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n = 15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. Conclusion/Significance This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools. PMID:22970333
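
    The weighted kappa-statistic used here to score agreement between MC-LQAS classifications and true prevalence categories can be computed directly from a confusion matrix. The sketch below uses linear weights and hypothetical 3x3 counts (the study's exact weighting scheme and data are not given in the abstract):

```python
import numpy as np

def weighted_kappa(conf_matrix):
    """Linearly weighted Cohen's kappa from a square matrix of counts."""
    o = np.asarray(conf_matrix, dtype=float)
    k = o.shape[0]
    # Disagreement weights grow with the distance between categories
    w = np.abs(np.subtract.outer(np.arange(k), np.arange(k)))
    n = o.sum()
    e = np.outer(o.sum(axis=1), o.sum(axis=0)) / n  # chance-expected counts
    return 1.0 - (w * o).sum() / (w * e).sum()

# Hypothetical MC-LQAS results: rows = true prevalence category,
# columns = classified category (low / moderate / high)
counts = [[50, 5, 0],
          [4, 30, 3],
          [0, 2, 20]]
kw = weighted_kappa(counts)
print(round(kw, 3))
```

    With these invented counts the statistic lands above the 0.75 mark the study uses as a benchmark of good agreement; misclassifications two categories apart would be penalized twice as heavily as adjacent ones.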

  9. Arsenic exposure and bladder cancer: quantitative assessment of studies in human populations to detect risks at low doses.

    PubMed

    Tsuji, Joyce S; Alexander, Dominik D; Perez, Vanessa; Mink, Pamela J

    2014-03-20

    While exposures to high levels of arsenic in drinking water are associated with excess cancer risk (e.g., skin, bladder, and lung), exposures at lower levels (e.g., <100-200 µg/L) generally are not. Lack of significant associations may result from methodological issues (e.g., inadequate statistical power, exposure misclassification), or a different dose-response relationship at low exposures, possibly associated with a toxicological mode of action that requires a sufficient dose for increased tumor formation. The extent to which bladder cancer risk for low-level arsenic exposure can be statistically measured by epidemiological studies was examined using an updated meta-analysis of bladder cancer risk with data from two new publications. The summary relative risk estimate (SRRE) for all nine studies was elevated slightly, but not significantly (1.07; 95% confidence interval [CI]: 0.95-1.21, p-Heterogeneity [p-H]=0.543). The SRRE among never smokers was 0.85 (95% CI: 0.66-1.08, p-H=0.915), whereas the SRRE was positive and more heterogeneous among ever smokers (1.18; 95% CI: 0.97-1.44, p-H=0.034). The SRRE was statistically significantly lower than relative risks predicted for never smokers in the United States based on linear extrapolation of risks from higher doses in southwest Taiwan to arsenic water exposures >10 µg/L for more than one-third of a lifetime. By contrast, for all study subjects, relative risks predicted for one-half of lifetime exposure to 50 µg/L were just above the upper 95% CI on the SRRE. Thus, results from low-exposure studies, particularly for never smokers, were statistically inconsistent with predicted risk based on high-dose extrapolation. Additional studies that better characterize tobacco use and stratify analyses of arsenic and bladder cancer by smoking status are necessary to further examine risks of arsenic exposure for smokers. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
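
    A summary relative risk estimate (SRRE) of the kind reported here is typically an inverse-variance weighted average of log relative risks. The sketch below pools three made-up study estimates (not the paper's data) under a fixed-effect model:

```python
import math

# Hypothetical per-study relative risks with 95% confidence intervals
studies = [(1.10, 0.80, 1.51),
           (0.95, 0.70, 1.29),
           (1.20, 0.85, 1.69)]

num = den = 0.0
for rr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
    w = 1.0 / se ** 2                                # inverse-variance weight
    num += w * math.log(rr)
    den += w

pooled = num / den
srre = math.exp(pooled)
half = 1.96 / math.sqrt(den)
lo95, hi95 = math.exp(pooled - half), math.exp(pooled + half)
print(round(srre, 2), round(lo95, 2), round(hi95, 2))
```

    Because the pooled confidence interval here straddles 1.0, this invented pooling would be "elevated slightly, but not significantly" in the abstract's language; random-effects pooling would additionally widen the interval when heterogeneity is present.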

  10. High precision mass measurements for wine metabolomics

    PubMed Central

    Roullier-Gall, Chloé; Witting, Michael; Gougeon, Régis D.; Schmitt-Kopplin, Philippe

    2014-01-01

    An overview of the critical steps in the non-targeted Ultra-High Performance Liquid Chromatography coupled with Quadrupole Time-of-Flight Mass Spectrometry (UPLC-Q-ToF-MS) analysis of wine chemistry is given, ranging from study design, data preprocessing, and statistical analyses to marker identification. UPLC-Q-ToF-MS data were enhanced by the alignment of exact mass data from FTICR-MS, and marker peaks were identified using UPLC-Q-ToF-MS2. In combination with multivariate statistical tools and the annotation of peaks with metabolites from relevant databases, this analytical process provides a fine description of the chemical complexity of wines, as exemplified in the case of red (Pinot noir) and white (Chardonnay) wines from various geographic origins in Burgundy. PMID:25431760

  11. High precision mass measurements for wine metabolomics

    NASA Astrophysics Data System (ADS)

    Roullier-Gall, Chloé; Witting, Michael; Gougeon, Régis; Schmitt-Kopplin, Philippe

    2014-11-01

    An overview of the critical steps in the non-targeted Ultra-High Performance Liquid Chromatography coupled with Quadrupole Time-of-Flight Mass Spectrometry (UPLC-Q-ToF-MS) analysis of wine chemistry is given, ranging from study design, data preprocessing, and statistical analyses to marker identification. UPLC-Q-ToF-MS data were enhanced by the alignment of exact mass data from FTICR-MS, and marker peaks were identified using UPLC-Q-ToF-MS². In combination with multivariate statistical tools and the annotation of peaks with metabolites from relevant databases, this analytical process provides a fine description of the chemical complexity of wines, as exemplified in the case of red (Pinot noir) and white (Chardonnay) wines from various geographic origins in Burgundy.

  12. High Prevalence of Obesity and Female Gender Among Patients With Concomitant Tibialis Posterior Tendonitis and Plantar Fasciitis.

    PubMed

    Reb, Christopher W; Schick, Faith A; Karanjia, Homyar N; Daniel, Joseph N

    2015-10-01

    The link between increased body weight and hindfoot complaints is largely based on correlations with single foot pathologies. We retrospectively reviewed 6879 patients with tibialis posterior tendonitis (TPT), plantar fasciitis (PF), or both. Among patients with either TPT or PF, 1 in 11 (9%) had both. We then compared age, gender, and body mass index among these groups. Patients with both diagnoses were neither statistically older nor more obese than patients with a single diagnosis; however, they were significantly more likely to be female. Given the overall high prevalence of obesity in the study population, we feel these data support the link between obesity and multiple foot pathologies. Prognostic, Level IV: Case series. © 2015 The Author(s).

  13. Factors influencing medical informatics examination grade--can biorhythm, astrological sign, seasonal aspect, or bad statistics predict outcome?

    PubMed

    Petrovecki, Mladen; Rahelić, Dario; Bilić-Zulle, Lidija; Jelec, Vjekoslav

    2003-02-01

    To investigate whether and to what extent various parameters, such as individual characteristics, computer habits, situational factors, and pseudoscientific variables, influence the Medical Informatics examination grade, and how inadequate statistical analysis can lead to wrong conclusions. The study included a total of 382 second-year undergraduate students at the Rijeka University School of Medicine from the 1996/97 to the 2000/01 academic year. After passing the Medical Informatics exam, students filled out an anonymous questionnaire about their attitude toward learning medical informatics. They were asked to grade the course organization and curriculum content, and to provide their date of birth; sex; study year; high school grades; Medical Informatics examination grade, type, and term; and a description of their computer habits. From these data, we determined their zodiac signs and biorhythms. Data were compared using the t-test, one-way ANOVA with Tukey's honestly significant difference test, and randomized complete block design ANOVA. Of the 21 variables analyzed, only 10 correlated with the average grade. Students taking the Medical Informatics examination in the 1998/99 academic year earned a lower average grade than any other generation. A significantly higher Medical Informatics exam grade was earned by students who had finished a grammar high school; owned and regularly used a computer, the Internet, and e-mail (p≤0.002 for all items); passed an oral exam without taking a written test (p=0.004); or did not repeat the exam (p<0.001). Better high-school students and students with better grades from a high-school informatics course also scored significantly better (p=0.032 and p<0.001, respectively). Grade in high-school mathematics, student's sex, and the time of year when the examination was taken were not related to the grade, and neither were pseudoscientific parameters such as the student's zodiac sign, zodiac sign quality, or biorhythm cycles, except when intentionally inadequate statistics were used for the data analysis. Medical Informatics examination grades correlated with students' general learning capacity and computer habits, but showed no relation to the other investigated parameters, such as examination term or pseudoscientific parameters. Inadequate statistical analysis can always confirm false conclusions.

  14. The effect of role assignment in high fidelity patient simulation on nursing students: An experimental research study.

    PubMed

    Weiler, Dustin T; Gibson, Andrea L; Saleem, Jason J

    2018-04-01

    Previous studies have evaluated the effectiveness of high fidelity patient simulators (HFPS) in nursing training; however, a gap exists concerning the effects of role assignment on critical thinking, self-efficacy, and situation awareness skills in team-based simulation scenarios. This study aims to determine whether role assignment, and the level of involvement associated with each role, yields significant effects and differences in critical thinking, situation awareness, and self-efficacy scores in team-based high-fidelity simulation scenarios. A single factorial design with five levels and random assignment was utilized at a public university-sponsored simulation center in the United States of America. A convenience sample of 69 junior-level baccalaureate nursing students was recruited for participation. Participants were randomly assigned one of five possible roles and completed pre-simulation critical thinking and self-efficacy assessments before the simulation began. Playing within their assigned roles, participants experienced a post-partum hemorrhage scenario using an HFPS. After completing the simulation, participants completed a situation awareness assessment and post-simulation critical thinking and self-efficacy assessments. Role assignment was found to have a statistically significant effect on critical thinking skills, and statistically significant differences in various areas of self-efficacy were also noted. However, no statistically significant difference in situation awareness abilities was found. The results support the notion that certain roles required the participant to be more involved in the simulation scenario, which may have yielded higher critical thinking and self-efficacy scores than roles requiring a lesser level of involvement. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Relationship between High School Mathematics Grade and Number of Attempts Required to Pass the Medication Calculation Test in Nurse Education: An Explorative Study

    PubMed Central

    Alteren, Johanne; Nerdal, Lisbeth

    2015-01-01

    In Norwegian nurse education, students are required to achieve a perfect score in a medication calculation test before undertaking their first practice period during the second semester. Passing the test is a challenge, and students often require several attempts. Adverse events in medication administration can be related to poor mathematical skills. The purpose of this study was to explore the relationship between high school mathematics grade and the number of attempts required to pass the medication calculation test in nurse education. The study used an exploratory design. The participants were 90 students enrolled in a bachelor’s nursing program. They completed a self-report questionnaire, and statistical analysis was performed. The results provided no basis for the conclusion that a statistical relationship existed between high school mathematics grade and number of attempts required to pass the medication calculation test. Regardless of their grades in mathematics, 43% of the students passed the medication calculation test on the first attempt. All of the students who had achieved grade 5 had passed by the third attempt. High grades in mathematics were not crucial to passing the medication calculation test. Nonetheless, the grade may be important in ensuring a pass within fewer attempts. PMID:27417767

  16. Comparison of student's learning achievement through realistic mathematics education (RME) approach and problem solving approach on grade VII

    NASA Astrophysics Data System (ADS)

    Ilyas, Muhammad; Salwah

    2017-02-01

    This research was experimental. The purpose of this study was to determine the difference in, and the quality of, students' learning achievement between students taught through the Realistic Mathematics Education (RME) approach and students taught through a problem solving approach. This was a quasi-experimental study with a non-equivalent experiment group design. The population was all grade VII students in a junior high school in Palopo, in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the research sample: class VII-5, with 28 students, as experiment group I, and class VII-6, with 23 students, as experiment group II. The treatment used in experiment group I was learning by the RME approach, whereas experiment group II used the problem solving approach. Data were collected by giving the students a pretest and a posttest. The analysis used descriptive statistics and inferential statistics using the t-test. Based on the descriptive statistics, the average score of students' mathematics learning after being taught with the problem solving approach was similar to the average after being taught with the RME approach, with both in the high category. In addition, it can be concluded that (1) there was no difference in the mathematics learning outcomes of students taught with the RME approach and students taught with the problem solving approach, and (2) the quality of learning achievement of students who received RME approach learning and problem solving approach learning was the same, being in the high category.
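
    The inferential step described above — comparing two groups' mean achievement with a t-test — can be sketched with fabricated scores; Welch's variant is used here, which does not assume equal variances:

```python
import math
from statistics import fmean, variance

# Fabricated posttest scores for the two instructional approaches
rme = [78, 85, 90, 72, 88, 81, 79, 86, 84, 77]
ps = [80, 83, 75, 86, 79, 82, 88, 74, 81, 85]

def welch_t(a, b):
    """Welch's t statistic and approximate degrees of freedom."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (fmean(a) - fmean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

t_stat, df = welch_t(rme, ps)
# |t| far below the ~2.1 critical value at these degrees of freedom:
# no significant difference, mirroring the study's conclusion
print(abs(t_stat) < 2.1)
```

    With unequal class sizes such as 28 and 23, Welch's correction is a reasonable default, though the abstract does not state which t-test variant was applied.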

  17. Salivary secretory leukocyte protease inhibitor (SLPI) and head and neck cancer: The Cancer Prevention Study II Nutrition Cohort.

    PubMed

    Pierce Campbell, Christine M; Giuliano, Anna R; Torres, B Nelson; O'Keefe, Michael T; Ingles, Donna J; Anderson, Rebecca L; Teras, Lauren R; Gapstur, Susan M

    2016-04-01

    Secretory leukocyte protease inhibitor (SLPI) is an innate-immunity protein displaying antimicrobial and anti-inflammatory properties that is found in high concentrations in saliva. The role of extracellular salivary SLPI in head and neck squamous cell carcinoma (HNSCC) remains unclear. Thus, we aimed to evaluate the association between SLPI and HNSCC risk in the Cancer Prevention Study II Nutrition Cohort. Among 53,180 men and women with no history of cancer who provided an oral rinse between 2001 and 2002, 60 were subsequently diagnosed with incident HNSCC between specimen collection and June 2009. In this nested case-control study, archived oral supernatants were evaluated using the Human SLPI Quantikine ELISA Kit for all 60 cases and 180 controls individually matched on gender, race, date of birth, and date of oral rinse collection. Conditional logistic regression was used to estimate HNSCC risk. Overall, pre-diagnostic salivary SLPI was associated with a non-statistically significant higher risk of HNSCC (OR=1.6, 95% CI=0.9-3.0). Among never smokers, high SLPI was associated with a non-statistically significant lower risk (OR=0.5, 95% CI=0.1-1.9), whereas among ever smokers, high SLPI was associated with a statistically significant higher risk (OR=2.1, 95% CI=1.0-4.3) of HNSCC, compared to low SLPI. While results from this study suggest that higher concentrations of salivary SLPI might increase the risk of HNSCC among ever smokers, more research is needed to verify these findings and define the mechanisms by which SLPI and smoking influence the etiology of HNSCC. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Antibiotic treatment of bacterial vaginosis in pregnancy: a meta-analysis.

    PubMed

    Leitich, Harald; Brunbauer, Mathias; Bodner-Adler, Barbara; Kaider, Alexandra; Egarter, Christian; Husslein, Peter

    2003-03-01

    The purpose of this study was to evaluate the effectiveness of antibiotic treatment of bacterial vaginosis in pregnancy for reducing preterm delivery. We performed a meta-analysis of published, English-language, randomized, placebo-controlled clinical trials of antibiotic treatment of bacterial vaginosis in pregnant women with intact amniotic membranes at <37 weeks of gestation. Primary outcomes included preterm delivery, perinatal or neonatal death, and neonatal morbidity. Ten studies with results for 3969 patients were included. In patients without preterm labor, antibiotic treatment did not significantly decrease preterm delivery at <37 weeks of gestation, either in all patients combined (odds ratio, 0.83; 95% CI, 0.57-1.21) or in high-risk patients with a previous preterm delivery (odds ratio, 0.50; 95% CI, 0.22-1.12). In both groups, significant statistical heterogeneity was observed. A significant reduction in preterm delivery and no statistical heterogeneity were observed in 338 high-risk patients who received oral regimens with treatment durations of ≥7 days (odds ratio, 0.42; 95% CI, 0.27-0.67). Nonsignificant effects and no statistical heterogeneity were observed in low-risk patients (odds ratio, 0.94; 95% CI, 0.71-1.25) and with vaginal regimens (odds ratio, 1.25; 95% CI, 0.86-1.81). In one study, antibiotic treatment in patients with preterm labor led to a nonsignificant decrease in the rate of preterm deliveries (odds ratio, 0.31; 95% CI, 0.03-3.24). Screening pregnant women who have bacterial vaginosis and a previous preterm delivery, and treating them with an oral regimen of longer duration, can be justified on the basis of current evidence. More studies are needed to confirm the effectiveness of this strategy, both in high-risk patients without preterm labor and in patients with preterm labor.

  19. Estimating the mean and standard deviation of environmental data with below detection limit observations: Considering highly skewed data and model misspecification.

    PubMed

    Shoari, Niloofar; Dubé, Jean-Sébastien; Chenouri, Shoja'eddin

    2015-11-01

    In environmental studies, concentration measurements frequently fall below the detection limits of measuring instruments, resulting in left-censored data. Some studies employ parametric methods such as the maximum likelihood estimator (MLE), robust regression on order statistics (rROS), and gamma regression on order statistics (GROS), while others suggest a non-parametric approach, the Kaplan-Meier method (KM). Using examples of real data from a soil characterization study in Montreal, we highlight the need for additional investigations that aim at unifying the existing literature. A number of studies have examined this issue; however, those considering data skewness and model misspecification are rare. These aspects are investigated in this paper through simulations. Among other findings, results show that for low-skewed data, the performance of the different statistical methods is comparable, regardless of the censoring percentage and sample size. For highly skewed data, the performance of the MLE method under lognormal and Weibull distributions is questionable, particularly when the sample size is small or the censoring percentage is high. In such conditions, MLE under the gamma distribution, rROS, GROS, and KM are less sensitive to skewness. Regarding model misspecification, MLE based on lognormal and Weibull distributions provides poor estimates when the true distribution of the data is misspecified. However, the methods of rROS, GROS, and MLE under the gamma distribution are generally robust to model misspecification regardless of skewness, sample size, and censoring percentage. Since the characteristics of environmental data (e.g., type of distribution and skewness) are unknown a priori, we suggest using MLE based on the gamma distribution, rROS, and GROS. Copyright © 2015 Elsevier Ltd. All rights reserved.
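
    Regression on order statistics (the rROS family discussed above) fits a distribution to the detected values' quantile positions and imputes the censored observations from the fit. Below is a much-simplified lognormal sketch using only the Python standard library; the data, detection limit, and plotting positions are invented, and production implementations handle multiple detection limits and censored-rank adjustments more carefully:

```python
import math
from statistics import NormalDist, fmean

# Seven detected values plus three observations censored below a detection
# limit of 1.4 (all numbers invented for illustration)
detected = sorted([1.5, 2.1, 3.0, 4.4, 6.5, 9.8, 15.0])
n_censored = 3
n = len(detected) + n_censored

# Fit log(value) against the normal quantile of each detected rank's
# plotting position; the censored observations occupy the lowest ranks
points = [(NormalDist().inv_cdf((n_censored + i + 0.5) / n), math.log(v))
          for i, v in enumerate(detected)]
mx = fmean(x for x, _ in points)
my = fmean(y for _, y in points)
slope = (sum((x - mx) * (y - my) for x, y in points)
         / sum((x - mx) ** 2 for x, _ in points))
intercept = my - slope * mx

# Impute the censored values at the low-rank plotting positions, then
# estimate the mean from the imputed and detected values combined
imputed = [math.exp(intercept + slope * NormalDist().inv_cdf((i + 0.5) / n))
           for i in range(n_censored)]
est_mean = fmean(imputed + detected)
print([round(v, 2) for v in imputed], round(est_mean, 2))
```

    Unlike crude substitution (e.g., half the detection limit), the imputed values reflect the shape of the detected portion of the distribution, which is why the ROS methods fare well in the comparisons above.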

  20. Prevalence of high altitude pulmonary hypertension among the natives of Spiti Valley--a high altitude region in Himachal Pradesh, India.

    PubMed

    Negi, Prakash Chand; Marwaha, Rajeev; Asotra, Sanjeev; Kandoria, Arvind; Ganju, Neeraj; Sharma, Rajesh; Kumar, Ravi V; Bhardwaj, Rajeev

    2014-12-01

The study aimed to determine the prevalence of high altitude pulmonary hypertension (HAPH) and its predisposing factors among natives of Spiti Valley. A cross-sectional survey was done on the permanent natives of Spiti Valley residing at altitudes of 3000 m to 4200 m. Demographic characteristics, health behavior, anthropometrics, and blood pressure were recorded. Investigations included recording of a 12-lead electrocardiogram (ECG), SaO2 with a pulse oximeter, spirometry, an echocardiography study, and measurement of Hb levels using the cyanmethemoglobin method. HAPH was diagnosed using the criterion of a tricuspid regurgitation (TR) gradient ≥46 mmHg. ECG evidence of right ventricular (RV) overload was documented based on the presence of 2 out of 3 criteria: R>S in V1, right axis deviation, or RV strain (T wave inversion in V1 and V2). Data from 1087 subjects who were free of cardiorespiratory diseases were analyzed to determine the prevalence of HAPH and its predisposing factors. HAPH was recorded in 3.23% (95% C.I. 0.9-8.1%) and ECG evidence of RV overload in 1.5% of the study population. The prevalence of HAPH was not significantly different between men and women (2.63% vs. 3.54%, p<0.2). Age (Z statistic 3.4, p<0.0006), hypoxemia (Z statistic 2.9, p<0.002), and erythrocythemia (Z statistic 4.7, p<0.003) were independently associated with HAPH. Altitude of residence was not significantly associated with HAPH, although there was a trend of increasing prevalence with increasing altitude. It can be concluded that HAPH is prevalent in 3.23% of the natives of Spiti Valley. Increasing age, erythrocythemia, and hypoxemia are independent predisposing factors.

  1. Analysis of the Relationship between the Emotional Intelligence and Professional Burnout Levels of Teachers

    ERIC Educational Resources Information Center

    Adilogullari, Ilhan

    2014-01-01

The purpose of this study is to analyze the relationship between the emotional intelligence and professional burnout levels of teachers. The population of the study consists of high school teachers employed in the city center of Kirsehir Province; 563 volunteer teachers formed the sample. The statistical implementation of the study is performed…

  2. WE-G-18A-04: 3D Dictionary Learning Based Statistical Iterative Reconstruction for Low-Dose Cone Beam CT Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, T; UT Southwestern Medical Center, Dallas, TX; Yan, H

    2014-06-15

Purpose: To develop a 3D dictionary learning based statistical reconstruction algorithm on graphics processing units (GPU), to improve the quality of low-dose cone beam CT (CBCT) imaging with high efficiency. Methods: A 3D dictionary containing 256 small volumes (atoms) of 3x3x3 voxels was trained from a high quality volume image. During reconstruction, we utilized a Cholesky decomposition based orthogonal matching pursuit algorithm to find a sparse representation on this dictionary basis of each patch in the reconstructed image, in order to regularize the image quality. To accelerate the time-consuming sparse coding in the 3D case, we implemented our algorithm in a parallel fashion by taking advantage of the tremendous computational power of GPU. Evaluations were performed on a head-and-neck patient case. FDK reconstruction with the full dataset of 364 projections was used as the reference. We compared the proposed 3D dictionary learning based method with a tight frame (TF) based one using a subset of 121 projections. The image qualities under different resolutions in the z-direction, with or without statistical weighting, were also studied. Results: Compared to TF-based CBCT reconstruction, our experiments indicated that 3D dictionary learning based CBCT reconstruction is able to recover finer structures, to remove more streaking artifacts, and is less susceptible to blocky artifacts. It was also observed that the statistical reconstruction approach is sensitive to inconsistency between the forward and backward projection operations in parallel computing. Using a high spatial resolution along the z-direction helps improve the algorithm's robustness. Conclusion: The 3D dictionary learning based CBCT reconstruction algorithm is able to capture the structural information while suppressing noise, and hence to achieve high quality reconstruction. The GPU realization of the whole algorithm offers a significant efficiency enhancement, making this algorithm more feasible for potential clinical application. A high z-resolution is preferred to stabilize statistical iterative reconstruction. This work was supported in part by NIH (1R01CA154747-01), NSFC (No. 61172163), the Research Fund for the Doctoral Program of Higher Education of China (No. 20110201110011), and the China Scholarship Council.
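The sparse-coding step described in this record can be illustrated with a toy orthogonal matching pursuit in plain NumPy. The 27-dimensional atoms mirror the 3x3x3-voxel patches, but the random dictionary, test signal, and sparsity level are invented for illustration; the authors' Cholesky-based GPU implementation is not reproduced here:

```python
import numpy as np

def omp(D, x, k):
    """Greedily select k dictionary atoms and least-squares fit x on them."""
    residual = x.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # atom most correlated with residual
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    return support, coef

rng = np.random.default_rng(0)
D = rng.normal(size=(27, 256))          # 256 atoms, each a 3x3x3 = 27-voxel patch
D /= np.linalg.norm(D, axis=0)          # unit-norm columns
x = 3.0 * D[:, 5] + 1.0 * D[:, 100]     # a 2-sparse test signal
support, coef = omp(D, x, k=2)
```

Because the test signal is an exact combination of two atoms, OMP recovers the support and reconstructs it to machine precision; in the reconstruction setting this sparse fit acts as the regularizer for each image patch.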

  3. Fundamentals of nuclear medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alazraki, N.P.; Mishkin, F.S.

    1988-01-01

    The book begins with basic science and statistics relevant to nuclear medicine, and specific organ systems are addressed in separate chapters. A section of the text also covers imaging of groups of disease processes (eg, trauma, cancer). The authors present a comparison between nuclear medicine techniques and other diagnostic imaging studies. A table is given which comments on sensitivities and specificities of common nuclear medicine studies. The sensitivities and specificities are categorized as very high, high, moderate, and so forth.

  4. An Exploration of Teacher Attrition and Mobility in High Poverty Racially Segregated Schools

    ERIC Educational Resources Information Center

    Djonko-Moore, Cara M.

    2016-01-01

    The purpose of this study was to examine the mobility (movement to a new school) and attrition (quitting teaching) patterns of teachers in high poverty, racially segregated (HPRS) schools in the US. Using 2007-9 survey data from the National Center for Education Statistics, a multi-level multinomial logistic regression was performed to examine the…

  5. The Relationship of High School Preparation in Mathematics to the Enrollment of College Freshman in Postsecondary Developmental Mathematics Courses

    ERIC Educational Resources Information Center

    Lang, Erick

    2012-01-01

    A student's mathematical preparation is important in readiness for postsecondary study and ultimately success in a global job market. Nationally, a significant number of students are leaving high school unprepared for college-level course work in mathematics. A 2008 National Center for Educational Statistics report on Community Colleges indicates…

  6. United States High School Sophomores: A Twenty-Two Year Comparison, 1980-2002. Statistical Analysis Report. NCES 2006-327

    ERIC Educational Resources Information Center

    Cahalan, Margaret W.; Ingels, Steven J.; Burns, Laura J.; Planty, Michael; Daniel, Bruce

    2006-01-01

    This report presents information on similarities and differences between U.S. high school sophomores as studied at three points in time over the past 22 years, with a focus on cohort demographics, academic programs and performance, extracurricular activities, life values, and educational/occupational aspirations. It provides an update to the…

  7. The Relationship between Knowledge Management and Organizational Learning with the Effectiveness of Ordinary and Smart Secondary School Principals

    ERIC Educational Resources Information Center

    Khammar, Zahra; Heidarzadegan, Alireza; Balaghat, Seyed Reza; Salehi, Hadi

    2013-01-01

This study aimed to investigate the relationship between knowledge management and organizational learning with the effectiveness of ordinary and smart high school principals in Zahedan Pre-province. The statistical population of this research comprises 1350 male and female teachers teaching in ordinary and smart high schools, of whom 300…

  8. High-dose progestins for the treatment of cancer anorexia-cachexia syndrome: a systematic review of randomised clinical trials.

    PubMed

    Maltoni, M; Nanni, O; Scarpi, E; Rossi, D; Serra, P; Amadori, D

    2001-03-01

    The aim of the present study was to summarise evidence from scientific studies on cancer anorexia-cachexia syndrome in order to assess and highlight the efficacy of high-dose progestins (megestrol acetate and medroxyprogesterone acetate) compared with placebo in patients with hormone-independent tumors. A systematic review of published randomised clinical trials was carried out by an extensive electronic and hand search through databases, relevant journals and books, congress, proceedings, reference lists, without any language or year of publication restriction. The research was conducted by two independent operators who collected the data in a form specifically designed for this review. Among the several possible outcomes, appetite and body weight were chosen. Fifteen randomised clinical trials (more than 2000 patients) were retrieved for the review. There was a statistically significant advantage for high-dose progestins as regards improved appetite: pooled odds ratio (OR) = 4.23, 95% confidence interval (CI): 2.53-7.04. Although the effect of high-dose progestins on body weight was less impressive, statistical significance was also reached for this outcome: pooled OR = 2.66, 95% CI: 1.80-3.92. Treatment morbidity was very low, due to the brief period of the treatment in most of the studies. The effects of high-dose progestins on appetite and body weight were clearly demonstrated. However, further studies are undoubtedly warranted to investigate other aspects of progestin activity, especially as regards dosage, duration and timing with best therapeutic index.
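The fixed-effect pooling behind such an odds ratio can be sketched with inverse-variance weighting of per-trial log odds ratios. The 2x2 tables below are invented for illustration and are not the review's data:

```python
import numpy as np

# Hypothetical 2x2 tables from three trials: (events_treat, n_treat, events_ctrl, n_ctrl).
trials = [
    (40, 100, 20, 100),
    (55, 120, 30, 115),
    (25, 80, 12, 85),
]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = np.log((a * d) / (b * c))      # log odds ratio for this trial
    var = 1/a + 1/b + 1/c + 1/d             # Woolf's variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)                 # inverse-variance weight

log_ors, weights = np.array(log_ors), np.array(weights)
pooled_log_or = np.sum(weights * log_ors) / np.sum(weights)
se = 1 / np.sqrt(np.sum(weights))
pooled_or = np.exp(pooled_log_or)
ci = (np.exp(pooled_log_or - 1.96 * se), np.exp(pooled_log_or + 1.96 * se))
```

A pooled OR whose 95% CI excludes 1 is read as a statistically significant treatment effect, which is the form of the appetite and body-weight results quoted above.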

  9. Biosocial correlates and spatial distribution of consanguinity in South America.

    PubMed

    Bronberg, Ruben; Gili, Juan; Gimenez, Lucas; Dipierri, Jose; Lopez Camelo, Jorge

    2016-05-01

To analyze potential biosocial factors in consanguineous unions according to the level of consanguinity and its spatial distribution in South America. The data used came from the Latin American Collaborative Study of Congenital Malformations. Information on 126,213 nonmalformed newborns out of 6,014,749 births was used. This information was collected between 1967 and 2011 at 204 hospitals in 116 cities in 10 South American countries. The spatial scan statistic was performed under a model of nonhierarchical k-means segmentation; based on statistically significant clusters, areas with levels of high, medium, and low consanguinity were determined. Consanguinity in South America is heterogeneously distributed, with two clusters of high consanguinity, in northwestern Venezuela and southeastern Brazil, and two clusters of low consanguinity located in the south of the continent, mainly Argentina. The socio-demographic factors associated with consanguinity influence the population structure in areas of high consanguinity. This study demonstrates that consanguinity in the South American continent is strongly associated with a greater magnitude of poverty in the areas of high consanguinity. Am. J. Hum. Biol. 28:405-411, 2016. © 2015 Wiley Periodicals, Inc.

  10. Relationship of nurses' intrapersonal characteristics with work performance and caring behaviors: A cross-sectional study.

    PubMed

    Geyer, Nelouise-Marié; Coetzee, Siedine K; Ellis, Suria M; Uys, Leana R

    2018-02-28

This study aimed to describe intrapersonal characteristics (professional values, personality, empathy, and job involvement), work performance as perceived by nurses, and caring behaviors as perceived by patients, and to examine the relationships among these variables. A cross-sectional design was employed. A sample of 218 nurses and 116 patients was recruited from four private hospitals and four public hospitals. Data were collected using self-report measures. Data analysis included descriptive statistics, exploratory and confirmatory factor analyses, hierarchical linear modelling, correlations, and structural equation modeling. Nurses perceived their work performance to be of high quality. Among the intrapersonal characteristics, nurses had high scores for professional values, and moderately high scores for personality, empathy and job involvement. Patients perceived nurses' caring behaviors as moderately high. Professional values of nurses were the only selected intrapersonal characteristic with a statistically significant positive relationship, of practical importance, with work performance as perceived by nurses and with caring behaviors as perceived by patients at ward level. Managers can enhance nurses' work performance and caring behaviors through provision of in-service training that focuses on development of professional values. © 2018 John Wiley & Sons Australia, Ltd.

  11. Meta-Analysis of Correlations Between Marginal Bone Resorption and High Insertion Torque of Dental Implants.

    PubMed

    Li, Haoyan; Liang, Yongqiang; Zheng, Qiang

    2015-01-01

To evaluate correlations between marginal bone resorption and high insertion torque value (> 50 Ncm) of dental implants and to assess the significance of immediate and early/conventional loading of implants under a certain range of torque values. Specific inclusion and exclusion criteria were used to retrieve eligible articles from Ovid, PubMed, and EBSCO up to December 2013. Screening of eligible studies, quality assessment, and data extraction were conducted in duplicate. The results were expressed as random/fixed-effects models using weighted mean differences for continuous outcomes with 95% confidence intervals. Initially, 154 articles were selected (11 from Ovid, 112 from PubMed, and 31 from EBSCO). After exclusion of duplicate articles and articles that did not meet the inclusion criteria, six clinical studies were selected. Assessment of P values revealed that correlations between marginal bone resorption and high insertion torque were not statistically significant and that there was no difference between immediately versus early/conventionally loaded implants under a certain range of torque. None of the meta-analyses revealed any statistically significant differences between high insertion torque and conventional insertion torque in terms of effects on marginal bone resorption.

  12. Digital Breast Tomosynthesis guided Near Infrared Spectroscopy: Volumetric estimates of fibroglandular fraction and breast density from tomosynthesis reconstructions

    PubMed Central

    Vedantham, Srinivasan; Shi, Linxi; Michaelsen, Kelly E.; Krishnaswamy, Venkataramanan; Pogue, Brian W.; Poplack, Steven P.; Karellas, Andrew; Paulsen, Keith D.

    2016-01-01

A multimodality system combining a clinical prototype digital breast tomosynthesis with its imaging geometry modified to facilitate near-infrared spectroscopic imaging has been developed. The accuracy of parameters recovered from near-infrared spectroscopy is dependent on fibroglandular tissue content. Hence, in this study, volumetric estimates of fibroglandular tissue from tomosynthesis reconstructions were determined. A kernel-based fuzzy c-means algorithm was implemented to segment tomosynthesis reconstructed slices in order to estimate fibroglandular content and to provide anatomic priors for near-infrared spectroscopy. This algorithm was used to determine volumetric breast density (VBD), defined as the ratio of fibroglandular tissue volume to the total breast volume, expressed as a percentage, from 62 tomosynthesis reconstructions of 34 study participants. For a subset of study participants who subsequently underwent mammography, VBD from mammography matched for subject, breast laterality and mammographic view was quantified using commercial software and statistically analyzed to determine if it differed from tomosynthesis. Summary statistics of the VBD from all study participants were compared with prior independent studies. The fibroglandular volumes from tomosynthesis and mammography were not statistically different (p=0.211, paired t-test). After accounting for the compressed breast thicknesses, which differed between tomosynthesis and mammography, the VBD from tomosynthesis was correlated with (r=0.809, p<0.001), did not statistically differ from (p>0.99, paired t-test), and was linearly related to, the VBD from mammography. Summary statistics of the VBD from tomosynthesis were not statistically different from prior studies using high-resolution dedicated breast computed tomography.
The observation of correlation and linear association in VBD between mammography and tomosynthesis suggests that breast density associated risk measures determined for mammography are translatable to tomosynthesis. Accounting for compressed breast thickness is important when it differs between the two modalities. The fibroglandular volume from tomosynthesis reconstructions is similar to mammography indicating suitability for use during near-infrared spectroscopy. PMID:26941961
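The paired comparison used in this record, a paired t-test plus correlation between modality-matched density estimates, can be sketched on toy data; the subject count, means, and noise levels below are invented, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical paired volumetric breast density (%) for 30 subjects measured by
# two modalities: identical on average, differing only by measurement noise.
tomo = rng.normal(15.0, 5.0, 30)
mammo = tomo + rng.normal(0.0, 2.0, 30)

t_stat, p_value = stats.ttest_rel(tomo, mammo)   # paired t-test on matched estimates
r = np.corrcoef(tomo, mammo)[0, 1]               # between-modality correlation
```

A high correlation together with a non-significant paired difference is the pattern the study reports between tomosynthesis and mammography.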

  13. Multifactorial modelling of high-temperature treatment of timber in the saturated water steam medium

    NASA Astrophysics Data System (ADS)

    Prosvirnikov, D. B.; Safin, R. G.; Ziatdinova, D. F.; Timerbaev, N. F.; Lashkov, V. A.

    2016-04-01

The paper analyses experimental data obtained in studies of high-temperature treatment of softwood and hardwood in a saturated water steam environment. Data were processed in the Curve Expert software for the purpose of statistical modelling of the processes and phenomena occurring during treatment. The multifactorial modelling resulted in empirical dependences that allow the main parameters of this type of hydrothermal treatment to be determined with high accuracy.
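Fitting an empirical dependence of the kind produced by such modelling can be sketched with scipy's `curve_fit`; the exponential model and all numbers below are invented stand-ins, since the paper's actual dependences are not given in the abstract:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical: a measured property vs. treatment time, fit to an
# empirical exponential-decay model y = a * exp(-b * t) + c.
def model(t, a, b, c):
    return a * np.exp(-b * t) + c

rng = np.random.default_rng(7)
t = np.linspace(0, 10, 30)
a_true, b_true, c_true = 5.0, 0.6, 1.0
y = model(t, a_true, b_true, c_true) + rng.normal(0, 0.05, t.size)

params, cov = curve_fit(model, t, y, p0=[4.0, 0.5, 0.5])  # fitted (a, b, c)
```

The diagonal of `cov` gives the variance of each fitted coefficient, which is how the accuracy of such empirical dependences is usually quantified.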

  14. Statistical context shapes stimulus-specific adaptation in human auditory cortex

    PubMed Central

    Henry, Molly J.; Fromboluti, Elisa Kim; McAuley, J. Devin

    2015-01-01

    Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. PMID:25652920

  15. Searching for hidden unexpected features in the SnIa data

    NASA Astrophysics Data System (ADS)

    Shafieloo, A.; Perivolaropoulos, L.

    2010-06-01

It is known that the χ2 statistic and likelihood analysis may not be sensitive to all the features of the data. Despite the fact that by using the χ2 statistic we can measure the overall goodness of fit of a model confronted with a data set, some specific features of the data can remain undetectable. For instance, it has been pointed out that there is an unexpected brightness of the SnIa data at z > 1 in the Union compilation. We quantify this statement by constructing a new statistic, called the Binned Normalized Difference (BND) statistic, which is applicable directly to the Type Ia Supernova (SnIa) distance moduli. This statistic is designed to pick up systematic brightness trends of SnIa data points with respect to a best fit cosmological model at high redshifts. According to this statistic, there is 2.2%, 5.3% and 12.6% consistency between the Gold06, Union08 and Constitution09 data, respectively, and the spatially flat ΛCDM model when the real data are compared with many realizations of simulated Monte Carlo datasets. The corresponding realization probability in the context of a (w0,w1) = (-1.4,2) model is more than 30% for all the mentioned datasets, indicating a much better consistency for this model with respect to the BND statistic. The unexpected high-z brightness of SnIa can be interpreted either as a trend towards more deceleration at high z than expected in the context of ΛCDM, or as a statistical fluctuation, or finally as a systematic effect, perhaps due to a mild SnIa evolution at high z.
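A generic binned residual check in the spirit of the BND statistic (not the paper's exact construction) can be sketched as follows: average the residuals about the best-fit model in redshift bins and normalize each bin mean by its standard error, so a systematic high-z brightening shows up as a large negative bin value. All numbers below are simulated:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy residuals of SnIa distance moduli about a best-fit model, with an
# artificial systematic brightening (negative shift) injected at z > 1.
z = rng.uniform(0.01, 1.5, 300)
sigma = 0.15
resid = rng.normal(0.0, sigma, z.size)
resid[z > 1.0] -= 0.1

# Binned normalized mean residual: mean per redshift bin, in units of its
# standard error, so values beyond ~±2 flag a systematic trend.
edges = np.linspace(0.0, 1.5, 6)
idx = np.digitize(z, edges) - 1
bnd = []
for b in range(len(edges) - 1):
    r = resid[idx == b]
    bnd.append(r.mean() / (sigma / np.sqrt(r.size)))
```

An overall χ2 over the same points barely notices the shift, while the highest bin stands out strongly, which is the kind of hidden feature the paper's statistic targets.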

  16. Genome-wide association analysis of secondary imaging phenotypes from the Alzheimer's disease neuroimaging initiative study.

    PubMed

    Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu

    2017-02-01

The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes for most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore this sampling scheme by directly correlating imaging phenotypes (called the secondary traits) with genotype. Although it has been well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the Alzheimer's Disease Neuroimaging Initiative (ADNI) data to evaluate the effects of the case-control sampling scheme on GWAS results based on some standard statistical methods, such as linear regression methods, while comparing it with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. A Statistical Study of Serum Cholesterol Level by Gender and Race.

    PubMed

    Tharu, Bhikhari Prasad; Tsokos, Chris P

    2017-07-25

Cholesterol level (CL) is a growing health concern, since it is considered one of the causes of heart disease. A study of cholesterol level can provide insight into its nature and characteristics. A cross-sectional study. The National Health and Nutrition Examination Survey (NHANES) II was conducted on a probability sample of approximately 28,000 persons in the USA, with cholesterol level obtained from laboratory results. Samples were selected to include certain population groups thought to be at high risk of malnutrition. The study included 11,864 persons for CL cases, with 9,602 males and 2,262 females, across the races white, black, and other. Non-parametric statistical tests and goodness-of-fit tests were used to identify probability distributions. The study concludes that cholesterol level exhibits significant racial and gender differences in terms of probability distributions. White people are at relatively higher risk than black people of having risk-line and high-risk cholesterol. The study clearly indicates that black males normally have higher cholesterol. Females have lower variation in cholesterol than males. There exist gender and racial discrepancies in cholesterol, which have been identified as lognormal and gamma probability distributions. White individuals seem to be at a higher risk of having a high-risk cholesterol level than blacks. Females tend to have higher variation in cholesterol level than males.

  18. A diagnostic model for chronic hypersensitivity pneumonitis

    PubMed Central

    Johannson, Kerri A; Elicker, Brett M; Vittinghoff, Eric; Assayag, Deborah; de Boer, Kaïssa; Golden, Jeffrey A; Jones, Kirk D; King, Talmadge E; Koth, Laura L; Lee, Joyce S; Ley, Brett; Wolters, Paul J; Collard, Harold R

    2017-01-01

    The objective of this study was to develop a diagnostic model that allows for a highly specific diagnosis of chronic hypersensitivity pneumonitis using clinical and radiological variables alone. Chronic hypersensitivity pneumonitis and other interstitial lung disease cases were retrospectively identified from a longitudinal database. High-resolution CT scans were blindly scored for radiographic features (eg, ground-glass opacity, mosaic perfusion) as well as the radiologist’s diagnostic impression. Candidate models were developed then evaluated using clinical and radiographic variables and assessed by the cross-validated C-statistic. Forty-four chronic hypersensitivity pneumonitis and eighty other interstitial lung disease cases were identified. Two models were selected based on their statistical performance, clinical applicability and face validity. Key model variables included age, down feather and/or bird exposure, radiographic presence of ground-glass opacity and mosaic perfusion and moderate or high confidence in the radiographic impression of chronic hypersensitivity pneumonitis. Models were internally validated with good performance, and cut-off values were established that resulted in high specificity for a diagnosis of chronic hypersensitivity pneumonitis. PMID:27245779

  19. The effectiveness and cost-effectiveness of intraoperative imaging in high-grade glioma resection; a comparative review of intraoperative ALA, fluorescein, ultrasound and MRI.

    PubMed

    Eljamel, M Sam; Mahboob, Syed Osama

    2016-12-01

Surgical resection of high-grade gliomas (HGG) is standard therapy because it imparts significant progression-free survival (PFS) and overall survival (OS) benefits. However, HGG-tumor margins are indistinguishable from normal brain during surgery. Hence, intraoperative technologies such as fluorescence (ALA, fluorescein), intraoperative ultrasound (IoUS), and intraoperative MRI (IoMRI) have been deployed. This study compares the effectiveness and cost-effectiveness of these technologies. Critical literature review and meta-analyses, using the MEDLINE/PubMed service. The list of references in each article was double-checked for any missing references. We included all studies that reported the use of ALA, fluorescein (FLCN), IoUS or IoMRI to guide HGG-surgery. The meta-analyses were conducted according to statistical heterogeneity between studies. If there was no heterogeneity, a fixed-effects model was used; otherwise, a random-effects model was used. Statistical heterogeneity was explored by the χ² and inconsistency (I²) statistics. To assess cost-effectiveness, we calculated the incremental cost per quality-adjusted life-year (QALY). Gross total resection (GTR) after ALA, FLCN, IoUS and IoMRI was 69.1%, 84.4%, 73.4% and 70% respectively. The differences were not statistically significant. All four techniques led to significant prolongation of PFS and tended to prolong OS. However, none of these technologies led to significant prolongation of OS compared to controls. The cost/QALY was $16,218, $3181, $6049 and $32,954 for ALA, FLCN, IoUS and IoMRI respectively. ALA, FLCN, IoUS and IoMRI significantly improve GTR and PFS of HGG. Their incremental cost was below the threshold for cost-effectiveness of HGG-therapy, denoting that each intraoperative technology was cost-effective on its own. Copyright © 2016 Elsevier B.V. All rights reserved.
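The heterogeneity statistics named in this record have simple closed forms: Cochran's Q is the weighted sum of squared deviations from the fixed-effects pooled estimate, and I² = max(0, (Q - df)/Q). A sketch with invented study effects (not the review's data):

```python
import numpy as np

# Hypothetical per-study effect estimates (log odds ratios) and standard errors.
effects = np.array([0.1, 0.9, 0.2, 1.1, 0.3])
se = np.array([0.20, 0.25, 0.15, 0.30, 0.18])

w = 1 / se**2                                   # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)        # fixed-effects pooled estimate
q = np.sum(w * (effects - pooled) ** 2)         # Cochran's Q (chi-squared, df = k-1)
df = len(effects) - 1
i2 = max(0.0, (q - df) / q) * 100               # I^2 as a percentage
```

Here Q well exceeds its degrees of freedom and I² is around 70%, the situation in which a random-effects model would be chosen under the rule described above.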

  20. Effect of Disease Improvement with Self-Measurement Compliance (Measurement Frequency Level) in SmartCare Hypertension Management Service.

    PubMed

    Lee, Chang-Hee; Chang, Byeong-Yun

    2016-03-01

    This study's purpose was to analyze the effect of the SmartCare pilot project, which was conducted in 2011 in South Korea. Recent studies of telehealth mostly compare the intervention group and the control group. Therefore, it is necessary to analyze the disease improvement effect depending on the self-measurement compliance (measurement frequency level) of patients who are receiving the hypertension management services. In the SmartCare center, health managers (nurses, nutritionists, and exercise prescribers) monitored the measurement data transmitted by participants through the SmartCare system. The health managers provided the prevention, consultation, and education services remotely to patients. Of the 231 participants who were enrolled in the study, the final analysis involved 213 individuals who completed their blood pressure measurements and SmartCare services until the end of a 6-month service period. The evaluated measurement group was classified into three groups (Low, Middle, and High) by evenly dividing the monthly average frequency of measurement for 6 months. The evaluation indices were systolic blood pressure (SBP), diastolic blood pressure (DBP), weight, and body mass index (BMI); this information was transmitted through the SmartCare system. For changes in the evaluation indices after 6 months compared with the initial baseline, in the Low Group, SBP and DBP slightly decreased, and weight and BMI slightly increased (difference not statistically significant). In the Middle Group, SBP and DBP decreased slightly (difference not statistically significant); however, both weight and BMI decreased (difference statistically significant). In the High Group, SBP, DBP, weight, and BMI decreased (difference statistically significant). 
Patients who received the SmartCare services with higher measurement frequency levels at home showed greater effectiveness of the provided services, in terms of blood pressure, weight, and BMI control, compared with patients with lower measurement frequency levels.

  1. The large sample size fallacy.

    PubMed

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
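The fallacy is easy to demonstrate by simulation: with a large enough sample, a trivial 0.05-SD group difference yields an extreme p-value while the observed effect size stays far below even Cohen's "small" threshold of 0.2. The numbers below are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Two groups differing by a trivial 0.05 SD, but with a huge n per group.
n = 200_000
a = rng.normal(0.00, 1.0, n)
b = rng.normal(0.05, 1.0, n)

t_stat, p_value = stats.ttest_ind(a, b)          # extremely small p-value
pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
cohens_d = (b.mean() - a.mean()) / pooled_sd     # observed effect size, ~0.05
```

Reporting `cohens_d` alongside `p_value` makes the triviality of the effect visible, which is exactly the practice the article recommends.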

  2. Use and effect of clinical photographic records on the motivation and oral hygiene practices of a group of mentally handicapped adults.

    PubMed

    Bickley, S R; Shaw, L; Shaw, M J

    1990-01-01

    A study was undertaken to test the hypothesis that clinical photographic records can be used to motivate oral hygiene performance in mentally handicapped adults. Plaque reduction over the period of the study was higher in the test group than in the control group, but the differences between the two groups were not statistically significant. The small number of subjects (29) and the difficulties in matching subjects may have militated against demonstrating a statistically significant difference between the two groups. All participants demonstrated high levels of toothbrushing ability during the practical aspects of the study, but this was not maintained through daily oral hygiene practices in the majority of subjects.

  3. High precision and high yield fabrication of dense nanoparticle arrays onto DNA origami at statistically independent binding sites

    NASA Astrophysics Data System (ADS)

    Takabayashi, Sadao; Klein, William P.; Onodera, Craig; Rapp, Blake; Flores-Estrada, Juan; Lindau, Elias; Snowball, Lejmarc; Sam, Joseph T.; Padilla, Jennifer E.; Lee, Jeunghoon; Knowlton, William B.; Graugnard, Elton; Yurke, Bernard; Kuang, Wan; Hughes, William L.

    2014-10-01

    High precision, high yield, and high density self-assembly of nanoparticles into arrays is essential for nanophotonics. Spatial deviations as small as a few nanometers can alter the properties of near-field coupled optical nanostructures. Several studies have reported assemblies of few-nanoparticle structures with controlled spacing using DNA nanostructures, with variable yield. Here, we report multi-tether design strategies and attachment yields for homo- and hetero-nanoparticle arrays templated by DNA origami nanotubes. Nanoparticle attachment yield via DNA hybridization is comparable with streptavidin-biotin binding. Independent of the number of binding sites, >97% site-occupation was achieved with four tethers and 99.2% site-occupation is theoretically possible with five tethers. The interparticle distance was within 2 nm of all design specifications and the nanoparticle spatial deviations decreased with interparticle spacing. Modified geometric, binomial, and trinomial distributions indicate that site-bridging, steric hindrance, and electrostatic repulsion were not dominant barriers to self-assembly and both tethers and binding sites were statistically independent at high particle densities. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr03069a
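
As a hedged illustration of why attachment yield saturates with tether count: if each of k tethers is assumed to bind independently with some per-tether success probability p, site occupation follows 1 - (1 - p)^k. The value p = 0.62 below is a made-up illustrative number, not a figure from the paper, chosen only to show the shape of the saturation curve.

```python
# Hedged sketch: if each of k statistically independent tethers captures a
# nanoparticle with probability p, a binding site is occupied whenever at
# least one tether succeeds. p = 0.62 is illustrative, not from the paper.
def site_occupation(p: float, k: int) -> float:
    """Probability that at least one of k independent tethers binds."""
    return 1.0 - (1.0 - p) ** k

p = 0.62
for k in range(1, 6):
    print(f"{k} tether(s): {site_occupation(p, k):.1%}")
```

Under this simple independence model, occupation rises quickly with tether count and is already near saturation by four to five tethers, consistent in spirit with the yields reported above.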

  4. A New Scoring System to Predict the Risk for High-risk Adenoma and Comparison of Existing Risk Calculators.

    PubMed

    Murchie, Brent; Tandon, Kanwarpreet; Hakim, Seifeldin; Shah, Kinchit; O'Rourke, Colin; Castro, Fernando J

    2017-04-01

    Colorectal cancer (CRC) screening guidelines likely over-generalize CRC risk; 35% of Americans are not up to date with screening, and the incidence of CRC in younger patients is growing. We developed a practical prediction model for high-risk colon adenomas in an average-risk population, including an expanded definition of high-risk polyps (≥3 nonadvanced adenomas), to identify patients at higher-than-average risk. We also compared results with previously created calculators. Patients aged 40 to 59 years undergoing first-time average-risk screening or diagnostic colonoscopies were evaluated. Risk calculators for advanced adenomas and high-risk adenomas were created based on age, body mass index, sex, race, and smoking history. Previously established calculators with similar risk factors were selected for comparison of the concordance statistic (c-statistic) and external validation. A total of 5063 patients were included. Advanced adenomas and high-risk adenomas were seen in 5.7% and 7.4% of the patient population, respectively. The c-statistic for our calculator was 0.639 for the prediction of advanced adenomas and 0.650 for high-risk adenomas. When applied to our population, all previous models had lower c-statistic results, although one performed similarly. Our model compares favorably with previously established prediction models. Age and body mass index were used as continuous variables, likely improving the c-statistic. The model also reports absolute predictive probabilities of advanced and high-risk polyps, allowing for more individualized risk assessment of CRC.
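
The concordance statistic used above has a simple pairwise interpretation, sketched below with made-up risk scores (not data from this study): it is the fraction of (case, control) pairs in which the case receives the higher predicted risk, with ties counted as half.

```python
# Minimal sketch of the c-statistic (equivalently, the area under the ROC
# curve): the fraction of (case, control) pairs where the model ranks the
# case above the control. Scores are synthetic, for illustration only.
def c_statistic(case_scores, control_scores):
    concordant = 0.0
    for c in case_scores:
        for nc in control_scores:
            if c > nc:
                concordant += 1.0
            elif c == nc:
                concordant += 0.5   # ties count half
    return concordant / (len(case_scores) * len(control_scores))

cases = [0.72, 0.55, 0.61, 0.80]     # predicted risk, high-risk adenoma found
controls = [0.30, 0.58, 0.45, 0.50]  # predicted risk, no high-risk adenoma
print(c_statistic(cases, controls))  # → 0.9375 (1.0 = perfect, 0.5 = chance)
```

A c-statistic around 0.64-0.65, as reported for the calculator above, means the model ranks a randomly chosen case above a randomly chosen control about two times in three.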

  5. Sedimentological analysis and bed thickness statistics from a Carboniferous deep-water channel-levee complex: Myall Trough, SE Australia

    NASA Astrophysics Data System (ADS)

    Palozzi, Jason; Pantopoulos, George; Maravelis, Angelos G.; Nordsvan, Adam; Zelilidis, Avraam

    2018-02-01

    This investigation presents an outcrop-based integrated study of internal division analysis and statistical treatment of turbidite bed thickness, applied to a Carboniferous deep-water channel-levee complex in the Myall Trough, southeast Australia. Turbidite beds of the studied succession are characterized by a range of sedimentary structures grouped into two main associations, a thick-bedded and a thin-bedded one, reflecting channel-fill and overbank/levee deposits, respectively. Three vertically stacked channel-levee cycles have been identified. Results of the statistical analysis of bed thickness, grain size, and internal division patterns applied to the studied channel-levee succession indicate that the turbidite bed thickness data are well characterized by a bimodal lognormal distribution, possibly reflecting the difference between deposition from lower-density flows (in a levee/overbank setting) and very high-density flows (in a channel-fill setting). Power-law and exponential distributions were observed to hold only for the thick-bedded parts of the succession and cannot characterize the whole bed thickness range of the studied sediments. The succession also exhibits non-random clustering of bed thickness and grain-size measurements. The studied sediments are further characterized by the presence of statistically detected fining-upward sandstone packets, and a novel quantitative approach (change-point analysis) is proposed for the detection of those packets. Markov permutation statistics also revealed the existence of order in the alternation of internal divisions in the succession, expressed by an optimal internal division cycle reflecting two main types of gravity-flow events deposited within both the thick-bedded conglomeratic and thin-bedded sandstone associations. 
The analytical methods presented in this study can be used as additional tools for quantitative analysis and recognition of depositional environments in hydrocarbon-bearing research of ancient deep-water channel-levee settings.
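
A minimal sketch of a change-point scan in the spirit of the approach mentioned above (the authors' exact formulation may differ): choose the split point that minimizes the within-segment sum of squared deviations, which locates an abrupt shift in mean bed thickness. The bed-thickness values below are synthetic, for illustration only.

```python
# Hedged sketch of single change-point detection on a bed-thickness log:
# try every split, keep the one with the lowest combined within-segment
# sum of squared deviations (i.e., the sharpest shift in segment means).
def change_point(series):
    def sse(xs):
        if not xs:
            return 0.0
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs)

    best_k, best_cost = None, float("inf")
    for k in range(1, len(series)):
        cost = sse(series[:k]) + sse(series[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Synthetic bed thicknesses (cm): thick channel-fill beds giving way to
# thin levee beds at index 5
beds = [80, 75, 90, 85, 78, 20, 25, 18, 22, 24]
print(change_point(beds))  # → 5
```

Real packet detection would scan for multiple change points and test their significance, but the single-split objective above is the core computation.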

  6. Image statistics for surface reflectance perception.

    PubMed

    Sharan, Lavanya; Li, Yuanzhen; Motoyoshi, Isamu; Nishida, Shin'ya; Adelson, Edward H

    2008-04-01

    Human observers can distinguish the albedo of real-world surfaces even when the surfaces are viewed in isolation, contrary to the Gelb effect. We sought to measure this ability and to understand the cues that might underlie it. We took photographs of complex surfaces such as stucco and asked observers to judge their diffuse reflectance by comparing them to a physical Munsell scale. Their judgments, while imperfect, were highly correlated with the true reflectance. The judgments were also highly correlated with certain image statistics, such as moment and percentile statistics of the luminance and subband histograms. When we digitally manipulated these statistics in an image, human judgments were correspondingly altered. Moreover, linear combinations of such statistics allow a machine vision system (operating within the constrained world of single surfaces) to estimate albedo with an accuracy similar to that of human observers. Taken together, these results indicate that some simple image statistics have a strong influence on the judgment of surface reflectance.
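
The moment and percentile statistics referred to above can be computed directly from an image's luminance histogram. The sketch below uses synthetic pixel values (not data from this study); a bright specular tail produces the positive skewness that this line of work relates to perceived surface qualities.

```python
# Sketch of simple luminance-histogram statistics of the kind discussed
# above: mean, standard deviation, skewness, and percentiles. The pixel
# values are synthetic, for illustration only.
import math

def luminance_stats(pixels):
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    sd = math.sqrt(var)
    skew = sum((p - mean) ** 3 for p in pixels) / (n * sd ** 3)
    srt = sorted(pixels)
    p10 = srt[int(0.10 * (n - 1))]   # simple nearest-rank percentiles
    p90 = srt[int(0.90 * (n - 1))]
    return {"mean": mean, "sd": sd, "skew": skew, "p10": p10, "p90": p90}

# Mostly mid-gray surface with one bright specular highlight (240)
pixels = [12, 40, 45, 50, 52, 55, 60, 62, 70, 240]
stats = luminance_stats(pixels)
print(stats)
```

Manipulating such statistics (e.g., stretching the bright tail of the histogram) changes the computed skewness, which parallels the digital manipulations the authors report altering human reflectance judgments.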

  7. Turbomachine Sealing and Secondary Flows - Part 3. Part 3; Review of Power-Stream Support, Unsteady Flow Systems, Seal and Disk Cavity Flows, Engine Externals, and Life and Reliability Issues

    NASA Technical Reports Server (NTRS)

    Hendricks, R. C.; Steinetz, B. M.; Zaretsky, E. V.; Athavale, M. M.; Przekwas, A. J.

    2004-01-01

    The issues and components supporting the engine power stream are reviewed. It is essential that companies pay close attention to engine sealing issues, particularly on the high-pressure spool or high-pressure pumps. Small changes in these systems are reflected throughout the entire engine. Although cavity, platform, and tip sealing are complex and have a significant effect on component and engine performance, computational tools (e.g., NASA-developed INDSEAL, SCISEAL, and ADPAC) are available to help guide the designer and the experimenter. Gas turbine engine and rocket engine externals must all function efficiently with a high degree of reliability in order for the engine to run but often receive little attention until they malfunction. Within the open literature statistically significant data for critical engine components are virtually nonexistent; the classic approach is deterministic. Studies show that variations with loading can have a significant effect on component performance and life. Without validation data they are just studies. These variations and deficits in statistical databases require immediate attention.

  8. Prompt Injections of Highly Relativistic Electrons Induced by Interplanetary Shocks: A Statistical Study of Van Allen Probes Observations

    NASA Technical Reports Server (NTRS)

    Schiller, Q.; Kanekal, S. G.; Jian, L. K.; Li, X.; Jones, A.; Baker, D. N.; Jaynes, A.; Spence, H. E.

    2016-01-01

    We conduct a statistical study of the sudden response of outer radiation belt electrons to interplanetary (IP) shocks during the Van Allen Probes era, i.e., 2012 to 2015. Data from the Relativistic Electron-Proton Telescope instrument on board the Van Allen Probes are used to investigate the highly relativistic electron response (E greater than 1.8 MeV) within the first few minutes after shock impact. We investigate the relationship of IP shock parameters, such as Mach number, with the highly relativistic electron response, including spectral properties and the radial location of the shock-induced injection. We find that the driving solar wind structure of the shock does not affect the occurrence of enhancement events; 25% of IP shocks are associated with prompt energization, and 14% are associated with MeV electron depletion. Parameters that represent IP shock strength are found to correlate best with the highest levels of energization, suggesting that shock strength may play a key role in the severity of the enhancements. However, not every shock results in an enhancement, indicating that magnetospheric preconditioning may be required.

  9. Multifactor-Dimensionality Reduction Reveals High-Order Interactions among Estrogen-Metabolism Genes in Sporadic Breast Cancer

    PubMed Central

    Ritchie, Marylyn D.; Hahn, Lance W.; Roodi, Nady; Bailey, L. Renee; Dupont, William D.; Parl, Fritz F.; Moore, Jason H.

    2001-01-01

    One of the greatest challenges facing human geneticists is the identification and characterization of susceptibility genes for common complex multifactorial human diseases. This challenge is partly due to the limitations of parametric-statistical methods for detection of gene effects that are dependent solely or partially on interactions with other genes and with environmental exposures. We introduce multifactor-dimensionality reduction (MDR) as a method for reducing the dimensionality of multilocus information, to improve the identification of polymorphism combinations associated with disease risk. The MDR method is nonparametric (i.e., no hypothesis about the value of a statistical parameter is made), is model-free (i.e., it assumes no particular inheritance model), and is directly applicable to case-control and discordant-sib-pair studies. Using simulated case-control data, we demonstrate that MDR has reasonable power to identify interactions among two or more loci in relatively small samples. When it was applied to a sporadic breast cancer case-control data set, in the absence of any statistically significant independent main effects, MDR identified a statistically significant high-order interaction among four polymorphisms from three different estrogen-metabolism genes. To our knowledge, this is the first report of a four-locus interaction associated with a common complex multifactorial disease. PMID:11404819
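
The core MDR pooling step can be sketched compactly. This is a simplified illustration with synthetic genotypes, not the authors' full cross-validated procedure: each multilocus genotype cell is labeled high- or low-risk according to whether its case:control ratio exceeds a threshold, collapsing a multidimensional genotype space into a single binary attribute.

```python
# Simplified sketch of multifactor-dimensionality reduction (MDR) pooling,
# assuming balanced case-control data and a risk threshold of 1. Genotype
# records are synthetic; the real method adds cross-validation and
# permutation testing on top of this step.
from collections import Counter

# (genotype_locus1, genotype_locus2, is_case) -- synthetic records
records = [
    ("AA", "BB", 1), ("AA", "BB", 1), ("AA", "BB", 0),
    ("Aa", "Bb", 0), ("Aa", "Bb", 0), ("Aa", "Bb", 1),
    ("aa", "bb", 1), ("aa", "bb", 1), ("aa", "bb", 0),
]

cases = Counter((g1, g2) for g1, g2, y in records if y == 1)
controls = Counter((g1, g2) for g1, g2, y in records if y == 0)

# Pool cells: "high-risk" where cases outnumber controls, else "low-risk"
high_risk = {cell for cell in set(cases) | set(controls)
             if cases[cell] > controls[cell]}
print(sorted(high_risk))
```

Because the pooled high-risk/low-risk label is a single attribute, its association with disease can then be evaluated without estimating a parameter per genotype combination, which is what lets MDR detect interactions in the absence of main effects.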

  10. Statistical Optimization of 1,3-Propanediol (1,3-PD) Production from Crude Glycerol by Considering Four Objectives: 1,3-PD Concentration, Yield, Selectivity, and Productivity.

    PubMed

    Supaporn, Pansuwan; Yeom, Sung Ho

    2018-04-30

    This study investigated the biological conversion of crude glycerol, generated as a by-product by a commercial biodiesel production plant, to 1,3-propanediol (1,3-PD). Statistical analysis was employed to derive a model for the individual and interactive effects of glycerol, (NH4)2SO4, trace elements, pH, and cultivation time on four objectives: 1,3-PD concentration, yield, selectivity, and productivity. Optimum conditions for each objective with its maximum value were predicted by statistical optimization, and experiments under the optimum conditions verified the predictions. In addition, by systematic analysis of the values of the four objectives, the optimum conditions for 1,3-PD concentration (49.8 g/L initial glycerol, 4.0 g/L (NH4)2SO4, 2.0 mL/L trace elements, pH 7.5, and 11.2 h cultivation time) were determined to be the global optimum culture conditions for 1,3-PD production. Under these conditions, we achieved high 1,3-PD yield (47.4%), selectivity (88.8%), and productivity (2.1 g/L/h) as well as high 1,3-PD concentration (23.6 g/L).

  11. Mesh Dependence on Shear Driven Boundary Layers in Stable Stratification Generated by Large Eddy-Simulation

    NASA Astrophysics Data System (ADS)

    Berg, Jacob; Patton, Edward G.; Sullivan, Peter S.

    2017-11-01

    The effect of mesh resolution and size on shear-driven atmospheric boundary layers in a stably stratified environment is investigated with the NCAR pseudo-spectral LES model (J. Atmos. Sci. v68, p2395, 2011 and J. Atmos. Sci. v73, p1815, 2016). The model applies FFTs in the two horizontal directions and finite differencing in the vertical direction. With vanishing heat flux at the surface and a capping inversion entraining potential temperature into the boundary layer, the situation is often called the conditionally neutral atmospheric boundary layer (ABL). Because of its relevance for high-wind applications such as wind power meteorology, we emphasize second-order statistics important for wind turbines, including spectral information. The simulations range from mesh sizes of 64³ to 1024³ grid points. Due to the non-stationarity of the problem, different simulations are compared at equal eddy-turnover times. Whereas grid convergence is mostly achieved in the middle portion of the ABL, close to the surface, where the presence of the ground limits the growth of the energy-containing eddies, second-order statistics are not converged on the studied meshes. Higher-order structure functions also reveal non-Gaussian statistics that are highly dependent on resolution.

  12. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
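
A Shewhart-style individuals control chart, the basic SPC tool described above, can be sketched in a few lines. The retention-time values below are synthetic, not SProCoP output, and the limits follow the common mean ± 3 standard deviations convention computed from a user-defined baseline.

```python
# Sketch of an individuals control chart as used in SPC: compute control
# limits from baseline QC runs, then flag new runs outside mean ± 3 SD.
# Retention times (minutes) are synthetic, for illustration only.
import math

baseline = [10.02, 9.98, 10.05, 10.01, 9.97, 10.03]  # QC standard runs
mean = sum(baseline) / len(baseline)
sd = math.sqrt(sum((x - mean) ** 2 for x in baseline) / (len(baseline) - 1))
ucl, lcl = mean + 3 * sd, mean - 3 * sd               # control limits

new_runs = [10.04, 9.99, 10.45]                       # last run drifted
flagged = [x for x in new_runs if not (lcl <= x <= ucl)]
print(flagged)  # → [10.45]
```

Because the limits are derived empirically from the baseline runs, they are experiment- and instrument-specific, which matches the thresholding approach the abstract describes; random noise stays inside the limits while systematic drift falls outside them.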

  13. Statistical issues in searches for new phenomena in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Lyons, Louis; Wardle, Nicholas

    2018-03-01

    Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.

  14. Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skalski, John

    2003-11-01

    The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. 
Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure that tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability. 
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.

  15. High Variability in Cellular Stoichiometry of Carbon, Nitrogen, and Phosphorus Within Classes of Marine Eukaryotic Phytoplankton Under Sufficient Nutrient Conditions.

    PubMed

    Garcia, Nathan S; Sexton, Julie; Riggins, Tracey; Brown, Jeff; Lomas, Michael W; Martiny, Adam C

    2018-01-01

    Current hypotheses suggest that the cellular elemental stoichiometry of marine eukaryotic phytoplankton, such as the ratios of cellular carbon:nitrogen:phosphorus (C:N:P), varies between phylogenetic groups. To investigate how phylogenetic structure, cell volume, growth rate, and temperature interact to affect the cellular elemental stoichiometry of marine eukaryotic phytoplankton, we examined the C:N:P composition in 30 isolates across 7 classes of marine phytoplankton that were grown with a sufficient supply of nutrients and nitrate as the nitrogen source. The isolates covered a wide range in cell volume (5 orders of magnitude), growth rate (<0.01-0.9 d⁻¹), and habitat temperature (2-24°C). Our analysis indicates that C:N:P is highly variable, with statistical model residuals accounting for over half of the total variance and no relationship between phylogeny and elemental stoichiometry. Furthermore, our data indicated that variability in C:P, N:P, and C:N within Bacillariophyceae (diatoms) was as high as that among all of the isolates that we examined. In addition, a linear statistical model identified a positive relationship between diatom cell volume and C:P and N:P. Among all of the isolates that we examined, the statistical model identified temperature as a significant factor, consistent with the temperature-dependent translation efficiency model, but temperature only explained 5% of the total statistical model variance. While some of our results support data from previous field studies, the high variability of elemental ratios within Bacillariophyceae contradicts previous work suggesting that this cosmopolitan group of microalgae has consistently low C:P and N:P ratios in comparison with other groups.

  16. The Hazard of Graduation: Analysis of Three Multivariate Statistics Used to Study Multi-Institutional Attendance

    ERIC Educational Resources Information Center

    Muehlberg, Jessica Marie

    2013-01-01

    Adelman (2006) observed that a large quantity of research on retention is "institution-specific or use institutional characteristics as independent variables" (p. 81). However, he observed that over 60% of the students he studied attended multiple institutions making the calculation of institutional effects highly problematic. He argued…

  17. Building Better Discipline Strategies for Schools by Fuzzy Logics

    ERIC Educational Resources Information Center

    Chang, Dian-Fu; Juan, Ya-Yun; Chou, Wen-Ching

    2014-01-01

    This study aims to identify better discipline strategies for use in high schools. We invited 400 teachers to participate in a survey and collected their perceptions of the discipline strategies in terms of their acceptance and their effectiveness in schools. Based on the idea of fuzzy statistics, this study transformed the fuzzy…

  18. [Typing and Shorthand in the Small High School.

    ERIC Educational Resources Information Center

    Fedel, Joan; Starbuck, Ethel

    Two studies conducted in the field of business education are presented in this report by the Colorado State Department of Education. In one study, individualized instruction procedures and individual work packets were developed for students in both first- and second-year typing. Descriptive statistics presented for the 2 groups over a 3-year…

  19. Student Math Achievement and Out-of-Field Teaching

    ERIC Educational Resources Information Center

    Hill, Jason G.; Dalton, Ben

    2013-01-01

    This study investigates the distribution of math teachers with a major or certification in math using data from the National Center for Education Statistics' High School Longitudinal Study of 2009 (HSLS:09). The authors discuss the limitations of existing data sources for measuring teacher qualifications, such as the Schools and Staffing Survey…

  20. Kiss High Blood Pressure Goodbye: The Relationship between Dark Chocolate and Hypertension

    ERIC Educational Resources Information Center

    Nordmoe, Eric D.

    2008-01-01

    This article reports on a delicious finding from a recent study claiming a causal link between dark chocolate consumption and blood pressure reductions. In the article, I provide ideas for using this study to whet student appetites for a discussion of statistical ideas, including experimental design, measurement error and inference methods.

  1. Risk factors for epithelial ovarian cancer in the female population of Belgrade, Serbia: a case-control study.

    PubMed

    Gazibara, Tatjana; Filipović, Aleksandra; Kesić, Vesna; Kisiĉ-Tepavcević, Darija; Pekmezović, Tatjana

    2013-12-01

    Ovarian cancer (OC) comprises 3% of all cancers, but it is the fifth most common cause of cancer death in women. The aim of this case-control study was to determine the risk factors for OC in the female population of Belgrade, Serbia. A total of 80 consecutive patients were enrolled in the study between 2006 and 2008 in two national referral centers for OC in Serbia. The control subjects were recruited during regular gynecological check-ups in the Public Health Center of the corresponding municipalities. All the study participants were interviewed during their visits to the above-mentioned institutions by two physicians using the same questionnaire. In order to analyze the influence of specific exposures on the risk of the disease, we categorized variables according to cut-off values. Odds ratios (OR) and 95% confidence intervals (95% CI) were calculated separately for each variable using univariate conditional logistic regression analysis. There were no statistically significant differences in educational level, years of schooling, or occupational and employment status between patients with OC and women in the control group. Use of oral contraceptives and of other contraceptive methods (condoms, mechanical contraceptive devices) was highly statistically significantly more frequent among women in the control group (OR = 0.2, 95% CI 0.1-0.7, p = 0.005; OR = 0.1, 95% CI 0.01-0.5, p = 0.001, respectively). The patients with OC practiced sports for 6.3 +/- 2.1 years, and controls for 11.8 +/- 9.9 years; sport and recreation activities were statistically significantly protective (OR = 0.2, p = 0.011; OR = 0.4, p = 0.019). Tea consumption on a daily basis had a highly statistically significant protective effect (OR = 0.3, p = 0.001). Oral contraceptive use and physical activity were independent protective factors for OC in this study.

  2. An investigation into the effects of temporal resolution on hepatic dynamic contrast-enhanced MRI in volunteers and in patients with hepatocellular carcinoma

    NASA Astrophysics Data System (ADS)

    Gill, Andrew B.; Black, Richard T.; Bowden, David J.; Priest, Andrew N.; Graves, Martin J.; Lomas, David J.

    2014-06-01

    This study investigated the effect of temporal resolution on the dual-input pharmacokinetic (PK) modelling of dynamic contrast-enhanced MRI (DCE-MRI) data from normal volunteer livers and from patients with hepatocellular carcinoma. Eleven volunteers and five patients were examined at 3 T. Two sections, one optimized for the vascular input functions (VIF) and one for the tissue, were imaged within a single heart-beat (HB) using a saturation-recovery fast gradient echo sequence. The data were analysed using a dual-input single-compartment PK model. The VIFs and/or uptake curves were then temporally sub-sampled (at intervals Δt = 2-20 s) before being subject to the same PK analysis. Statistical comparisons of tumour and normal tissue PK parameter values using a 5% significance level gave rise to the same study results when temporally sub-sampling the VIFs at HB < Δt < 4 s. However, sub-sampling at Δt > 4 s did adversely affect the statistical comparisons. Temporal sub-sampling of just the liver/tumour tissue uptake curves at Δt ≤ 20 s, whilst using high temporal resolution VIFs, did not substantially affect PK parameter statistical comparisons. In conclusion, there is no practical advantage to be gained from acquiring very high temporal resolution hepatic DCE-MRI data. Instead, the high temporal resolution could usefully be traded for increased spatial resolution or SNR.

  3. Network Data: Statistical Theory and New Models

    DTIC Science & Technology

    2016-02-17

    During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her research covered a wide range of topics in statistics, including analysis and methods for spectral clustering for sparse and structured networks [2,7,8,21], sparse modeling (e.g. Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], and statistical analysis of algorithm leveraging…

  4. Statistical Analyses of High-Resolution Aircraft and Satellite Observations of Sea Ice: Applications for Improving Model Simulations

    NASA Astrophysics Data System (ADS)

    Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.

    2012-12-01

    Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of the sea ice are now available at meter-scale resolution or better that provide new details on the properties and morphology of the ice pack across basin scales. For example the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite including laser and radar altimeters and digital cameras that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. 
The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent/divergent ice zones, (ii) provide datasets that support enhanced parameterizations in numerical models as well as model initialization and validation, (iii) provide parameters of interest to Arctic stakeholders for marine navigation and ice engineering studies, and (iv) provide statistics that support algorithm development for the next generation of airborne and satellite altimeters, including NASA's ICESat-2 mission. We describe the potential contribution our results can make towards the improvement of coupled ice-ocean numerical models, and discuss how data synthesis and integration with high-resolution models may improve our understanding of sea ice variability and our capabilities in predicting the future state of the ice pack.

  5. Oral health literacy among clients visiting a rural dental college in North India-a cross-sectional study.

    PubMed

    Ramandeep, Gambhir; Arshdeep, Singh; Vinod, Kapoor; Parampreet, Pannu

    2014-07-01

    Limited health literacy among adults is one of the many barriers to better oral health outcomes. It is not uncommon to find people who consider understanding oral health information a challenge. Therefore, the present study assessed oral health literacy among clients visiting Gian Sagar Dental College and Hospital, Rajpura. A cross-sectional study was conducted on 450 participants who visited the Out Patient Department (OPD) of Gian Sagar Dental College and Hospital over a period of two months (November-December 2013). A questionnaire was given to each of the participants. Oral health literacy was graded on a 12-point Likert scale based on the total score, and was assessed as low, medium, or high on the basis of responses. Statistical analysis was done using the SPSS-15 statistical package. ANOVA and Student's t-test were used for comparisons between groups. Low oral health literacy scores were reported in 60.2% (271) of participants. More than 60% of the study participants had knowledge of dental terms such as 'dental caries' and 'oral cancer.' Only 22% of the graduates had a high literacy score. Mean oral health literacy score according to educational qualification was statistically significant (p<0.05), whereas there was no significant difference in terms of age and gender (p>0.05). The majority of the participants had low literacy scores. There is a need to address these problems, especially among the rural population, by health care providers and the government.

  6. Does high-flow nasal cannula oxygen improve outcome in acute hypoxemic respiratory failure? A systematic review and meta-analysis.

    PubMed

    Lin, Si-Ming; Liu, Kai-Xiong; Lin, Zhi-Hong; Lin, Pei-Hong

    2017-10-01

    To evaluate the efficacy of high-flow nasal cannula (HFNC) on the rate of intubation and mortality for patients with acute hypoxemic respiratory failure. We searched PubMed, EMBASE, and the Cochrane Library for relevant studies. Two reviewers extracted data and reviewed the quality of the studies independently. The primary outcome was the rate of intubation; the secondary outcome was mortality in the hospital. Study-level data were pooled using a random-effects model when I2 was >50% or a fixed-effects model when I2 was <50%. Eight randomized controlled studies with a total of 1,818 patients were considered. Pooled analysis showed no statistically significant difference between groups in the rate of intubation (odds ratio [OR] = 0.79; 95% confidence interval [CI]: 0.60-1.04; P = 0.09; I2 = 36%) and no statistically significant difference between groups in hospital mortality (OR = 0.89; 95% CI: 0.62-1.27; P = 0.51; I2 = 47%). The use of HFNC showed a trend toward reduction in the intubation rate, which did not reach statistical significance, in patients with acute respiratory failure compared with conventional oxygen therapy (COT) and noninvasive ventilation (NIV); moreover, there was no difference in mortality. Large, well-designed, randomized, multicenter trials are needed to confirm the effects of HFNC in patients with acute hypoxemic respiratory failure. Copyright © 2017 Elsevier Ltd. All rights reserved.
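The fixed-effects pooling used when I2 < 50% is standard inverse-variance weighting of log odds ratios; a sketch with hypothetical study results (not the eight trials in this review), recovering each standard error from the reported CI width:

```python
import math

# Hypothetical per-study odds ratios with 95% CIs, pooled with
# fixed-effect inverse-variance weights on the log-OR scale.
studies = [(0.70, 0.45, 1.10), (0.95, 0.60, 1.50), (0.80, 0.55, 1.15)]

num = den = 0.0
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
    w = 1.0 / se ** 2                                # inverse-variance weight
    num += w * math.log(or_)
    den += w

log_pooled = num / den
se_pooled = math.sqrt(1.0 / den)
pooled = math.exp(log_pooled)
ci_lo = math.exp(log_pooled - 1.96 * se_pooled)
ci_hi = math.exp(log_pooled + 1.96 * se_pooled)
print(f"pooled OR = {pooled:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```

A random-effects pooling would additionally widen each weight by a between-study variance term estimated from the heterogeneity.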

  7. Subsonic Aircraft Safety Icing Study

    NASA Technical Reports Server (NTRS)

    Jones, Sharon Monica; Reveley, Mary S.; Evans, Joni K.; Barrientos, Francesca A.

    2008-01-01

    NASA's Integrated Resilient Aircraft Control (IRAC) Project is one of four projects within the agency's Aviation Safety Program (AvSafe) in the Aeronautics Research Mission Directorate (ARMD). The IRAC Project, which was redesigned in the first half of 2007, conducts research to advance the state of the art in aircraft control design tools and techniques. A "Key Decision Point" was established for fiscal year 2007 with the following expected outcomes: document the most currently available statistical/prognostic data associated with icing for subsonic transports, summarize reports by subject matter experts in icing research on current knowledge of icing effects on control parameters, and establish future requirements for icing research for subsonic transports, including the appropriate alignment. This study contains: (1) statistical analyses of accident and incident data conducted by NASA researchers for this "Key Decision Point", (2) an examination of icing in other recent statistically based studies, (3) a summary of aviation safety priority lists that have been developed by various subject-matter experts, including the significance of aircraft icing research in these lists, and (4) suggested future requirements for NASA icing research. The review of several studies by subject-matter experts was summarized into four high-priority icing research areas. Based on the IRAC Project goals and objectives, the IRAC Project was encouraged to conduct work in all of the identified high-priority icing research areas, with the exception of the development of methods to sense and document actual icing conditions.

  8. External Validation of Risk Scores for Major Bleeding in a Population-Based Cohort of Transient Ischemic Attack and Ischemic Stroke Patients.

    PubMed

    Hilkens, Nina A; Li, Linxin; Rothwell, Peter M; Algra, Ale; Greving, Jacoba P

    2018-03-01

    The S2TOP-BLEED score may help to identify patients at high risk of bleeding on antiplatelet drugs after a transient ischemic attack or ischemic stroke. The score was derived on trial populations, and its performance in a real-world setting is unknown. We aimed to externally validate the S2TOP-BLEED score for major bleeding in a population-based cohort and to compare its performance with other risk scores for bleeding. We studied risk of bleeding in 2072 patients with a transient ischemic attack or ischemic stroke on antiplatelet agents in the population-based OXVASC (Oxford Vascular Study) according to 3 scores: S2TOP-BLEED, REACH, and Intracranial-B2LEED3S. Performance was assessed with C statistics and calibration plots. During 8302 patient-years of follow-up, 117 patients had a major bleed. The S2TOP-BLEED score showed a C statistic of 0.69 (95% confidence interval [CI], 0.64-0.73) and accurate calibration for 3-year risk of major bleeding. The S2TOP-BLEED score was much more predictive of fatal bleeding than of nonmajor bleeding (C statistics 0.77 [95% CI, 0.69-0.85] and 0.50 [95% CI, 0.44-0.58], respectively). The REACH score had a C statistic of 0.63 (95% CI, 0.58-0.69) for major bleeding and the Intracranial-B2LEED3S score a C statistic of 0.60 (95% CI, 0.51-0.70) for intracranial bleeding. The ratio of ischemic events versus bleeds decreased across risk groups of bleeding, from 6.6:1 in the low-risk group to 1.8:1 in the high-risk group. The S2TOP-BLEED score shows modest performance in a population-based cohort of patients with a transient ischemic attack or ischemic stroke. Although bleeding risks were associated with risks of ischemic events, risk stratification may still be useful to identify a subgroup of patients at particularly high risk of bleeding, in whom preventive measures are indicated. © 2018 The Authors.
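The C statistic used above is the probability that a randomly chosen patient who bled was assigned a higher predicted risk than one who did not (ties count half); a minimal sketch with hypothetical risk predictions, not OXVASC data:

```python
def c_statistic(risks_events, risks_nonevents):
    """Concordance (C) statistic computed over all event/non-event pairs."""
    concordant = ties = 0
    for e in risks_events:
        for n in risks_nonevents:
            if e > n:
                concordant += 1
            elif e == n:
                ties += 1
    return (concordant + 0.5 * ties) / (len(risks_events) * len(risks_nonevents))

# Hypothetical predicted 3-year major-bleeding risks (%)
bleeders = [12.0, 8.5, 6.0, 9.0]
non_bleeders = [4.0, 7.0, 3.5, 6.0, 5.0]
print(c_statistic(bleeders, non_bleeders))
```

A value of 0.5 means the score ranks patients no better than chance, which is why the 0.50 C statistic for nonmajor bleeding above indicates no discrimination.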

  9. ELEVATED LEVELS OF SODIUM IN COMMUNITY DRINKING WATER

    EPA Science Inventory

    A comparison study of students from towns with differing levels of sodium in drinking water revealed statistically significantly higher blood pressure distributions among the students from the town with high sodium levels. Differences were found in both systolic and diastolic rea...

  10. Patterns of shading tolerance determined from experimental light reduction studies of seagrasses

    EPA Science Inventory

    An extensive review of the experimental literature on seagrass shading evaluated the relationship between experimental light reductions, duration of experiment and seagrass response metrics to determine whether there were consistent statistical patterns. There were highly signif...

  11. Molecular weight analyses and enzymatic degradation profiles of the soft-tissue fillers Belotero Balance, Restylane, and Juvéderm Ultra.

    PubMed

    Flynn, Timothy Corcoran; Thompson, David H; Hyun, Seok-Hee

    2013-10-01

    In this study, the authors sought to determine the molecular weight distribution of three hyaluronic acids (Belotero Balance, Restylane, and Juvéderm Ultra) and their rates of degradation following exposure to hyaluronidase. Lot consistency of Belotero Balance also was analyzed. Three lots of Belotero Balance were analyzed using liquid chromatography techniques. The product was found to have high-molecular-weight and low-molecular-weight species. One lot of Belotero Balance was compared to one lot each of Juvéderm Ultra and Restylane. Molecular weights of the species were analyzed. The hyaluronic acids were exposed to ovine testicular hyaluronidase at six time points (baseline and 0.5, 1, 2, 6, and 24 hours) to determine degradation rates. Belotero Balance lots were remarkably consistent. Belotero Balance had the largest high-molecular-weight species, followed by Juvéderm Ultra and Restylane (p < 0.001). Low-molecular-weight differences among all three hyaluronic acids were not statistically significant. Percentages of high-molecular-weight polymer differ among the three materials, with Belotero Balance having the highest fraction of high-molecular-weight polymer. Degradation of the high-molecular-weight species over time showed different molecular weights of the high-molecular-weight fraction. Rates of degradation of the hyaluronic acids following exposure to ovine testicular hyaluronidase were similar. All hyaluronic acids were fully degraded at 24 hours. Fractions of high-molecular-weight polymer differ across the hyaluronic acids tested. The low-molecular-weight differences are not statistically significant. The high-molecular-weight products have different molecular weights at the 0.5- and 2-hour time points when exposed to ovine testicular hyaluronidase and are not statistically different at 24 hours.

  12. Qualitative Literature Review of the Prevalence of Depression in Medical Students Compared to Students in Non-medical Degrees.

    PubMed

    Bacchi, Stephen; Licinio, Julio

    2015-06-01

    The purpose of this study is to review studies published in English between 1 January 2000 and 16 June 2014, in peer-reviewed journals, that have assessed the prevalence of depression, comparing medical students and non-medical students with a single evaluation method. The databases PubMed, Medline, EMBASE, PsycINFO, and Scopus were searched for eligible articles. Searches used combinations of the Medical Subject Headings medical student and depression. Titles and abstracts were reviewed to determine eligibility before full-text articles were retrieved, which were then also reviewed. Twelve studies met eligibility criteria. Non-medical groups surveyed included dentistry, business, humanities, nursing, pharmacy, and architecture students. One study found statistically significant results suggesting that medical students had a higher prevalence of depression than groups of non-medical students; five studies found statistically significant results indicating that the prevalence of depression in medical students was less than that in groups of non-medical students; four studies found no statistically significant difference, and two studies did not report on the statistical significance of their findings. One study was longitudinal, and 11 studies were cross-sectional. While there are limitations to these comparisons, in the main, the reviewed literature suggests that medical students have similar or lower rates of depression compared to certain groups of non-medical students. A lack of longitudinal studies meant that potential common underlying causes could not be discerned, highlighting the need for further research in this area. The high rates of depression among medical students indicate the continuing need for interventions to reduce depression.

  13. A Frequency Domain Approach to Pretest Analysis Model Correlation and Model Updating for the Mid-Frequency Range

    DTIC Science & Technology

    2009-02-01

    …range of modal analysis and the high-frequency region of statistical energy analysis is referred to as the mid-frequency range. …predictions. The averaging process is consistent with the averaging done in statistical energy analysis for stochastic systems. The FEM will always…

  14. College Enrollment and Work Activity of 2005 High School Graduates. Bureau of Labor Statistics News. USDL 06-514

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics, 2006

    2006-01-01

    In October 2005, 68.6 percent of high school graduates from the class of 2005 were enrolled in colleges or universities, according to data released on March 24, 2006 by the U.S. Department of Labor's Bureau of Labor Statistics. The college enrollment rate for recent high school graduates was a historical high for the series dating back to 1959.…

  15. Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey

    PubMed Central

    Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.

    2015-01-01

    Good quality medical research generally requires not only an expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles which have been published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten (10) leading Indian medical journals were selected based on impact factors and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, error/defect proportion in study design, statistical analyses, and implementation of the CONSORT checklist in RCT (randomized clinical trials). From 2003 to 2013: The proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature, both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), 32.5% (104/320) compared to 17.1% (84/490), though some serious ones were still present. Indian medical research seems to have made no major progress regarding the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are quite rarely published and have a high proportion of methodological problems. PMID:25856194
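The 2x2 comparisons of proportions above can be re-derived directly from the reported counts with a Pearson chi-square; the example below reproduces the test on statistical-test usage (250/588 in 2003 vs 439/774 in 2013), matching the reported χ2 = 26.96 (the Φ computed here is the simple √(χ²/n), which may differ slightly from the effect-size convention used in the paper):

```python
import math

def chi2_phi(a, b, c, d):
    """Pearson chi-square (no continuity correction) and the phi
    effect size for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, math.sqrt(chi2 / n)

# Papers using statistical tests: 250/588 (2003) vs 439/774 (2013)
chi2, phi = chi2_phi(250, 588 - 250, 439, 774 - 439)
print(f"chi2 = {chi2:.2f}, phi = {phi:.2f}")
```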

  17. A Retrospective Survey of Research Design and Statistical Analyses in Selected Chinese Medical Journals in 1998 and 2008

    PubMed Central

    Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia

    2010-01-01

    Background High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China in the first decade of the new millennium. Methodology/Principal Findings Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportion in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly (χ2 = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ2 = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, the proportion of randomized clinical trial designs remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ2 = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1,023/1,309), and interpretation (χ2 = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), though some serious ones persisted. 
Conclusions/Significance Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824

  18. Universal statistics of vortex tangles in three-dimensional random waves

    NASA Astrophysics Data System (ADS)

    Taylor, Alexander J.

    2018-02-01

    The tangled nodal lines (wave vortices) in random, three-dimensional wavefields are studied as an exemplar of a fractal loop soup. Their statistics are a three-dimensional counterpart to the characteristic random behaviour of nodal domains in quantum chaos, but in three dimensions the filaments can wind around one another to give distinctly different large scale behaviours. By tracing numerically the structure of the vortices, their conformations are shown to follow recent analytical predictions for random vortex tangles with periodic boundaries, where the local disorder of the model ‘averages out’ to produce large scale power law scaling relations whose universality classes do not depend on the local physics. These results explain previous numerical measurements in terms of an explicit effect of the periodic boundaries, where the statistics of the vortices are strongly affected by the large scale connectedness of the system even at arbitrarily high energies. The statistics are investigated primarily for static (monochromatic) wavefields, but the analytical results are further shown to directly describe the reconnection statistics of vortices evolving in certain dynamic systems, or occurring during random perturbations of the static configuration.

  19. Pathway analysis with next-generation sequencing data.

    PubMed

    Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao

    2015-04-01

    Although pathway analysis methods have been developed and successfully applied to association studies of common variants, statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators have observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic based on smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with rare, common, or both rare and common variants has the correct type I error rates. We also evaluate the power of the SFPCA-based statistic and 22 additional existing statistics, and find that the SFPCA-based statistic has much higher power than the other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, Bradley M.; Stuckelberger, Michael; Guthrey, Harvey

    Statistical and correlative analyses are increasingly important in the design and study of new materials, from semiconductors to metals. Non-destructive measurement techniques with high spatial resolution, capable of correlating composition and/or structure with device properties, are few and far between. For polycrystalline and inhomogeneous materials, the added challenge is that nanoscale resolution is in general not compatible with the large sampling areas necessary to obtain a statistical representation of the specimen under study. For the study of grain cores and grain boundaries in polycrystalline solar absorbers this is of particular importance, since their dissimilar behavior and variability throughout the samples make it difficult to draw conclusions and ultimately optimize the material. In this study, we present a nanoscale in operando approach based on the multimodal use of synchrotron nano-X-ray fluorescence and X-ray beam induced current, collected for grain core and grain boundary areas and correlated pixel-by-pixel in fully operational Cu(In1-xGax)Se2 solar cells. We observe that low-gallium cells have grain boundaries that overperform compared to the grain cores, whereas high-gallium cells have boundaries that underperform. In conclusion, these results demonstrate how nanoscale correlative X-ray microscopy can guide research pathways towards grain engineering of low-cost, high-efficiency solar cells.

  1. The effect of an integrated high school science curriculum on student achievement, knowledge retention, and science attitudes

    NASA Astrophysics Data System (ADS)

    Smith, Kimberly A.

    The research study investigates the effectiveness of an integrated high school science curriculum on student achievement, knowledge retention and science attitudes using quantitative and qualitative research. Data were collected from tenth-grade students in a small urban high school in Kansas City, Missouri, who were enrolled in either a traditional Biology course or an integrated Environmental Science course. Quantitative data were collected in Phase 1 of the study. Data collected for academic achievement included pretest and posttest scores on the CTBS MATN exam. Data collected for knowledge retention included post-posttest scores on the CTBS MATN exam. Data collected for science attitudes were scores on a pretest and posttest using the TOSRA. SPSS was used to analyze the data using independent-samples t-tests, one-way ANCOVAs and paired-samples statistics. Qualitative data were collected in Phase 2 of the study. Data included responses to open-ended interview questions using three focus groups. Data were analyzed for common themes. Data analysis revealed the integrated Environmental Science course had a statistically significant impact on academic achievement, knowledge retention and positive science attitudes. Gender and socioeconomic status did not influence results. The study also determined that the CTBS MATN exam was not an accurate predictor of scores on state testing, as was previously thought.

  2. Diagnosis checking of statistical analysis in RCTs indexed in PubMed.

    PubMed

    Lee, Paul H; Tse, Andy C Y

    2017-11-01

    Statistical analysis is essential for reporting the results of randomized controlled trials (RCTs), as well as for evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. To review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for the generalized linear model, Cox proportional hazards model and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. Diagnosis checking of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of a diagnosis of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
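One of the simplest diagnosis checks discussed above, screening a crude analysis for non-normality before trusting a t-test, can be sketched as follows; the sample-skewness cut-off of 0.5 is an illustrative rule of thumb, not a method from the paper:

```python
import numpy as np

def skewness(x):
    """Sample skewness; values far from 0 flag a non-normal outcome."""
    z = (np.asarray(x, float) - np.mean(x)) / np.std(x)
    return float(np.mean(z ** 3))

rng = np.random.default_rng(0)
normal_like = rng.normal(size=500)
skewed = rng.lognormal(sigma=0.8, size=500)  # strongly right-skewed outcome

# Crude screen before an unadjusted two-group comparison: heavy skew
# suggests falling back to a rank-based test rather than a t-test.
for name, sample in (("normal", normal_like), ("lognormal", skewed)):
    g1 = skewness(sample)
    verdict = "t-test plausible" if abs(g1) < 0.5 else "consider a rank-based test"
    print(name, round(g1, 2), verdict)
```

In practice a formal normality test or residual diagnostics, as reviewed in the paper, would replace this crude screen.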

  3. Anderson Localization in Quark-Gluon Plasma

    NASA Astrophysics Data System (ADS)

    Kovács, Tamás G.; Pittler, Ferenc

    2010-11-01

    At low temperature the low end of the QCD Dirac spectrum is well described by chiral random matrix theory. In contrast, at high temperature there is no similar statistical description of the spectrum. We show that at high temperature the lowest part of the spectrum consists of a band of statistically uncorrelated eigenvalues obeying essentially Poisson statistics, and the corresponding eigenvectors are extremely localized. Going up in the spectrum, the spectral density rapidly increases and the eigenvectors become more and more delocalized. At the same time, the spectral statistics gradually crosses over to the bulk statistics expected from the corresponding random matrix ensemble. This phenomenon is reminiscent of Anderson localization in disordered conductors. Our findings are based on staggered Dirac spectra from quenched lattice simulations with the SU(2) gauge group.
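The Poisson-to-random-matrix crossover described above is commonly quantified with the ratio of consecutive level spacings, whose mean distinguishes the two regimes without any spectral unfolding. A hedged sketch of that statistic (illustrative; not the authors' actual analysis):

```python
def mean_spacing_ratio(levels):
    """Mean of r_n = min(s_n, s_{n+1}) / max(s_n, s_{n+1}) over
    consecutive spacings s_n of a sorted spectrum.
    Reference values: ~0.386 for Poisson (uncorrelated) statistics,
    ~0.53 for the GOE random-matrix ensemble."""
    s = sorted(levels)
    gaps = [b - a for a, b in zip(s, s[1:])]
    ratios = [min(g1, g2) / max(g1, g2)
              for g1, g2 in zip(gaps, gaps[1:])]
    return sum(ratios) / len(ratios)
```

Applied to the lowest eigenvalues versus the bulk, this statistic would move from the Poisson value toward the random-matrix value, mirroring the delocalization of the eigenvectors.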

  4. Low power and type II errors in recent ophthalmology research.

    PubMed

    Khan, Zainab; Milko, Jordan; Iqbal, Munir; Masri, Moness; Almeida, David R P

    2016-10-01

    To investigate the power of unpaired t tests in prospective, randomized controlled trials when these tests failed to detect a statistically significant difference and to determine the frequency of type II errors. Systematic review and meta-analysis. We examined all prospective, randomized controlled trials published between 2010 and 2012 in 4 major ophthalmology journals (Archives of Ophthalmology, British Journal of Ophthalmology, Ophthalmology, and American Journal of Ophthalmology). Studies that used unpaired t tests were included. Power was calculated using the number of subjects in each group, standard deviations, and α = 0.05. The difference between control and experimental means was set to be (1) 20% and (2) 50% of the absolute value of the control's initial conditions. Power and Precision version 4.0 software was used to carry out calculations. Finally, the proportion of articles with type II errors was calculated. β = 0.3 was set as the largest acceptable value for the probability of type II errors. In total, 280 articles were screened. Final analysis included 50 prospective, randomized controlled trials using unpaired t tests. The median power of tests to detect a 50% difference between means was 0.9 and was the same for all 4 journals regardless of the statistical significance of the test. The median power of tests to detect a 20% difference between means ranged from 0.26 to 0.9 for the 4 journals. The median power of these tests to detect a 50% and 20% difference between means was 0.9 and 0.5 for tests that did not achieve statistical significance. A total of 14% and 57% of articles with negative unpaired t tests contained results with β > 0.3 when power was calculated for differences between means of 50% and 20%, respectively. A large portion of studies demonstrate high probabilities of type II errors when detecting small differences between means. The power to detect small difference between means varies across journals. 
It is, therefore, worthwhile for authors to state the minimum clinically important difference for individual studies, and journals could consider publishing statistical guidelines for authors. Day-to-day clinical decisions rely heavily on the evidence base formed by the plethora of studies available to clinicians. Prospective, randomized controlled clinical trials are highly regarded as a robust study design and are used to make important clinical decisions that directly affect patient care. The quality of study designs and statistical methods in major clinical journals is improving over time, and researchers and journals are becoming more attentive to the statistical methodologies that studies incorporate. The results of well-designed ophthalmic studies with robust methodologies, therefore, have the ability to modify the ways in which diseases are managed. Copyright © 2016 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
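The power calculations described in this study can be approximated without specialized software via the normal approximation to the unpaired t-test (the study itself used Power and Precision software; this sketch is only illustrative of the mechanics):

```python
import math
from statistics import NormalDist

def two_sample_power(delta, sd, n_per_group, alpha=0.05):
    """Approximate power of a two-sided unpaired t-test using the
    normal approximation (adequate for moderate-to-large groups)."""
    nd = NormalDist()
    se = sd * math.sqrt(2.0 / n_per_group)  # SE of the mean difference
    z_crit = nd.inv_cdf(1 - alpha / 2)      # two-sided critical value
    z = delta / se                          # standardized true difference
    return nd.cdf(z - z_crit) + nd.cdf(-z - z_crit)

def type_ii_error(delta, sd, n_per_group, alpha=0.05):
    """beta = 1 - power; the study flags results with beta > 0.3."""
    return 1.0 - two_sample_power(delta, sd, n_per_group, alpha)
```

For example, with small groups and a true difference of only 20% of the control mean, power falls well below 0.8, which is exactly the pattern of type II risk the review documents.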

  5. Detection of proximal caries using quantitative light-induced fluorescence-digital and laser fluorescence: a comparative study.

    PubMed

    Yoon, Hyung-In; Yoo, Min-Jeong; Park, Eun-Jin

    2017-12-01

    The purpose of this study was to evaluate the in vitro validity of quantitative light-induced fluorescence-digital (QLF-D) and laser fluorescence (DIAGNOdent) for assessing proximal caries in extracted premolars, using digital radiography as the reference method. A total of 102 extracted premolars with similar lengths and shapes were used. A single operator conducted all the examinations using three different detection methods (bitewing radiography, QLF-D, and DIAGNOdent). The bitewing x-ray scale, QLF-D fluorescence loss (ΔF), and DIAGNOdent peak readings were compared and statistically analyzed. Each method showed excellent reliability. The correlation coefficients of bitewing radiography with QLF-D and with DIAGNOdent were -0.644 and 0.448, respectively, while the value between QLF-D and DIAGNOdent was -0.382. The kappa statistics showed a higher diagnostic consensus between bitewing radiography and QLF-D than between bitewing radiography and DIAGNOdent. QLF-D was moderately to highly accurate (AUC = 0.753 - 0.908), while DIAGNOdent was moderately to less accurate (AUC = 0.622 - 0.784). All detection methods showed statistically significant correlations, with a high correlation between bitewing radiography and QLF-D. QLF-D was found to be a valid and reliable alternative to digital bitewing radiography for in vitro detection of proximal caries.
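The kappa statistics used to compare diagnostic consensus between methods can be computed directly from a cross-tabulation of the two methods' classifications. A minimal sketch (illustrative only; the table values are made up, not the study's data):

```python
def cohens_kappa(table):
    """Cohen's kappa for agreement between two raters/methods.
    table[i][j] = count of cases rated category i by method 1
    and category j by method 2."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n          # observed agreement
    pe = sum((sum(table[i]) / n) * (sum(row[i] for row in table) / n)
             for i in range(k))                          # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa is 1.0 for perfect agreement and 0.0 when agreement is no better than chance, which is why it is preferred over raw percent agreement for diagnosis-consensus comparisons.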

  6. Clinical evaluation of selected Yogic procedures in individuals with low back pain

    PubMed Central

    Pushpika Attanayake, A. M.; Somarathna, K. I. W. K.; Vyas, G. H.; Dash, S. C.

    2010-01-01

    The present study was conducted to evaluate selected yogic procedures in individuals with low back pain. Back pain is one of the commonest clinical presentations in clinical practice, which motivated the present study; it has been estimated that more than three-quarters of the world's population experience back pain at some time in their lives. Twelve patients were selected and randomly divided into two groups, viz., group A (yogic group) and group B (control group). Advice on lifestyle and diet was given to all the patients. The effect of the therapy was assessed subjectively and objectively. The scores for the yogic group and the control group were individually analyzed before and after treatment, and the values were compared using standard statistical protocols. Yogic intervention revealed 79% relief in both subjective and objective parameters (i.e., 7 out of 14 parameters showed statistically highly significant results, P < 0.01, while 4 showed significant results, P < 0.05). The comparison between the yogic group and the control group showed 79% relief in both subjective and objective parameters (i.e., in total 6 out of 14 parameters showed statistically highly significant results, P < 0.01, while 5 showed significant results, P < 0.05). PMID:22131719

  7. Use of a statistical model of the whole femur in a large scale, multi-model study of femoral neck fracture risk.

    PubMed

    Bryan, Rebecca; Nair, Prasanth B; Taylor, Mark

    2009-09-18

    Interpatient variability is often overlooked in orthopaedic computational studies due to the substantial challenges involved in sourcing and generating large numbers of bone models. A statistical model of the whole femur incorporating both geometric and material property variation was developed as a potential solution to this problem. The statistical model was constructed using principal component analysis applied to 21 individual computed tomography scans. To test the ability of the statistical model to generate realistic, unique finite element (FE) femur models, it was used as a source of 1000 femurs to drive a study on femoral neck fracture risk. The study simulated the impact of an oblique fall to the side, a scenario known to account for a large proportion of hip fractures in the elderly and to have a lower fracture load than alternative loading configurations. FE model generation, application of subject-specific loading and boundary conditions, FE processing and post-processing of the solutions were completed automatically. The generated models were within the bounds of the training data used to create the statistical model and had a high mesh quality, able to be used directly by the FE solver without remeshing. The results indicated that 28 of the 1000 femurs were at highest risk of fracture. Closer analysis revealed the percentage of cortical bone in the proximal femur to be a crucial differentiator between the failed and non-failed groups. The likely fracture location was indicated to be intertrochanteric. Comparison with previous computational, clinical and experimental work supported these findings.
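A PCA-based statistical shape model generates a new instance as the mean shape plus a random linear combination of the principal modes, x = mean + Σ b_i √λ_i φ_i with b_i ~ N(0, 1). A toy sketch of that sampling step (variable names and data are illustrative, not from the paper, which also models material properties):

```python
import math
import random

def sample_shape(mean, modes, eigvals, rng=random):
    """Draw one instance from a PCA statistical shape model.
    `mean`   : mean shape vector,
    `modes`  : principal components phi_i (each same length as `mean`),
    `eigvals`: corresponding PCA eigenvalues lambda_i."""
    new = list(mean)
    for lam, phi in zip(eigvals, modes):
        b = rng.gauss(0.0, 1.0)          # random mode weight ~ N(0, 1)
        scale = b * math.sqrt(lam)       # scaled by the mode's variance
        for j in range(len(new)):
            new[j] += scale * phi[j]
    return new
```

Repeating this draw 1000 times, as in the study, yields a virtual population whose variability stays within the bounds of the training scans.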

  8. Shift work, long working hours and preterm birth: a systematic review and meta-analysis.

    PubMed

    van Melick, M J G J; van Beukering, M D M; Mol, B W; Frings-Dresen, M H W; Hulshof, C T J

    2014-11-01

    Specific physical activities or working conditions are suspected of increasing the risk of preterm birth (PTB). The aim of this meta-analysis is to review and summarize the pre-existing evidence on the effect of shift work or long working hours on the risk of PTB. We conducted a systematic search in MEDLINE and EMBASE (1990-2013) for observational and intervention studies with original data. We only included articles that met our specific criteria for language, exposure, outcome and data collection, contained original data, and were of at least moderate quality. The data of the included studies were pooled. Eight high-quality studies and eight moderate-quality studies were included in the meta-analysis. In these studies, no clear or statistically significant relationship between shift work and PTB was found. The summary estimate OR for performing shift work during pregnancy and the risk of PTB was 1.04 (95% CI 0.90-1.20). For long working hours during pregnancy, the summary estimate OR was 1.25 (95% CI 1.01-1.54), indicating a marginally statistically significant relationship but an only slightly elevated risk. Although a positive association between long working hours and PTB was seen in many of the included studies, it reached only marginal statistical significance. In the studies included in this review, working in shifts or in night shifts during pregnancy was not significantly associated with an increased risk for PTB. For both risk factors, due to the lack of high-quality studies focusing on the risks per trimester, in particular the third trimester, a firm conclusion about an association cannot be stated.
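Summary odds ratios like those quoted above come from inverse-variance pooling of the study-level estimates on the log-odds scale. A fixed-effect sketch of the basic mechanics (illustrative only; the review may well have used random-effects models, and the example inputs are made up):

```python
import math

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooling of odds ratios.
    `studies` is a list of (OR, ci_low, ci_high) tuples, CIs at 95%."""
    z = 1.959963984540054                 # two-sided 95% normal quantile
    num = den = 0.0
    for or_, lo, hi in studies:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # back out the SE
        w = 1.0 / se ** 2                 # inverse-variance weight
        num += w * log_or
        den += w
    pooled_log = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled_log),
            math.exp(pooled_log - z * se_pooled),
            math.exp(pooled_log + z * se_pooled))
```

Pooling several studies narrows the confidence interval around the common estimate, which is how a marginal per-study association can reach (or just reach) significance overall.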

  9. A Web Site that Provides Resources for Assessing Students' Statistical Literacy, Reasoning and Thinking

    ERIC Educational Resources Information Center

    Garfield, Joan; delMas, Robert

    2010-01-01

    The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level but resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…

  10. Use of simulation-based learning in undergraduate nurse education: An umbrella systematic review.

    PubMed

    Cant, Robyn P; Cooper, Simon J

    2017-02-01

    To conduct a systematic review to appraise and review evidence on the impact of simulation-based education for undergraduate/pre-licensure nursing students, using existing reviews of literature. An umbrella review (review of reviews). Cumulative Index of Nursing and Allied Health Literature (CINAHLPlus), PubMed, and Google Scholar. Reviews of literature conducted between 2010 and 2015 regarding simulation-based education for pre-licensure nursing students. The Joanna Briggs Institute methodology for conduct of an umbrella review was used to inform the review process. Twenty-five systematic reviews of literature were included, of which 14 were recent (2013-2015). Most described the level of evidence of component studies as a mix of experimental and quasi-experimental designs. The reviews measured around 14 different main outcome variables, thus limiting the number of primary studies that each individual review could pool to appraise. Many reviews agreed on the key learning outcome of knowledge acquisition, although no overall quantitative effect was derived. Three of four high-quality reviews found that simulation supported psychomotor development; a fourth found too few high-quality studies to make a statistical comparison. Simulation statistically improved self-efficacy in pretest-posttest studies, and in experimental designs self-efficacy was superior to that achieved with other teaching methods; lower-level research designs limited further comparison. The reviews commonly reported strong student satisfaction with simulation education and some reported improved confidence and/or critical thinking. This umbrella review took a global view of 25 reviews of simulation research in nursing education, comprising over 700 primary studies. To discern overall outcomes across reviews, statistical comparison of quantitative results (effect size) must be the key comparator.
Simulation-based education contributes to students' learning in a number of ways when integrated into pre-licensure nursing curricula. Overall, use of a constellation of instruments and a lack of high quality study designs mean that there are still some gaps in evidence of effects that need to be addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    PubMed

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10 mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
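Sample-size figures of the kind quoted above (e.g. N per group at power = 0.8, α = 0.05) follow from the standard two-group formula n = 2((z_{1-α/2} + z_power)·σ/δ)². A normal-approximation sketch (illustrative only; this is not the tool the authors released, which additionally handles vertex-wise multiple testing):

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, power=0.8, alpha=0.05):
    """Normal-approximation sample size per group to detect a mean
    difference `delta` with a two-sided test at significance `alpha`."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_b = nd.inv_cdf(power)           # quantile for the desired power
    return math.ceil(2 * ((z_a + z_b) * sd / delta) ** 2)
```

The formula makes the reliability link explicit: noisier measures (larger sd relative to the 10% difference of interest) demand more subjects, which is why the required N differs so much across cortical measures.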

  12. Prevalence of Tuberculosis among Veterans, Military Personnel and their Families in East Azerbaijan Province Violators of the last 15 Years.

    PubMed

    Azad Aminjan, Maboud; Moaddab, Seyyed Reza; Hosseini Ravandi, Mohammad; Kazemi Haki, Behzad

    2015-10-01

    Nowadays, tuberculosis is the world's second largest killer of adults after HIV. Because military posts are mostly located in hazardous zones, soldiers and army personnel are considered a high-risk group; we therefore decided to determine the prevalence of tuberculosis in this population. This was a cross-sectional descriptive study of the prevalence of pulmonary tuberculosis in soldiers and military personnel over the last 15 years at the Tuberculosis and Lung Disease Research Center of Tabriz University of Medical Sciences. The statistical population consisted of all soldiers and military personnel. Detection was based on microscopic examination following Ziehl-Neelsen staining and on Löwenstein-Jensen culture. Descriptive statistics were used for statistical analysis, and values less than 0.05 were considered significant. A review of the records at this center from 1988 to 2013 identified 72 military-related tuberculosis cases, comprising 30 women and 42 men: 14 soldiers, 29 family members, and 29 military personnel. A significant correlation was found between TB rates among military personnel and their families. Although national statistics indicate a decline of tuberculosis in recent years, the results of our study show that TB is still a serious disease, and the first symptoms of tuberculosis in military personnel and their families should be diagnosed as soon as possible.

  13. Clinical and economic impact of antibiotic resistance in developing countries: A systematic review and meta-analysis.

    PubMed

    Founou, Raspail Carrel; Founou, Luria Leslie; Essack, Sabiha Yusuf

    2017-01-01

    Despite evidence of the high prevalence of antibiotic resistant infections in developing countries, studies on the clinical and economic impact of antibiotic resistance (ABR) to inform interventions to contain its emergence and spread are limited. The aim of this study was to analyze the published literature on the clinical and economic implications of ABR in developing countries. A systematic search was carried out in Medline via PubMed and Web of Sciences and included studies published from January 01, 2000 to December 09, 2016. All papers were considered and a quality assessment was performed using the Newcastle-Ottawa quality assessment scale (NOS). Of 27 033 papers identified, 40 studies met the strict inclusion and exclusion criteria and were finally included in the qualitative and quantitative analysis. Mortality was associated with resistant bacteria, with statistical significance evident at an odds ratio (OR) of 2.828 (95% CI, 2.231-3.584; p = 0.000). ESKAPE pathogens were associated with the highest risk of mortality, with high statistical significance (OR 3.217; 95% CI, 2.395-4.321; p = 0.001). Eight studies showed that ABR, and especially antibiotic-resistant ESKAPE bacteria, significantly increased health care costs. ABR is associated with a high mortality risk and increased economic costs, with ESKAPE pathogens implicated as the main cause of increased mortality. Patients with non-communicable disease co-morbidities were identified as high-risk populations.
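An odds ratio such as the OR = 2.828 above is, at study level, computed from a 2×2 mortality-by-resistance table, with a Woolf (log-scale) confidence interval. A minimal sketch using made-up counts (not the study's data):

```python
import math

def odds_ratio(a, b, c, d):
    """OR and Woolf 95% CI from a 2x2 table:
    a, b = deaths / survivors in the resistant-infection group;
    c, d = deaths / survivors in the susceptible-infection group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    z = 1.959963984540054                          # 95% normal quantile
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

A CI that excludes 1.0 corresponds to a statistically significant association between resistance and mortality.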

  14. Clinical and economic impact of antibiotic resistance in developing countries: A systematic review and meta-analysis

    PubMed Central

    Founou, Luria Leslie; Essack, Sabiha Yusuf

    2017-01-01

    Introduction Despite evidence of the high prevalence of antibiotic resistant infections in developing countries, studies on the clinical and economic impact of antibiotic resistance (ABR) to inform interventions to contain its emergence and spread are limited. The aim of this study was to analyze the published literature on the clinical and economic implications of ABR in developing countries. Methods A systematic search was carried out in Medline via PubMed and Web of Sciences and included studies published from January 01, 2000 to December 09, 2016. All papers were considered and a quality assessment was performed using the Newcastle-Ottawa quality assessment scale (NOS). Results Of 27 033 papers identified, 40 studies met the strict inclusion and exclusion criteria and were finally included in the qualitative and quantitative analysis. Mortality was associated with resistant bacteria, with statistical significance evident at an odds ratio (OR) of 2.828 (95% CI, 2.231–3.584; p = 0.000). ESKAPE pathogens were associated with the highest risk of mortality, with high statistical significance (OR 3.217; 95% CI, 2.395–4.321; p = 0.001). Eight studies showed that ABR, and especially antibiotic-resistant ESKAPE bacteria, significantly increased health care costs. Conclusion ABR is associated with a high mortality risk and increased economic costs, with ESKAPE pathogens implicated as the main cause of increased mortality. Patients with non-communicable disease co-morbidities were identified as high-risk populations. PMID:29267306

  15. Common statistical and research design problems in manuscripts submitted to high-impact psychiatry journals: what editors and reviewers want authors to know.

    PubMed

    Harris, Alex H S; Reeder, Rachelle; Hyun, Jenny K

    2009-10-01

    Journal editors and statistical reviewers are often in the difficult position of catching serious problems in submitted manuscripts after the research is conducted and data have been analyzed. We sought to learn from editors and reviewers of major psychiatry journals what common statistical and design problems they most often find in submitted manuscripts and what they wished to communicate to authors regarding these issues. Our primary goal was to facilitate communication between journal editors/reviewers and researchers/authors and thereby improve the scientific and statistical quality of research and submitted manuscripts. Editors and statistical reviewers of 54 high-impact psychiatry journals were surveyed to learn what statistical or design problems they encounter most often in submitted manuscripts. Respondents completed the survey online. The authors analyzed survey text responses using content analysis procedures to identify major themes related to commonly encountered statistical or research design problems. Editors and reviewers (n=15) who handle manuscripts from 39 different high-impact psychiatry journals responded to the survey. The most commonly cited problems regarded failure to map statistical models onto research questions, improper handling of missing data, not controlling for multiple comparisons, not understanding the difference between equivalence and difference trials, and poor controls in quasi-experimental designs. The scientific quality of psychiatry research and submitted reports could be greatly improved if researchers became sensitive to, or sought consultation on frequently encountered methodological and analytic issues.
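One recurring reviewer complaint listed above, failure to control for multiple comparisons, has a one-line remedy in the Bonferroni correction. A minimal sketch (illustrative; more powerful procedures such as Holm or false-discovery-rate control exist):

```python
def bonferroni(p_values, alpha=0.05):
    """Flag which p-values remain significant after a Bonferroni
    correction: each is compared against alpha / m for m tests."""
    m = len(p_values)
    return [p < alpha / m for p in p_values]
```

For example, with two tests the per-test threshold drops from 0.05 to 0.025, so p = 0.04 no longer counts as significant.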

  16. Evaluating the Male and Female Students' Welcome of the Cultural-Art Plans

    ERIC Educational Resources Information Center

    Eshaghian, Masomeh; Saadatmand, Zohreh

    2015-01-01

    The purpose of this study is to evaluate the welcome of culture-art plans in the high schools of Khomeyni Shahr City from the perspectives of the educational coaches and students in 2013. The present study is a descriptive-survey research. The statistical population of this study includes the educational coaches and students participating in the…

  17. Constructed Response Tests in the NELS:88 High School Effectiveness Study. National Education Longitudinal Study of 1988 Second Followup. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Pollock, Judith M.; And Others

    This report describes an experiment in constructed response testing undertaken in conjunction with the National Education Longitudinal Study of 1988 (NELS:88). Constructed response questions are those that require students to produce their own response rather than selecting the correct answer from several options. Participants in this experiment…

  18. Reconnection properties in Kelvin-Helmholtz instabilities

    NASA Astrophysics Data System (ADS)

    Vernisse, Y.; Lavraud, B.; Eriksson, S.; Gershman, D. J.; Dorelli, J.; Pollock, C. J.; Giles, B. L.; Aunai, N.; Avanov, L. A.; Burch, J.; Chandler, M. O.; Coffey, V. N.; Dargent, J.; Ergun, R.; Farrugia, C. J.; Genot, V. N.; Graham, D.; Hasegawa, H.; Jacquey, C.; Kacem, I.; Khotyaintsev, Y. V.; Li, W.; Magnes, W.; Marchaudon, A.; Moore, T. E.; Paterson, W. R.; Penou, E.; Phan, T.; Retino, A.; Schwartz, S. J.; Saito, Y.; Sauvaud, J. A.; Schiff, C.; Torbert, R. B.; Wilder, F. D.; Yokota, S.

    2017-12-01

    Kelvin-Helmholtz instabilities provide a natural laboratory for studying strong guide-field reconnection processes. In particular, unlike the usual dayside magnetopause, the conditions across the magnetopause in KH vortices are quasi-symmetric, with small differences in beta and a low magnetic shear angle. We study these properties by means of a statistical analysis of the high-resolution data of the Magnetospheric Multiscale mission. Several Kelvin-Helmholtz instability events past the terminator plane and a long-lasting dayside instability event were used to produce this statistical analysis. Early results show consistency between the data and the theory. In addition, the results emphasize the importance of the thickness of the magnetopause as a driver of magnetic reconnection in low-magnetic-shear events.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marekova, Elisaveta

    Series of relatively large earthquakes in different regions of the Earth are studied. The regions chosen have high seismic activity and good contemporary networks for recording seismic events. The main purpose of this investigation is to describe analytically the seismic process in space and time. We consider the statistical distributions of the distances and the times between consecutive earthquakes (so-called pair analysis). Studies on approximating the statistical distributions of the parameters of consecutive seismic events indicate the existence of characteristic functions that describe them best. Such a mathematical description allows the distributions of the examined parameters to be compared to other model distributions.
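Pair analysis of the times between consecutive earthquakes often starts with the coefficient of variation of the inter-event times, which separates Poisson-like (memoryless) occurrence from temporal clustering. A hedged sketch of that first step (illustrative; not the authors' exact method):

```python
def interevent_cv(times):
    """Coefficient of variation of inter-event times for a sorted
    sequence of event times. CV ≈ 1 is consistent with a Poisson
    process; CV > 1 suggests temporal clustering; CV < 1 suggests
    quasi-periodic occurrence."""
    gaps = [t2 - t1 for t1, t2 in zip(times, times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return var ** 0.5 / mean
```

The same statistic can be applied to inter-event distances, giving a compact first look at the distributions the study then fits with characteristic functions.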

  20. Breaking Free of Sample Size Dogma to Perform Innovative Translational Research

    PubMed Central

    Bacchetti, Peter; Deeks, Steven G.; McCune, Joseph M.

    2011-01-01

    Innovative clinical and translational research is often delayed or prevented by reviewers’ expectations that any study performed in humans must be shown in advance to have high statistical power. This supposed requirement is not justifiable and is contradicted by the reality that increasing sample size produces diminishing marginal returns. Studies of new ideas often must start small (sometimes even with an N of 1) because of cost and feasibility concerns, and recent statistical work shows that small sample sizes for such research can produce more projected scientific value per dollar spent than larger sample sizes. Renouncing false dogma about sample size would remove a serious barrier to innovation and translation. PMID:21677197

  1. Statistical and Detailed Analysis on Fiber Reinforced Self-Compacting Concrete Containing Admixtures- A State of Art of Review

    NASA Astrophysics Data System (ADS)

    Athiyamaan, V.; Mohan Ganesh, G.

    2017-11-01

    Self-compacting concrete (SCC) is one of the special concretes that has the ability to flow and consolidate under its own weight and completely fill the formwork even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any requirement for vibration. Researchers all over the world are developing high performance concrete by adding various fibers and admixtures in different proportions. Different kinds of fibers, such as glass, steel, carbon, polypropylene and aramid fibers, provide improvements in concrete properties like tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. This review includes a fundamental study of fiber reinforced self-compacting concrete with admixtures: its rheological properties, its mechanical properties, and an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organized into seven basic chapters: introduction; phenomenal study of material properties; review of self-compacting concrete; overview of fiber reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments (a statistical approach); summary of existing works on FRSCC and statistical modeling, with literature review; and conclusion. It is essential to know the recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber reinforced concrete and SCC in order to do effective research on fiber reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain complete knowledge of polymer-based self-compacting fiber reinforced concrete.

  2. Predictors of high out-of-pocket healthcare expenditure: an analysis using Bangladesh household income and expenditure survey, 2010.

    PubMed

    Molla, Azaher Ali; Chi, Chunhuei; Mondaca, Alicia Lorena Núñez

    2017-01-31

    Predictors of high out-of-pocket household healthcare expenditure are essential for creating effective health system finance policy. In Bangladesh, 63.3% of health expenditure is out-of-pocket and borne by households. It is imperative to know what determines household health expenditure. This study aims to investigate the predictors of high out-of-pocket household healthcare expenditure, with the aim of putting forward policy recommendations on equity in financial burden. The Bangladesh household income and expenditure survey 2010 provides the data for this study. Predictors of high out-of-pocket household healthcare expenditure were analyzed using multiple linear regression. We modeled the non-linear relationship using a logarithmic (log-linear) form of the regression. Heteroscedasticity and multicollinearity were checked using the Breusch-Pagan/Cook-Weisberg and VIF tests. Normality of the residuals was checked using a kernel density curve. We applied the required adjustments for survey data, so that standard errors and parameter estimates are valid. Presence of chronic disease and household income were found to be the most influential and statistically significant (p < 0.001) predictors of high household healthcare expenditure. Households in rural areas spend 7% less than urban dwellers. The results show that a 100% increase in female members in a family leads to a 2% decrease in household health expenditure. Household income, health shocks in families, and family size are other statistically significant predictors of household healthcare expenditure. The proportions of elderly and under-five members in the family show some positive influence on health expenditure, though statistically nonsignificant. The findings call for an emphasis on prevention of chronic diseases, as chronic disease is a strong predictor of household health expenditure. Innovative insurance schemes need to be devised to prevent households from being impoverished due to health shocks in the family.
Policy makers are urged to design an alternative source of healthcare financing in Bangladesh to minimize the burden of high OOP healthcare expenditure.
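Effects like "rural households spend 7% less" come from exponentiating a coefficient of the log-transformed expenditure model: for outcome log(y), a coefficient β on a dummy or one-unit change implies a 100·(exp(β) − 1)% change in y. A sketch of that interpretation step (the coefficient value in the test is illustrative, not taken from the paper):

```python
import math

def percent_effect(beta):
    """Percent change in the (untransformed) outcome for a one-unit
    change in a predictor, when the regression models log(outcome)."""
    return 100.0 * (math.exp(beta) - 1.0)
```

For small coefficients the percent effect is close to 100·β, but exponentiating is the exact interpretation and matters for larger coefficients.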

  3. Constraining nuclear photon strength functions by the decay properties of photo-excited states

    NASA Astrophysics Data System (ADS)

    Isaak, J.; Savran, D.; Krtička, M.; Ahmed, M. W.; Beller, J.; Fiori, E.; Glorius, J.; Kelley, J. H.; Löher, B.; Pietralla, N.; Romig, C.; Rusev, G.; Scheck, M.; Schnorrenberger, L.; Silva, J.; Sonnabend, K.; Tonchev, A. P.; Tornow, W.; Weller, H. R.; Zweidinger, M.

    2013-12-01

    A new approach for constraining the low-energy part of the electric dipole Photon Strength Function (E1-PSF) is presented. Experiments at the Darmstadt High-Intensity Photon Setup and the High Intensity γ-Ray Source have been performed to investigate the decay properties of 130Te between 5.50 and 8.15 MeV excitation energy. In particular, the average γ-ray branching ratio to the ground state and the population intensity of low-lying excited states have been studied. A comparison to the statistical model shows that the latter is sensitive to the low-energy behavior of the E1-PSF, while the average ground-state branching ratio cannot be described by the statistical model in the energy range between 5.5 and 6.5 MeV.

  4. Evolution of Precipitation Particle Size Distributions within MC3E Systems and its Impact on Aerosol-Cloud-Precipitation Interactions: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollias, Pavlos

    2017-08-08

    This is a multi-institutional, collaborative project using observations and modeling to study the evolution (e.g., formation and growth) of hydrometeors in continental convective clouds. Our contribution was in data analysis for the generation of high-value cloud and precipitation products and the derivation of cloud statistics for model validation. There are two areas of data analysis to which we contributed: i) the development of novel, state-of-the-art dual-wavelength radar algorithms for the retrieval of cloud microphysical properties and ii) the evaluation of large-domain, high-resolution models using comprehensive multi-sensor observations. Our research group developed statistical summaries from numerous sensors and developed retrievals of vertical air motion in deep convection.

  5. Statistical Study of High-Velocity Compact Clouds Based on the Complete CO Imagings of the Central Molecular Zone

    NASA Astrophysics Data System (ADS)

    Tokuyama, Sekito; Oka, Tomoharu; Takekawa, Shunya; Yamada, Masaya; Iwata, Yuhei; Tsujimoto, Shiho

    2017-01-01

    High-velocity compact clouds (HVCCs) are a population of peculiar clouds detected in the Central Molecular Zone (CMZ) of our Galaxy. They have compact appearances (< 5 pc) and large velocity widths (> 50 km s-1). Several explanations for the origin of HVCCs have been proposed, e.g., a series of supernova (SN) explosions (Oka et al. 1999) or a gravitational kick by a point-like gravitational source (Oka et al. 2016). To investigate the statistical properties of HVCCs, a complete list of them is acutely necessary. However, the previous list is not complete, since the identification procedure included both automated processes and manual selection (Nagai 2008). Here we developed an automated procedure to identify HVCCs in spectral line data.

  6. The case for increasing the statistical power of eddy covariance ecosystem studies: why, where and how?

    PubMed

    Hill, Timothy; Chocholek, Melanie; Clement, Robert

    2017-06-01

    Eddy covariance (EC) continues to provide invaluable insights into the dynamics of Earth's surface processes. However, despite its many strengths, spatial replication of EC at the ecosystem scale is rare. High equipment costs are likely to be partially responsible. This contributes to the low sampling, and even lower replication, of ecoregions in Africa, Oceania (excluding Australia) and South America. The level of replication matters as it directly affects statistical power. While the ergodicity of turbulence and temporal replication allow an EC tower to provide statistically robust flux estimates for its footprint, these principles do not extend to larger ecosystem scales. Despite the challenge of spatially replicating EC, it is clearly of interest to be able to use EC to provide statistically robust flux estimates for larger areas. We ask: How much spatial replication of EC is required for statistical confidence in our flux estimates of an ecosystem? We provide the reader with tools to estimate the number of EC towers needed to achieve a given statistical power. We show that for a typical ecosystem, around four EC towers are needed to have 95% statistical confidence that the annual flux of an ecosystem is nonzero. Furthermore, if the true flux is small relative to instrument noise and spatial variability, the number of towers needed can rise dramatically. We discuss approaches for improving statistical power and describe one solution: an inexpensive EC system that could help by making spatial replication more affordable. However, we note that diverting limited resources from other key measurements in order to allow spatial replication may not be optimal, and a balance needs to be struck. While individual EC towers are well suited to providing fluxes from the flux footprint, we emphasize that spatial replication is essential for statistically robust fluxes if a wider ecosystem is being studied. 
© 2016 The Authors Global Change Biology Published by John Wiley & Sons Ltd.
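
The sample-size logic behind the "around four towers" figure can be sketched with the standard power formula. This is a back-of-the-envelope illustration using a normal approximation (a real design would use the t distribution and the paper's own noise estimates); the flux and standard-deviation numbers below are invented:

```python
import math
from statistics import NormalDist

def towers_needed(true_flux, sd_between_towers, alpha=0.05, power=0.95):
    """Normal-approximation sample size for detecting a nonzero mean flux:
    n = ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2, rounded up."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    n = ((z_alpha + z_power) * sd_between_towers / true_flux) ** 2
    return math.ceil(n)

# Illustrative numbers: an annual flux of 100 (arbitrary units) with a
# between-tower standard deviation of 50 (instrument noise plus spatial
# variability combined).
print(towers_needed(100, 50))  # → 4
```

Note how quickly the requirement grows when the flux is small relative to the variability, as the abstract warns: halving the flux-to-noise ratio roughly quadruples the tower count.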

  7. Statistical estimation of femur micro-architecture using optimal shape and density predictors.

    PubMed

    Lekadir, Karim; Hazrati-Marangalou, Javad; Hoogendoorn, Corné; Taylor, Zeike; van Rietbergen, Bert; Frangi, Alejandro F

    2015-02-26

    The personalization of trabecular micro-architecture has been recently shown to be important in patient-specific biomechanical models of the femur. However, high-resolution in vivo imaging of bone micro-architecture using existing modalities is still infeasible in practice due to the associated acquisition times, costs, and X-ray radiation exposure. In this study, we describe a statistical approach for the prediction of the femur micro-architecture based on the more easily extracted subject-specific bone shape and mineral density information. To this end, a training sample of ex vivo micro-CT images is used to learn the existing statistical relationships within the low- and high-resolution image data. More specifically, optimal bone shape and mineral density features are selected based on their predictive power and used within a partial least squares regression model to estimate the unknown trabecular micro-architecture within the anatomical models of new subjects. The experimental results demonstrate the accuracy of the proposed approach, with average errors of 0.07 for both the degree of anisotropy and tensor norms. Copyright © 2015 Elsevier Ltd. All rights reserved.
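
The partial least squares step can be illustrated with a minimal one-component PLS1 regression. This is a simplified sketch of the general technique, not the authors' multi-feature pipeline; the toy data and dimensions are invented:

```python
def pls1_one_component(X, y):
    """One-component PLS1: a weight vector w proportional to X'y (so the
    score t = Xw has maximal covariance with y), then a least-squares fit
    of y on t. Returns the fitted values."""
    n, p = len(X), len(X[0])
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    norm = sum(wj * wj for wj in w) ** 0.5
    w = [wj / norm for wj in w]
    t = [sum(X[i][j] * w[j] for j in range(p)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return [b * ti for ti in t]

# Two perfectly collinear predictors -- an extreme version of the
# correlated shape/density features the abstract describes. One latent
# component suffices where ordinary least squares would be ill-conditioned.
X = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0], [4.0, 4.0]]
y = [2.0, 4.0, 6.0, 8.0]  # y = 2 * x1
fitted = pls1_one_component(X, y)
print(fitted)  # recovers y exactly on this toy data
```

This robustness to correlated predictors is a common reason PLS is preferred over ordinary regression for high-dimensional image features.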

  8. A study on the use of Gumbel approximation with the Bernoulli spatial scan statistic.

    PubMed

    Read, S; Bath, P A; Willett, P; Maheswaran, R

    2013-08-30

    The Bernoulli version of the spatial scan statistic is a well established method of detecting localised spatial clusters in binary labelled point data, a typical application being the epidemiological case-control study. A recent study suggests the inferential accuracy of several versions of the spatial scan statistic (principally the Poisson version) can be improved, at little computational cost, by using the Gumbel distribution, a method now available in SaTScan(TM) (www.satscan.org). We study in detail the effect of this technique when applied to the Bernoulli version and demonstrate that it is highly effective, albeit with some increase in false alarm rates at certain significance thresholds. We explain how this increase is due to the discrete nature of the Bernoulli spatial scan statistic and demonstrate that it can affect even small p-values. Despite this, we argue that the Gumbel method is actually preferable for very small p-values. Furthermore, we extend previous research by running benchmark trials on 12 000 synthetic datasets, thus demonstrating that the overall detection capability of the Bernoulli version (i.e. ratio of power to false alarm rate) is not noticeably affected by the use of the Gumbel method. We also provide an example application of the Gumbel method using data on hospital admissions for chronic obstructive pulmonary disease. Copyright © 2013 John Wiley & Sons, Ltd.
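
The Gumbel device the paper evaluates can be sketched generically: instead of ranking the observed maximum statistic among Monte Carlo replicates, fit a Gumbel distribution to the replicates and read the p-value from its tail, resolving p-values smaller than 1/(replicates + 1). The toy below uses a moment-based fit and a stand-in null statistic; it is not SaTScan's implementation:

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(samples):
    """Method-of-moments Gumbel fit: scale from the SD, location from the mean."""
    n = len(samples)
    mean = sum(samples) / n
    sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
    beta = sd * math.sqrt(6) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def gumbel_pvalue(observed, mu, beta):
    """P(max statistic >= observed) under the fitted Gumbel."""
    return 1.0 - math.exp(-math.exp(-(observed - mu) / beta))

random.seed(1)
# Stand-in null replicates: the maximum of 100 unit exponentials per
# replicate (a placeholder for the max likelihood-ratio over scan windows).
null_max = [max(random.expovariate(1.0) for _ in range(100)) for _ in range(999)]
mu, beta = gumbel_fit(null_max)
p = gumbel_pvalue(12.0, mu, beta)
print(p)  # a small tail p-value, below the 1/1000 Monte Carlo resolution
```

As the paper notes, this smooth approximation is what makes very small p-values usable; with a discrete statistic such as the Bernoulli version, the fit can distort false alarm rates at some thresholds.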

  9. Accounting for competing risks in randomized controlled trials: a review and recommendations for improvement.

    PubMed

    Austin, Peter C; Fine, Jason P

    2017-04-15

    In studies with survival or time-to-event outcomes, a competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. Specialized statistical methods must be used to analyze survival data in the presence of competing risks. We conducted a review of randomized controlled trials with survival outcomes that were published in high-impact general medical journals. Of 40 studies that we identified, 31 (77.5%) were potentially susceptible to competing risks. However, in the majority of these studies, the potential presence of competing risks was not accounted for in the statistical analyses that were described. Of the 31 studies potentially susceptible to competing risks, 24 (77.4%) reported the results of a Kaplan-Meier survival analysis, while only five (16.1%) reported using cumulative incidence functions to estimate the incidence of the outcome over time in the presence of competing risks. The former approach will tend to result in an overestimate of the incidence of the outcome over time, while the latter approach will result in unbiased estimation of the incidence of the primary outcome over time. We provide recommendations on the analysis and reporting of randomized controlled trials with survival outcomes in the presence of competing risks. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
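
The bias the review highlights is easy to reproduce: with competing events, one minus the Kaplan-Meier estimator (treating competing events as censoring) exceeds the cumulative incidence function. A self-contained simulation with hypothetical hazards, not drawn from any trial:

```python
import random

random.seed(2)
n = 5000
# Two competing exponential event types; whichever fires first is observed.
# True incidence of the primary event is 0.10 / (0.10 + 0.15) = 0.4.
records = []
for _ in range(n):
    t1 = random.expovariate(0.10)   # primary event
    t2 = random.expovariate(0.15)   # competing event
    records.append((min(t1, t2), 1 if t1 <= t2 else 2))
records.sort()

at_risk = n
surv_all = 1.0   # overall survival (either event)
km_surv = 1.0    # Kaplan-Meier treating competing events as censoring
cif = 0.0        # cumulative incidence of the primary event
for _, cause in records:
    if cause == 1:
        cif += surv_all * (1 / at_risk)   # Aalen-Johansen increment
        km_surv *= 1 - 1 / at_risk
    surv_all *= 1 - 1 / at_risk
    at_risk -= 1

print(round(cif, 3), round(1 - km_surv, 3))  # 1 - KM overshoots 0.4 badly
```

The cumulative incidence estimate lands near the true 0.4, while the complement of the Kaplan-Meier curve climbs far above it, which is exactly why the authors recommend cumulative incidence functions when competing risks are present.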

  10. High-Throughput Assay Optimization and Statistical Interpolation of Rubella-Specific Neutralizing Antibody Titers

    PubMed Central

    Lambert, Nathaniel D.; Pankratz, V. Shane; Larrabee, Beth R.; Ogee-Nwankwo, Adaeze; Chen, Min-hsin; Icenogle, Joseph P.

    2014-01-01

    Rubella remains a social and economic burden due to the high incidence of congenital rubella syndrome (CRS) in some countries. For this reason, an accurate and efficient high-throughput measure of antibody response to vaccination is an important tool. In order to measure rubella-specific neutralizing antibodies in a large cohort of vaccinated individuals, a high-throughput immunocolorimetric system was developed. Statistical interpolation models were applied to the resulting titers to refine quantitative estimates of neutralizing antibody titers relative to the assayed neutralizing antibody dilutions. This assay, including the statistical methods developed, can be used to assess the neutralizing humoral immune response to rubella virus and may be adaptable for assessing the response to other viral vaccines and infectious agents. PMID:24391140

  11. Risk Factors of Voice Disorders and Impact of Vocal Hygiene Awareness Program Among Teachers in Public Schools in Egypt.

    PubMed

    Bolbol, Sarah A; Zalat, Marwa M; Hammam, Rehab A M; Elnakeb, Nasser L

    2017-03-01

    Even though many studies have explored the problem of voice disorders among teachers worldwide, this problem is still not adequately studied in Egypt. The present study was conducted to investigate the risk factors of voice disorders among an Egyptian sample of school teachers, to measure the effect of a vocal hygiene awareness program on them, and to investigate their vocal cord lesions. One hundred fifty-six teachers working in public schools and 180 administrative workers in the Faculty of Medicine in the same city participated in this study. They completed a self-administered questionnaire investigating voice disorders, and were subjected to a voice awareness program and a clinical examination. Voice-related symptoms and Voice Handicap Index were statistically significantly higher among teachers compared with the control subjects. Work duration and a high number of classes per week (≥15) were the most statistically significant factors influencing a teacher's voice. Three months after application of the vocal hygiene awareness program, the teachers who were studied showed a statistically significant increase in their awareness of vocal hygiene tips. Egyptian teachers working in public schools are dealing with classes that include a great number of students, and they have to cope with inadequate facilities and limited assisting resources. Therefore, they are highly exposed to the risk of voice-related disorders. Increasing awareness about healthy vocal behavior in their occupations will help in improving their quality of work and in minimizing any permanent impairments and/or disability. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  12. Manufacturing Industries with High Concentrations of Scientists and Engineers Lead in 1965-77 Employment Growth. Science Resources Studies Highlights, April 20, 1979.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC. Div. of Science Resources Studies.

    Presented are the results of a survey of over 100,000 manufacturing establishments, conducted for the National Science Foundation by the Bureau of Labor Statistics, covering average annual employment for calendar year 1977. Industries whose relative concentration of scientists and engineers was high in 1977, such as petroleum refining, chemicals,…

  13. The Effect of Music Participation on Mathematical Achievement and Overall Academic Achievement of High School Students

    ERIC Educational Resources Information Center

    Cox, H. A.; Stephens, L. J.

    2006-01-01

    A study was conducted on high school students, comparing those with some music credits to those with none. No statistically significant difference was found in their mean math grade point averages (GPA) or their mean cumulative GPAs. Students were then separated into two groups based on the number of music credits. Students who had earned at least…

  14. Studies on Antiviral and Immuno-Regulation Activity of Low Molecular Weight Fucoidan from Laminaria japonica

    NASA Astrophysics Data System (ADS)

    Sun, Taohua; Zhang, Xinhui; Miao, Ying; Zhou, Yang; Shi, Jie; Yan, Meixing; Chen, Anjin

    2018-06-01

    The in vitro and in vivo antiviral activity, and the effect on the immune system, of two fucoidan fractions with low molecular weight and different sulfate content from Laminaria japonica (LMW fucoidans) were investigated in order to examine the possible mechanism. In vitro, type I influenza virus, adenovirus and parainfluenza virus I were used to infect Hep-2, Hela and MDCK cells, respectively, and the 50% tissue culture infective dose was calculated to assess the antiviral activity of the two LMW fucoidans. The results indicated that, compared with the control group, both LMW fucoidans had remarkable antiviral activity in vitro at middle and high doses, while at low doses their antiviral activity was not statistically different from that of the blank control group; there was also no statistically significant difference between the two LMW fucoidans in antiviral activity. In vivo, LMW fucoidans prolonged the survival time of virus-infected mice and significantly improved their lung index, with a statistically significant difference from the control group (p < 0.01). However, the difference in survival time between the two LMW fucoidan groups was not statistically significant (p > 0.05). This study showed that both LMW fucoidans (LF1, LF2) could increase the thymus index, spleen index, phagocytic index, phagocytosis coefficient and half hemolysin value at middle and high doses, which suggests that LMW fucoidans may play an antiviral role by improving the quality of immune organs, immune cell phagocytosis, and humoral immunity.

  15. Analysis of traffic crash data in Kentucky (2011-2015).

    DOT National Transportation Integrated Search

    2016-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 2011 through 2015. A primary objective of this study was to determine average crash statistics for Kentucky highways. Rates were calculated for various types of high...

  16. Analysis of Traffic Crash Data in Kentucky (2012-2016).

    DOT National Transportation Integrated Search

    2017-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 2012 through 2016. A primary objective of this study was to determine average crash statistics for Kentucky highways. Rates were calculated for various types of high...

  17. Analysis of traffic crash data in Kentucky (2009-2013).

    DOT National Transportation Integrated Search

    2014-09-01

    This report documents an analysis of traffic crash data in Kentucky for the years of 2009 through 2013. A primary objective of this study was to determine average crash statistics for Kentucky highways. Rates were calculated for various types of high...

  18. Use of iPhone technology in improving acetabular component position in total hip arthroplasty.

    PubMed

    Tay, Xiau Wei; Zhang, Benny Xu; Gayagay, George

    2017-09-01

    Improper acetabular cup positioning is associated with a high risk of complications after total hip arthroplasty. The aim of our study is to objectively compare 3 methods, namely (1) free hand, (2) alignment jig (Sputnik), and (3) iPhone application, to identify an easy, reproducible, and accurate method of improving acetabular cup placement. We designed a simple setup and carried out a simple experiment (see Method section). Using statistical analysis, the difference in inclination angles using the iPhone application compared with the freehand method was found to be statistically significant (F(2,51) = 4.17, P = .02) in the untrained group. No statistically significant difference was detected for the other groups. This suggests a potential role for iPhone applications in helping junior surgeons overcome the steep learning curve.

  19. Detection of changes of high-frequency activity by statistical time-frequency analysis in epileptic spikes

    PubMed Central

    Kobayashi, Katsuhiro; Jacobs, Julia; Gotman, Jean

    2013-01-01

    Objective A novel type of statistical time-frequency analysis was developed to elucidate changes of high-frequency EEG activity associated with epileptic spikes. Methods The method uses the Gabor Transform and detects changes of power in comparison to background activity using t-statistics that are controlled by the false discovery rate (FDR) to correct type I error of multiple testing. The analysis was applied to EEGs recorded at 2000 Hz from three patients with mesial temporal lobe epilepsy. Results Spike-related increase of high-frequency oscillations (HFOs) was clearly shown in the FDR-controlled t-spectra: it was most dramatic in spikes recorded from the hippocampus when the hippocampus was the seizure onset zone (SOZ). Depression of fast activity was observed immediately after the spikes, especially consistently in the discharges from the hippocampal SOZ. It corresponded to the slow wave part in case of spike-and-slow-wave complexes, but it was noted even in spikes without apparent slow waves. In one patient, a gradual increase of power above 200 Hz preceded spikes. Conclusions FDR-controlled t-spectra clearly detected the spike-related changes of HFOs that were unclear in standard power spectra. Significance We developed a promising tool to study the HFOs that may be closely linked to the pathophysiology of epileptogenesis. PMID:19394892
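
The false-discovery-rate control applied to the time-frequency t-statistics above is the standard Benjamini-Hochberg step-up rule. A generic sketch of that rule (not the authors' EEG pipeline; the example p-values are invented):

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Benjamini-Hochberg step-up rule: sort the m p-values, find the
    largest rank k with p_(k) <= (k / m) * q, and reject hypotheses with
    the k smallest p-values. Returns one reject/accept flag per input."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * q:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject

# Five tests: the three small p-values survive at q = 0.05, the rest do not.
flags = benjamini_hochberg([0.01, 0.02, 0.03, 0.5, 0.9], q=0.05)
print(flags)  # → [True, True, True, False, False]
```

Controlling the FDR rather than the family-wise error rate is what keeps such massively multiple time-frequency testing sensitive while still limiting type I errors.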

  20. Evaluation and comparison of statistical methods for early temporal detection of outbreaks: A simulation-based study

    PubMed Central

    Le Strat, Yann

    2017-01-01

    The objective of this paper is to evaluate a panel of statistical algorithms for temporal outbreak detection. Based on a large dataset of simulated weekly surveillance time series, we performed a systematic assessment of 21 statistical algorithms, 19 implemented in the R package surveillance and two other methods. We estimated false positive rate (FPR), probability of detection (POD), probability of detection during the first week, sensitivity, specificity, negative and positive predictive values and F1-measure for each detection method. Then, to identify the factors associated with these performance measures, we ran multivariate Poisson regression models adjusted for the characteristics of the simulated time series (trend, seasonality, dispersion, outbreak sizes, etc.). The FPR ranged from 0.7% to 59.9% and the POD from 43.3% to 88.7%. Some methods had a very high specificity, up to 99.4%, but a low sensitivity. Methods with a high sensitivity (up to 79.5%) had a low specificity. All methods had a high negative predictive value, over 94%, while positive predictive values ranged from 6.5% to 68.4%. Multivariate Poisson regression models showed that performance measures were strongly influenced by the characteristics of time series. Past or current outbreak size and duration strongly influenced detection performances. PMID:28715489
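
The weekly performance measures reported above reduce to confusion-matrix arithmetic once each week carries an outbreak flag and an alarm flag. A schematic sketch (the weekly flags below are invented, not from the simulated surveillance series):

```python
def detection_metrics(outbreak_weeks, alarm_weeks):
    """Sensitivity, specificity, PPV, NPV, and F1 from weekly boolean flags."""
    pairs = list(zip(outbreak_weeks, alarm_weeks))
    tp = sum(1 for o, a in pairs if o and a)
    fp = sum(1 for o, a in pairs if not o and a)
    fn = sum(1 for o, a in pairs if o and not a)
    tn = sum(1 for o, a in pairs if not o and not a)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    f1 = 2 * ppv * sens / (ppv + sens)
    return {"sensitivity": sens, "specificity": spec,
            "ppv": ppv, "npv": npv, "f1": f1}

# Ten weeks with one four-week outbreak; the alarm fires during weeks
# 4-6 of the outbreak and once falsely in week 2.
outbreak = [0, 0, 0, 1, 1, 1, 1, 0, 0, 0]
alarm    = [0, 1, 0, 1, 1, 1, 0, 0, 0, 0]
m = detection_metrics(outbreak, alarm)
print(m["sensitivity"], m["ppv"])  # → 0.75 0.75
```

Quantities like the probability of detection per outbreak, rather than per week, require grouping consecutive outbreak weeks before counting, which is one reason the paper reports both families of measures.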
