Statistical Significance Testing.
ERIC Educational Resources Information Center
McLean, James E., Ed.; Kaufman, Alan S., Ed.
1998-01-01
The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…
Antecedents of students' achievement in statistics
NASA Astrophysics Data System (ADS)
Awaludin, Izyan Syazana; Razak, Ruzanna Ab; Harris, Hezlin; Selamat, Zarehan
2015-02-01
The applications of statistics in most fields are vast. Many degree programmes at local universities require students to enroll in at least one statistics course. The standard of these courses varies across degree programmes because of students' diverse academic backgrounds, some of which are far removed from statistics. The high failure rate in statistics courses among non-science stream students has been a concern every year. The purpose of this research is to investigate the antecedents of students' achievement in statistics. A total of 272 students participated in the survey. Multiple linear regression was applied to examine the relationship between the factors and achievement. We found that statistics anxiety was a significant predictor of students' achievement. We also found that students' age has a significant effect on achievement: older students are more likely to achieve lower scores in statistics. Students' level of study also has a significant impact on their achievement in statistics.
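The regression approach described in this abstract can be sketched as follows. All data below are synthetic and the coefficient values are hypothetical; this is only a minimal illustration of fitting and reading a multiple linear regression, not a reproduction of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 272  # same sample size as the study; the data themselves are synthetic

# Hypothetical predictors: statistics anxiety, age, and level of study
anxiety = rng.normal(3.0, 1.0, n)
age = rng.normal(24.0, 4.0, n)
level = rng.integers(1, 4, n).astype(float)

# Synthetic outcome built so that anxiety and age depress scores (as the study reports)
score = 80.0 - 4.0 * anxiety - 0.5 * age + 2.0 * level + rng.normal(0.0, 5.0, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), anxiety, age, level])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(dict(zip(["intercept", "anxiety", "age", "level"], beta.round(2))))
```

With 272 observations, the fitted coefficients recover the signs of the simulated effects: negative for anxiety and age, positive for level of study.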
Statistically significant relational data mining:
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model; statisticians favor such models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Significant results: statistical or clinical?
2016-01-01
The null hypothesis significance test method is popular in biological and medical research. Many researchers use this method without fully understanding it, though it has both merits and shortcomings. This article presents its shortcomings, as well as several complementary or alternative methods, such as the estimated effect size and the confidence interval. PMID:27066201
Statistical Significance of Threading Scores
Fayyaz Movaghar, Afshin; Launay, Guillaume; Schbath, Sophie; Gibrat, Jean-François
2012-01-01
Abstract We present a general method for assessing threading score significance. The threading score of a protein sequence, threaded onto a given structure, should be compared with the threading score distribution of random amino-acid sequences of the same length threaded onto the same structure; small p-values indicate significantly high scores. We claim that, due to general protein contact map properties, this reference distribution is a Weibull extreme value distribution whose parameters depend on the threading method, the structure, the length of the query, and the random sequence simulation model used. These parameters can be estimated off-line with simulated sequence samples for different sequence lengths. They can then be interpolated at the exact length of a query, enabling quick computation of the p-value. PMID:22149633
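The p-value computation described here can be sketched as follows. The null scores below are synthetic draws rather than real threading scores, and SciPy's two-parameter Weibull stands in for the extreme value distribution named in the abstract; the parameter values are assumptions for illustration.

```python
from scipy import stats

rng = __import__("numpy").random.default_rng(1)

# Stand-in for the offline simulation step: threading scores of random sequences
# of the query's length on the target structure (here just synthetic Weibull draws)
null_scores = stats.weibull_min.rvs(c=2.0, scale=10.0, size=5000, random_state=rng)

# Estimate the Weibull parameters from the simulated sample (location fixed at 0)
c_hat, loc_hat, scale_hat = stats.weibull_min.fit(null_scores, floc=0)

# p-value of an observed threading score: chance a random sequence scores as high
observed = 25.0
p_value = stats.weibull_min.sf(observed, c_hat, loc_hat, scale_hat)
print(f"p = {p_value:.2g}")
```

In the real method the fitted parameters would be tabulated offline for a grid of sequence lengths and interpolated at the query length; here they are fitted once for a single length.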
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
Statistical significance of the gallium anomaly
Giunti, Carlo; Laveder, Marco
2011-06-15
We calculate the statistical significance of the anomalous deficit of electron neutrinos measured in the radioactive source experiments of the GALLEX and SAGE solar neutrino detectors, taking into account the uncertainty of the detection cross section. We find that the statistical significance of the anomaly is ≈3.0σ. A fit of the data in terms of neutrino oscillations favors, at ≈2.7σ, short-baseline electron neutrino disappearance with respect to the null hypothesis of no oscillations.
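The conversion from a measured deficit to a significance in units of σ can be illustrated with a simple Gaussian approximation. The numbers below are hypothetical, not the paper's combined fit, which also folds in the detection cross-section uncertainty.

```python
from scipy import stats

# Illustrative numbers only: ratio of measured to predicted event rate
ratio, sigma_ratio = 0.86, 0.05   # hypothetical deficit: R = 0.86 +/- 0.05

# How many standard deviations the ratio lies below the no-anomaly value R = 1
z = (1.0 - ratio) / sigma_ratio
p_one_sided = stats.norm.sf(z)
print(f"deficit significance: {z:.1f} sigma (p = {p_one_sided:.2g})")
```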
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
Statistical Significance vs. Practical Significance: An Exploration through Health Education
ERIC Educational Resources Information Center
Rosen, Brittany L.; DeMaria, Andrea L.
2012-01-01
The purpose of this paper is to examine the differences between statistical and practical significance, including strengths and criticisms of both methods, as well as provide information surrounding the application of various effect sizes and confidence intervals within health education research. Provided are recommendations, explanations and…
Determining the Statistical Significance of Relative Weights
ERIC Educational Resources Information Center
Tonidandel, Scott; LeBreton, James M.; Johnson, Jeff W.
2009-01-01
Relative weight analysis is a procedure for estimating the relative importance of correlated predictors in a regression equation. Because the sampling distribution of relative weights is unknown, researchers using relative weight analysis are unable to make judgments regarding the statistical significance of the relative weights. J. W. Johnson…
Statistical significance testing and clinical trials.
Krause, Merton S
2011-09-01
For clinical purposes, the efficacy of treatments is better expressed in terms of the treatments' outcome distributions and their overlap than in terms of the statistical significance of the differences between these distributions' means, because clinical practice is primarily concerned with the outcome of each individual client rather than with the mean of the varied outcomes in any group of clients. The obtained outcome distributions for the comparison groups of all competently designed and executed randomized clinical trials should be publicly available, whatever the statistical significance of the mean differences among the groups, because all of these outcome distributions provide clinically useful information about the efficacy of the treatments compared.
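The contrast drawn here between mean differences and overlapping outcome distributions can be illustrated with synthetic data: a "significant" t-test can coexist with heavy overlap, summarized by the probability that a randomly chosen treated client outscores a randomly chosen control client (the common-language effect size). The group means and spreads below are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical outcome scores for treatment and control groups
treatment = rng.normal(52.0, 10.0, 500)
control = rng.normal(48.0, 10.0, 500)

# The mean difference is statistically "significant"...
t, p = stats.ttest_ind(treatment, control)

# ...yet the distributions overlap heavily: the chance that a random treated
# client outscores a random control client is only modestly above 0.5
p_superiority = (treatment[:, None] > control[None, :]).mean()
print(f"p = {p:.1g}, P(treated > control) = {p_superiority:.2f}")
```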
Systematic identification of statistically significant network measures
NASA Astrophysics Data System (ADS)
Ziv, Etay; Koytcheff, Robin; Middendorf, Manuel; Wiggins, Chris
2005-01-01
We present a graph embedding space (i.e., a set of measures on graphs) for performing statistical analyses of networks. Key improvements over existing approaches include discovery of “motif hubs” (multiple overlapping significant subgraphs), computational efficiency relative to subgraph census, and flexibility (the method is easily generalizable to weighted and signed graphs). The embedding space is based on scalars, functionals of the adjacency matrix representing the network. Scalars are global, involving all nodes; although they can be related to subgraph enumeration, there is not a one-to-one mapping between scalars and subgraphs. Improvements in network randomization and significance testing—we learn the distribution rather than assuming Gaussianity—are also presented. The resulting algorithm establishes a systematic approach to the identification of the most significant scalars and suggests machine-learning techniques for network classification.
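Two of the simplest such scalars, functionals of the adjacency matrix, can be computed directly. The small graph below is hypothetical, and these particular scalars (edge and triangle counts) are standard choices for illustration rather than the paper's own list.

```python
import numpy as np

# Adjacency matrix of a small undirected graph: triangle 0-1-2 plus pendant node 3
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])

# Two simple scalars (functionals of the adjacency matrix)
n_edges = A.sum() // 2                    # each edge appears twice in a symmetric A
n_triangles = np.trace(A @ A @ A) // 6    # each triangle yields 6 closed 3-walks
print(n_edges, n_triangles)
```

As the abstract notes, such scalars involve all nodes at once: tr(A³) relates to subgraph (triangle) enumeration without enumerating subgraphs explicitly.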
Finding Statistically Significant Communities in Networks
Lancichinetti, Andrea; Radicchi, Filippo; Ramasco, José J.; Fortunato, Santo
2011-01-01
Community structure is one of the main structural features of networks, revealing both their internal organization and the similarity of their elementary units. Despite the large variety of methods proposed to detect communities in graphs, there is a great need for multi-purpose techniques, able to handle different types of datasets and the subtleties of community structure. In this paper we present OSLOM (Order Statistics Local Optimization Method), the first method capable of detecting clusters in networks while accounting for edge directions, edge weights, overlapping communities, hierarchies, and community dynamics. It is based on the local optimization of a fitness function expressing the statistical significance of clusters with respect to random fluctuations, which is estimated with tools of Extreme and Order Statistics. OSLOM can be used alone or as a refinement procedure for partitions/covers delivered by other techniques. We have also implemented sequential algorithms combining OSLOM with other fast techniques, so that the community structure of very large networks can be uncovered. Our method has performance comparable to that of the best existing algorithms on artificial benchmark graphs. Several applications to real networks are shown as well. OSLOM is implemented in freely available software (http://www.oslom.org), and we believe it will be a valuable tool in the analysis of networks. PMID:21559480
Social significance of community structure: Statistical view
NASA Astrophysics Data System (ADS)
Li, Hui-Jia; Daniels, Jasmine J.
2015-01-01
Community structure analysis is a powerful tool for social networks that can simplify their topological and functional analysis considerably. However, since community detection methods have random factors and real social networks obtained from complex systems always contain error edges, evaluating the significance of a partitioned community structure is an urgent and important question. In this paper, integrating the specific characteristics of real society, we present a framework to analyze the significance of a social community. The dynamics of social interactions are modeled by identifying social leaders and corresponding hierarchical structures. Instead of a direct comparison with the average outcome of a random model, we compute the similarity of a given node with the leader by the number of common neighbors. To determine the membership vector, an efficient community detection algorithm is proposed based on the position of the nodes and their corresponding leaders. Then, using a log-likelihood score, the tightness of the community can be derived. Based on the distribution of community tightness, we establish a connection between p-value theory and network analysis, and then we obtain a significance measure of statistical form. Finally, the framework is applied to both benchmark networks and real social networks. Experimental results show that our work can be used in many fields, such as determining the optimal number of communities, analyzing the social significance of a given community, comparing the performance among various algorithms, etc.
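The common-neighbor similarity used here to compare a node with its leader can be sketched on a toy network. The adjacency list below is hypothetical, and the leader is simply taken to be the highest-degree node.

```python
# Small undirected friendship network as an adjacency list (hypothetical data)
neighbors = {
    0: {1, 2, 3, 4},   # node 0: the "leader" (highest degree)
    1: {0, 2, 3},
    2: {0, 1, 3},
    3: {0, 1, 2},
    4: {0, 5},
    5: {4},
}

def common_neighbor_similarity(u, leader):
    """Similarity of node u to a leader: the number of shared neighbors."""
    return len(neighbors[u] & neighbors[leader])

# Node 1 shares neighbors {2, 3} with leader 0; node 5 shares only {4}
print(common_neighbor_similarity(1, 0), common_neighbor_similarity(5, 0))
```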
Achieving statistical power through research design sensitivity.
Beck, C T
1994-11-01
The challenge for nurse researchers is to design their intervention studies with sufficient sensitivity to detect the treatment effects they are investigating. In order to meet this challenge, researchers must understand the factors that influence statistical power. Underpowered studies can result in a majority of null results in a research area when, in fact, the interventions are effective. The sensitivity of a research design is not a function of just one element of the design but of the entire research design: its plan, implementation and statistical analysis. When discussing factors that can increase a research design's statistical power, attention is most often focused on increasing sample size. This paper addresses a variety of factors and techniques, other than increasing sample size, that nurse researchers can use to enhance the sensitivity of a research design so that it can attain adequate power.
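The role of power can be made concrete with a standard calculation (ours, not the paper's): the power of a two-sided, two-sample t-test, computed from the noncentral t distribution in SciPy. It shows why a medium effect studied with 30 participants per group is likely to yield a null result.

```python
import numpy as np
from scipy import stats

def two_sample_power(d, n_per_group, alpha=0.05):
    """Power of a two-sided, two-sample t-test for standardized effect size d."""
    df = 2 * n_per_group - 2
    nc = d * np.sqrt(n_per_group / 2)              # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    # Probability of exceeding the critical value under the alternative
    return stats.nct.sf(t_crit, df, nc) + stats.nct.cdf(-t_crit, df, nc)

# A medium effect (d = 0.5): 30 per group is underpowered, 64 per group reaches ~0.8
print(f"n=30: {two_sample_power(0.5, 30):.2f}")
print(f"n=64: {two_sample_power(0.5, 64):.2f}")
```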
Statistical Significance of Clustering using Soft Thresholding
Huang, Hanwen; Liu, Yufeng; Yuan, Ming; Marron, J. S.
2015-01-01
Clustering methods have led to a number of important discoveries in bioinformatics and beyond. A major challenge in their use is determining which clusters represent important underlying structure, as opposed to spurious sampling artifacts. This challenge is especially serious, and very few methods are available, when the data are very high in dimension. Statistical Significance of Clustering (SigClust) is a recently developed cluster evaluation tool for high dimensional low sample size data. An important component of the SigClust approach is the very definition of a single cluster as a subset of data sampled from a multivariate Gaussian distribution. The implementation of SigClust requires the estimation of the eigenvalues of the covariance matrix for the null multivariate Gaussian distribution. We show that the original eigenvalue estimation can lead to a test that suffers from severe inflation of type-I error, in the important case where there are a few very large eigenvalues. This paper addresses this critical challenge using a novel likelihood based soft thresholding approach to estimate these eigenvalues, which leads to a much improved SigClust. Major improvements in SigClust performance are shown by both mathematical analysis, based on the new notion of Theoretical Cluster Index, and extensive simulation studies. Applications to some cancer genomic data further demonstrate the usefulness of these improvements. PMID:26755893
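A heavily simplified SigClust-style test can be sketched as follows. Everything here is a stand-in: the data are synthetic, the 2-means step is a crude inline implementation, and the null covariance is the raw sample covariance rather than the eigenvalue estimate that the paper improves.

```python
import numpy as np

def cluster_index(x, iters=20):
    """2-means cluster index: within-cluster SS over total SS (smaller = tighter split)."""
    centers = x[np.argsort(x[:, 0])[[0, -1]]]       # crude init at the coord-0 extremes
    for _ in range(iters):
        labels = np.argmin(((x[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in (0, 1)])
    wss = sum(((x[labels == j] - centers[j]) ** 2).sum() for j in (0, 1))
    return wss / ((x - x.mean(axis=0)) ** 2).sum()

rng = np.random.default_rng(2)

# Observed data: two clearly separated Gaussian clusters in 5 dimensions
data = np.vstack([rng.normal(0.0, 1.0, (50, 5)), rng.normal(4.0, 1.0, (50, 5))])
ci_obs = cluster_index(data)

# Null model: a single Gaussian; its covariance here is the raw sample covariance,
# standing in for the eigenvalue-based estimate that SigClust refines
mean, cov = data.mean(axis=0), np.cov(data.T)
null_ci = np.array([cluster_index(rng.multivariate_normal(mean, cov, len(data)))
                    for _ in range(200)])
p = (1 + np.sum(null_ci <= ci_obs)) / (1 + len(null_ci))
print(f"cluster index {ci_obs:.2f}, p = {p:.3f}")
```

A small observed cluster index relative to the null distribution indicates a genuinely two-cluster structure; the paper's contribution is precisely a better estimate of the null Gaussian's eigenvalues.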
[Significance of medical statistics in insurance medicine].
Becher, J
2001-03-01
Knowledge of medical statistics is of great benefit to every insurance medical officer: it facilitates communication with actuaries, allows officers to make their own calculations, and is the basis for correctly interpreting medical journals. Only about 20% of original work in medicine today is published without statistics or with only descriptive statistics, and this share is falling. The reader of medical publications should be in a position to make a critical analysis of the methodology and content, since one cannot always rely on the conclusions drawn by the authors: statistical errors appear very frequently in medical publications. Due to the specific methodological features involved, the assessment of meta-analyses demands special attention. The number of published meta-analyses has risen 40-fold over the last ten years. Important examples of the practical use of statistical methods in insurance medicine include estimating extra mortality from published survival analyses and evaluating diagnostic test results. The purpose of this article is to highlight statistical problems and issues of relevance to insurance medicine and to establish the bases for understanding them.
Statistics Anxiety, State Anxiety during an Examination, and Academic Achievement
ERIC Educational Resources Information Center
Macher, Daniel; Paechter, Manuela; Papousek, Ilona; Ruggeri, Kai; Freudenthaler, H. Harald; Arendasy, Martin
2013-01-01
Background: A large proportion of students identify statistics courses as the most anxiety-inducing courses in their curriculum. Many students feel impaired by feelings of state anxiety in the examination and therefore probably show lower achievements. Aims: The study investigates how statistics anxiety, attitudes (e.g., interest, mathematical…
Attitudes and Achievement in Statistics: A Meta-Analysis Study
ERIC Educational Resources Information Center
Emmioglu, Esma; Capa-Aydin, Yesim
2012-01-01
This study examined the relationships among statistics achievement and four components of attitudes toward statistics (Cognitive Competence, Affect, Value, and Difficulty) as assessed by the SATS. Meta-analysis results revealed that the size of relationships differed by the geographical region in which the studies were conducted as well as by the…
Did Tanzania Achieve the Second Millennium Development Goal? Statistical Analysis
ERIC Educational Resources Information Center
Magoti, Edwin
2016-01-01
Development Goal "Achieve universal primary education", the challenges faced, along with the way forward towards achieving the fourth Sustainable Development Goal "Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all". Statistics show that Tanzania has made very promising steps…
The Effects of Statistical Analysis Software and Calculators on Statistics Achievement
ERIC Educational Resources Information Center
Christmann, Edwin P.
2009-01-01
This study compared the effects of microcomputer-based statistical software and hand-held calculators on the statistics achievement of university males and females. The subjects, 73 graduate students enrolled in univariate statistics classes at a public comprehensive university, were randomly assigned to groups that used either microcomputer-based…
ERIC Educational Resources Information Center
Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar
2014-01-01
The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…
Testing the Difference of Correlated Agreement Coefficients for Statistical Significance
ERIC Educational Resources Information Center
Gwet, Kilem L.
2016-01-01
This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…
The Use of Meta-Analytic Statistical Significance Testing
ERIC Educational Resources Information Center
Polanin, Joshua R.; Pigott, Terri D.
2015-01-01
Meta-analysis multiplicity, the concept of conducting multiple tests of statistical significance within one review, is an underdeveloped literature. We address this issue by considering how Type I errors can impact meta-analytic results, suggest how statistical power may be affected through the use of multiplicity corrections, and propose how…
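One standard multiplicity correction that might be applied in such a review is Holm's step-down procedure (chosen here for illustration; the article surveys corrections more broadly). The p-values below are made up.

```python
import numpy as np

def holm_bonferroni(p_values, alpha=0.05):
    """Holm's step-down correction: boolean mask of rejected hypotheses."""
    p = np.asarray(p_values)
    m = len(p)
    reject = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(np.argsort(p)):
        if p[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break   # once one ordered test fails, all larger p-values fail too
    return reject

# Five tests from one review: three look "significant" without correction
p_vals = [0.001, 0.012, 0.041, 0.20, 0.34]
print(holm_bonferroni(p_vals))
```

After the correction only the two smallest p-values survive, illustrating how multiplicity corrections trade Type I error control against power.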
Reviewer Bias for Statistically Significant Results: A Reexamination.
ERIC Educational Resources Information Center
Fagley, N. S.; McKinney, I. Jean
1983-01-01
Reexamines the article by Atkinson, Furlong, and Wampold (1982) and questions their conclusion that reviewers were biased toward statistically significant results. A statistical power analysis shows the power of their bogus study was low. Low power in a study reporting nonsignificant findings is a valid reason for recommending not to publish.…
The questioned p value: clinical, practical and statistical significance.
Jiménez-Paneque, Rosa
2016-09-09
The use of the p-value and statistical significance has been questioned from the early 1980s to the present day. Much has been discussed about it in the field of statistics and its applications, especially in epidemiology and public health. As a matter of fact, the p-value and its equivalent, statistical significance, are difficult concepts to grasp for the many health professionals who are in some way involved in research applied to their work areas. However, its meaning should be clear in intuitive terms even though it is based on theoretical concepts of the field of statistics. This paper attempts to present the p-value as a concept that applies to everyday life and is therefore intuitively simple, but whose proper use cannot be separated from theoretical and methodological elements of inherent complexity. The reasons behind the criticism of the p-value and its isolated use are intuitively explained, mainly the need to demarcate statistical significance from clinical significance, and some of the recommended remedies for these problems are discussed as well. The paper finally refers to the current trend to vindicate the p-value, appealing to the convenience of its use in certain situations, and to the recent statement of the American Statistical Association in this regard.
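The demarcation between statistical and clinical significance can be shown concretely: with a large enough sample, a clinically trivial difference produces a tiny p-value. The data below are synthetic, and Cohen's d serves as the effect-size summary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Two treatments whose true means differ by a clinically trivial 0.05 units
a = rng.normal(0.00, 1.0, 20000)
b = rng.normal(0.05, 1.0, 20000)

t, p = stats.ttest_ind(a, b)
# Standardized effect size (Cohen's d) stays tiny despite the tiny p-value
d = (b.mean() - a.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
print(f"p = {p:.1g}, Cohen's d = {d:.2f}")
```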
Statistical significance test for transition matrices of atmospheric Markov chains
NASA Technical Reports Server (NTRS)
Vautard, Robert; Mo, Kingtse C.; Ghil, Michael
1990-01-01
Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
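A minimal Monte Carlo test of this kind can be sketched as follows, with a synthetic three-regime sequence standing in for atmospheric data: shuffling the sequence preserves the regime frequencies while destroying temporal order, giving a null distribution for any transition count.

```python
import numpy as np

def transition_counts(seq, k):
    """k x k matrix of one-step transition counts for a regime sequence."""
    counts = np.zeros((k, k), dtype=int)
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts

rng = np.random.default_rng(3)
k, n = 3, 400

# Synthetic regime sequence from a persistent Markov chain (regimes tend to last)
P = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
seq = [0]
for _ in range(n - 1):
    seq.append(rng.choice(k, p=P[seq[-1]]))
seq = np.array(seq)

obs = transition_counts(seq, k)

# Monte Carlo null: shuffling preserves regime frequencies but destroys order
n_sims = 500
null_00 = np.array([transition_counts(rng.permutation(seq), k)[0, 0]
                    for _ in range(n_sims)])

# One-sided Monte Carlo p-value: is the 0 -> 0 self-transition unusually frequent?
p = (1 + np.sum(null_00 >= obs[0, 0])) / (1 + n_sims)
print(f"observed {obs[0, 0]} self-transitions, p = {p:.3f}")
```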
SOARing Into Strategic Planning: Engaging Nurses to Achieve Significant Outcomes.
Wadsworth, Barbara; Felton, Fiona; Linus, Rita
2016-01-01
In 2013, a new system chief nursing officer engaged the nursing leaders and staff in an Appreciative Inquiry process utilizing strengths, opportunities, aspirations, and results (SOAR) and a Journey of Excellence to assess and understand the current environment. The ultimate goal was to engage all nurses in strategic planning and goal setting to connect their patient care to the system strategic initiatives. This work led to the creation of a nursing vision, a revised professional practice model, and greater council alignment, resulting in significant positive change and ongoing advancement throughout the system. The shared decision-making structure was key to the process, with a direct connection of each council's goals leading to the successful achievement of 34 of the 36 goals in 2 years. This article outlines the process, tools, and staff engagement strategies used to achieve system-wide success. This methodology has improved outcomes across the organization in both small and system-wide work groups. This work can easily be replicated and adapted to help disparate staffs brought together through mergers or acquisitions become aligned as a new team. The process, model, and framework provide structure and produce significant outcomes that recognize and celebrate the work of individual entities while aligning future strategies and goals.
Statistical Significance and Effect Size: Two Sides of a Coin.
ERIC Educational Resources Information Center
Fan, Xitao
This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…
Interpretation of Statistical Significance Testing: A Matter of Perspective.
ERIC Educational Resources Information Center
McClure, John; Suen, Hoi K.
1994-01-01
This article compares three models that have been the foundation for approaches to the analysis of statistical significance in early childhood research--the Fisherian and the Neyman-Pearson models (both considered "classical" approaches), and the Bayesian model. The article concludes that all three models have a place in the analysis of research…
Your Chi-Square Test Is Statistically Significant: Now What?
ERIC Educational Resources Information Center
Sharpe, Donald
2015-01-01
Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data…
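The first of the four approaches, calculating residuals, can be sketched directly (the contingency table below is hypothetical): standardized Pearson residuals point to the cells driving a significant omnibus chi-square.

```python
import numpy as np
from scipy import stats

# Hypothetical 2 x 3 contingency table (group x response category)
observed = np.array([[30, 20, 10],
                     [10, 20, 30]])

chi2, p, dof, expected = stats.chi2_contingency(observed)

# Standardized (Pearson) residuals: cells with |residual| > 2 are the main
# sources of a significant omnibus result
residuals = (observed - expected) / np.sqrt(expected)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")
print(residuals.round(2))
```

Here the corner cells (residuals of about ±2.24) are the source of the significant result, while the middle column contributes nothing.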
Estimation of the geochemical threshold and its statistical significance
Miesch, A.T.
1981-01-01
A statistic is proposed for estimating the geochemical threshold and its statistical significance, or it may be used to identify a group of extreme values that can be tested for significance by other means. The statistic is the maximum gap between adjacent values in an ordered array after each gap has been adjusted for the expected frequency. The values in the ordered array are geochemical values transformed by either ln(?? - ??) or ln(?? - ??) and then standardized so that the mean is zero and the variance is unity. The expected frequency is taken from a fitted normal curve with unit area. The midpoint of an adjusted gap that exceeds the corresponding critical value may be taken as an estimate of the geochemical threshold, and the associated probability indicates the likelihood that the threshold separates two geochemical populations. The adjusted gap test may fail to identify threshold values if the variation tends to be continuous from background values to the higher values that reflect mineralized ground. However, the test will serve to identify other anomalies that may be too subtle to have been noted by other means. © 1981.
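A crude sketch of the adjusted-gap idea follows, with synthetic data and a simplified frequency adjustment; the paper's exact transformation and critical values are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic data: a lognormal geochemical background plus a few anomalous values
background = rng.lognormal(mean=1.0, sigma=0.4, size=200)
anomalies = rng.lognormal(mean=4.0, sigma=0.2, size=8)
values = np.concatenate([background, anomalies])

# Log-transform and standardize to zero mean and unit variance
logv = np.log(values)
mu, sd = logv.mean(), logv.std()
z = np.sort((logv - mu) / sd)

# Gaps between adjacent ordered values, weighted by the fitted normal density at
# the gap midpoint (a crude stand-in for the paper's expected-frequency adjustment)
gaps = np.diff(z)
mids = (z[:-1] + z[1:]) / 2
adj_gaps = gaps * stats.norm.pdf(mids)

# The midpoint of the largest adjusted gap, back-transformed to the original scale
i = np.argmax(adj_gaps)
threshold = np.exp(mids[i] * sd + mu)
print(f"candidate threshold near {threshold:.1f}")
```

In the real method the largest adjusted gap would be compared against tabulated critical values before being accepted as a threshold.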
Beyond Statistical Significance: Implications of Network Structure on Neuronal Activity
Vlachos, Ioannis; Aertsen, Ad; Kumar, Arvind
2012-01-01
It is a common and good practice in experimental sciences to assess the statistical significance of measured outcomes. For this, the probability of obtaining the actual results is estimated under the assumption of an appropriately chosen null-hypothesis. If this probability is smaller than some threshold, the results are deemed statistically significant and the researchers are content in having revealed, within their own experimental domain, a “surprising” anomaly, possibly indicative of a hitherto hidden fragment of the underlying “ground-truth”. What is often neglected, though, is the actual importance of these experimental outcomes for understanding the system under investigation. We illustrate this point by giving practical and intuitive examples from the field of systems neuroscience. Specifically, we use the notion of embeddedness to quantify the impact of a neuron's activity on its downstream neurons in the network. We show that the network response strongly depends on the embeddedness of stimulated neurons and that embeddedness is a key determinant of the importance of neuronal activity on local and downstream processing. We extrapolate these results to other fields in which networks are used as a theoretical framework. PMID:22291581
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance
Kramer, Karen L.; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children’s growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children’s monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children’s growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children’s growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children’s growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance. PMID:26938742
Boise Inc. St. Helens Paper Mill Achieves Significant Fuel Savings
Not Available
2008-05-01
This case study describes how the Boise Inc. paper mill in St. Helens, Oregon, achieved annual savings of approximately 154,000 MMBtu and more than $1 million after receiving a DOE Save Energy Now energy assessment and implementing recommendations to improve the efficiency of its steam system.
Emotional Intelligence Skills: Significant Factors in Freshmen Achievement and Retention.
ERIC Educational Resources Information Center
Nelson, Darwin B.; Nelson, Kaye W.
This study investigated the role of emotional skills in the academic achievement and retention of university freshmen. The research group was a randomly selected sample of first semester freshmen students (N=165), and cumulative grade point average was used as the criterion for academic success. The study was designed to investigate: (a) the…
ERIC Educational Resources Information Center
Spinella, Sarah
2011-01-01
As result replicability is essential to science and difficult to achieve through external replicability, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
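The bootstrap the abstract refers to can be sketched in a few lines of Python: resample the observed data with replacement many times, recompute the statistic each time, and read a confidence interval off the percentiles of the resampled values. The data below are made up for illustration.

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=10_000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for an arbitrary statistic."""
    rng = random.Random(seed)
    boots = sorted(
        stat(rng.choices(sample, k=len(sample)))  # resample with replacement
        for _ in range(n_boot)
    )
    return boots[int(alpha / 2 * n_boot)], boots[int((1 - alpha / 2) * n_boot) - 1]

data = [4.1, 5.2, 3.8, 6.0, 5.5, 4.9, 5.1, 4.4, 5.8, 4.7]  # made-up observations
low, high = bootstrap_ci(data)
print(f"mean = {statistics.mean(data):.2f}, 95% bootstrap CI = ({low:.2f}, {high:.2f})")
```

Unlike NHST, the interval directly conveys the plausible range of the statistic without assuming a particular sampling distribution.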
Fostering Students' Statistical Literacy through Significant Learning Experience
ERIC Educational Resources Information Center
Krishnan, Saras
2015-01-01
A major objective of statistics education is to develop students' statistical literacy that enables them to be educated users of data in context. Teaching statistics in today's educational settings is not an easy feat because teachers have a huge task in keeping up with the demands of the new generation of learners. The present day students have…
A Tutorial on Hunting Statistical Significance by Chasing N.
Szucs, Denes
2016-01-01
There is increasing concern about the replicability of studies in psychology and cognitive neuroscience. Hidden data dredging (also called p-hacking) is a major contributor to this crisis because it substantially increases Type I error, resulting in a much larger proportion of false positive findings than the usually expected 5%. In order to build better intuition to avoid, detect and criticize some typical problems, here I systematically illustrate the large impact of some easy-to-implement, and therefore perhaps frequent, data dredging techniques on boosting false positive findings. I illustrate several forms of two special cases of data dredging. First, researchers may violate the data collection stopping rules of null hypothesis significance testing by repeatedly checking for statistical significance with various numbers of participants. Second, researchers may group participants post hoc along potential but unplanned independent grouping variables. The first approach 'hacks' the number of participants in studies, the second approach 'hacks' the number of variables in the analysis. I demonstrate the high number of false positive findings generated by these techniques with data from true null distributions. I also illustrate that it is extremely easy to introduce strong bias into data by very mild selection and re-testing. Similar, usually undocumented data dredging steps can easily lead to 20-50% or more false positives. PMID:27713723
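The first form of data dredging described above, checking for significance after every batch of participants and stopping as soon as p < .05, is easy to simulate. The sketch below uses a simplified one-sample z-test on data drawn from a true null; the look schedule and sample sizes are illustrative assumptions, not values from the paper.

```python
import math
import random

def p_two_sided(xs):
    """One-sample z-test of H0: mean = 0, assuming known sd = 1."""
    n = len(xs)
    z = (sum(xs) / n) * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability

def experiment(rng, looks=(10, 20, 30, 40, 50)):
    """Draw null data and 'peek' at each interim sample size,
    declaring success the first time p < .05."""
    xs = []
    for n in looks:
        while len(xs) < n:
            xs.append(rng.gauss(0, 1))
        if p_two_sided(xs) < 0.05:
            return True
    return False

rng = random.Random(42)
n_sim = 2000
hits = sum(experiment(rng) for _ in range(n_sim))
print(f"false-positive rate with five interim looks: {hits / n_sim:.3f}")
```

With five looks the long-run false-positive rate is roughly 13-14%, far above the nominal 5%, even though every data point comes from the null distribution.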
Creating a Middle Grades Environment that Significantly Improves Student Achievement
ERIC Educational Resources Information Center
L'Esperance, Mark E.; Lenker, Ethan; Bullock, Ann; Lockamy, Becky; Mason, Cathy
2013-01-01
This article offers an overview of the framework that Sampson County Public Schools (North Carolina) used to critically reflect on the current state of their middle grades schools. The article also highlights the changes that resulted from the district-wide analysis and the ways in which these changes led to a significant increase in the academic…
Barnacle geese achieve significant energetic savings by changing posture.
Tickle, Peter G; Nudds, Robert L; Codd, Jonathan R
2012-01-01
Here we report the resting metabolic rate in barnacle geese (Branta leucopsis) and provide evidence for the significant energetic effect of posture. Under laboratory conditions flow-through respirometry together with synchronous recording of behaviour enabled a calculation of how metabolic rate varies with posture. Our principal finding is that standing bipedally incurs a 25% increase in metabolic rate compared to birds sitting on the ground. In addition to the expected decrease in energy consumption of hindlimb postural muscles when sitting, we hypothesise that a change in breathing mechanics represents one potential mechanism for at least part of the observed difference in energetic cost. Due to the significant effect of posture, future studies of resting metabolic rates need to take into account and/or report differences in posture.
Shukla, R.; Yu Daohai; Fulk, F.
1995-12-31
Short-term toxicity tests with aquatic organisms are a valuable measurement tool in the assessment of the toxicity of effluents, environmental samples and single chemicals. Currently toxicity tests are utilized in a wide range of US EPA regulatory activities, including effluent discharge compliance. In the current approach for determining the No Observed Effect Concentration, an effluent concentration is presumed safe if there is no statistically significant difference in toxicant response versus control response. The conclusion of a safe concentration may be due to the fact that it truly is safe, or alternatively, that the ability of the statistical test to detect an effect, given its existence, is inadequate. Results of research on a new statistical approach, the basis of which is to move away from a demonstration of no difference to a demonstration of equivalence, will be discussed. The concept of observed confidence distributions, first suggested by Cox, is proposed as a measure of the strength of evidence for practically equivalent responses between a given effluent concentration and the control. The research included determination of intervals of practically equivalent responses as a function of the variability of control response. The approach is illustrated using reproductive data from tests with Ceriodaphnia dubia and survival and growth data from tests with fathead minnow. The data are from the US EPA's National Reference Toxicant Database.
Tipping points in the arctic: eyeballing or statistical significance?
Carstensen, Jacob; Weydmann, Agata
2012-02-01
Arctic ecosystems have experienced and are projected to experience continued large increases in temperature and declines in sea ice cover. It has been hypothesized that small changes in ecosystem drivers can fundamentally alter ecosystem functioning, and that this might be particularly pronounced for Arctic ecosystems. We present a suite of simple statistical analyses to identify changes in the statistical properties of data, emphasizing that changes in the standard error should be considered in addition to changes in mean properties. The methods are exemplified using sea ice extent, and suggest that the loss rate of sea ice accelerated by a factor of ~5 in 1996, as reported in other studies, but increases in random fluctuations, as an early warning signal, were observed as early as 1990. We recommend employing the proposed methods more systematically for analyzing tipping points to document effects of climate change in the Arctic.
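The idea of watching for increased random fluctuations as an early warning signal can be illustrated with a rolling measure of variability. The series below is synthetic (a linear decline whose noise level triples partway through), not the sea ice data analyzed in the paper.

```python
import random
import statistics

rng = random.Random(7)
# Synthetic declining series whose noise level triples after t = 60
# (loosely analogous to growing random fluctuations before a tipping point).
series = [100 - 0.3 * t + rng.gauss(0, 1 if t < 60 else 3) for t in range(100)]

def rolling_std(xs, window=20):
    """Variability of first differences in a sliding window; differencing turns
    a linear trend into a constant, so the std reflects only the fluctuations."""
    out = []
    for i in range(len(xs) - window + 1):
        w = xs[i:i + window]
        diffs = [b - a for a, b in zip(w, w[1:])]
        out.append(statistics.stdev(diffs))
    return out

stds = rolling_std(series)
print(f"fluctuation level, early window: {stds[0]:.2f}, late window: {stds[-1]:.2f}")
```

The rolling fluctuation level rises well before any change in the mean trend would be detectable, which is the paper's argument for monitoring the standard error alongside mean properties.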
Statistics in Public Understanding of Science review: How to achieve high statistical standards?
Crettaz von Roten, Fabienne
2016-02-01
This article proposes a checklist to improve statistical reporting in the manuscripts submitted to Public Understanding of Science. Generally, these guidelines will allow the reviewers (and readers) to judge whether the evidence provided in the manuscript is relevant. The article ends with other suggestions for a better statistical quality of the journal.
Statistical downscaling rainfall using artificial neural network: significantly wetter Bangkok?
NASA Astrophysics Data System (ADS)
Vu, Minh Tue; Aribarg, Thannob; Supratid, Siriporn; Raghavan, Srivatsan V.; Liong, Shie-Yui
2016-11-01
Artificial neural network (ANN) is an established technique with a flexible mathematical structure that is capable of identifying complex nonlinear relationships between input and output data. The present study utilizes ANN as a method of statistically downscaling global climate models (GCMs) during the rainy season at meteorological site locations in Bangkok, Thailand. The study illustrates the applications of the feed forward back propagation using large-scale predictor variables derived from both the ERA-Interim reanalyses data and present day/future GCM data. The predictors are first selected over different grid boxes surrounding Bangkok region and then screened by using principal component analysis (PCA) to filter the best correlated predictors for ANN training. The reanalyses downscaled results of the present day climate show good agreement against station precipitation with a correlation coefficient of 0.8 and a Nash-Sutcliffe efficiency of 0.65. The final downscaled results for four GCMs show an increasing trend of precipitation for rainy season over Bangkok by the end of the twenty-first century. The extreme values of precipitation determined using statistical indices show strong increases of wetness. These findings will be useful for policy makers in pondering adaptation measures due to flooding such as whether the current drainage network system is sufficient to meet the changing climate and to plan for a range of related adaptation/mitigation measures.
Wilkinson, Michael
2014-03-01
Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
Assessing statistical significance in multivariable genome wide association analysis
Buzdugan, Laura; Kalisch, Markus; Navarro, Arcadi; Schunk, Daniel; Fehr, Ernst; Bühlmann, Peter
2016-01-01
Motivation: Although Genome Wide Association Studies (GWAS) genotype a very large number of single nucleotide polymorphisms (SNPs), the data are often analyzed one SNP at a time. The low predictive power of single SNPs, coupled with the high significance threshold needed to correct for multiple testing, greatly decreases the power of GWAS. Results: We propose a procedure in which all the SNPs are analyzed in a multiple generalized linear model, and we show its use for extremely high-dimensional datasets. Our method yields P-values for assessing significance of single SNPs or groups of SNPs while controlling for all other SNPs and the family wise error rate (FWER). Thus, our method tests whether or not a SNP carries any additional information about the phenotype beyond that available by all the other SNPs. This rules out spurious correlations between phenotypes and SNPs that can arise from marginal methods because the ‘spuriously correlated’ SNP merely happens to be correlated with the ‘truly causal’ SNP. In addition, the method offers a data driven approach to identifying and refining groups of SNPs that jointly contain informative signals about the phenotype. We demonstrate the value of our method by applying it to the seven diseases analyzed by the Wellcome Trust Case Control Consortium (WTCCC). We show, in particular, that our method is also capable of finding significant SNPs that were not identified in the original WTCCC study, but were replicated in other independent studies. Availability and implementation: Reproducibility of our research is supported by the open-source Bioconductor package hierGWAS. Contact: peter.buehlmann@stat.math.ethz.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153677
Evaluating clinical significance: incorporating robust statistics with normative comparison tests.
van Wieringen, Katrina; Cribbie, Robert A
2014-05-01
The purpose of this study was to evaluate a modified test of equivalence for conducting normative comparisons when distribution shapes are non-normal and variances are unequal. A Monte Carlo study was used to compare the empirical Type I error rates and power of the proposed Schuirmann-Yuen test of equivalence, which utilizes trimmed means, with that of the previously recommended Schuirmann and Schuirmann-Welch tests of equivalence when the assumptions of normality and variance homogeneity are satisfied, as well as when they are not satisfied. The empirical Type I error rates of the Schuirmann-Yuen were much closer to the nominal α level than those of the Schuirmann or Schuirmann-Welch tests, and the power of the Schuirmann-Yuen was substantially greater than that of the Schuirmann or Schuirmann-Welch tests when distributions were skewed or outliers were present. The Schuirmann-Yuen test is recommended for assessing clinical significance with normative comparisons.
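For readers unfamiliar with equivalence testing, a minimal sketch of the plain Schuirmann TOST (two one-sided tests) follows, using a normal approximation to the t distribution; the Schuirmann-Yuen variant evaluated in the study additionally substitutes trimmed means and winsorized variances. The scores and equivalence margin below are invented for illustration.

```python
import math
import statistics

def norm_sf(z):
    """Upper-tail probability of the standard normal."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def schuirmann_tost(sample, norms, margin):
    """Two one-sided tests of H0: 'true difference lies outside +/-margin'.
    Equivalence is claimed when both one-sided p-values are small."""
    diff = statistics.mean(sample) - statistics.mean(norms)
    se = math.sqrt(statistics.variance(sample) / len(sample)
                   + statistics.variance(norms) / len(norms))
    p_lower = norm_sf((diff + margin) / se)  # tests diff <= -margin
    p_upper = norm_sf((margin - diff) / se)  # tests diff >= +margin
    return diff, max(p_lower, p_upper)

# Invented post-treatment scores vs. a normative sample, margin of 2 points.
clinical = [22, 25, 24, 23, 26, 24, 25, 23, 24, 25]
normative = [24, 23, 25, 24, 26, 23, 25, 24, 24, 25]
diff, p = schuirmann_tost(clinical, normative, margin=2.0)
print(f"difference = {diff:.2f}, TOST p = {p:.5f}")
```

Because p < .05 here, the clinical mean is statistically within 2 points of the normative mean, the "equivalent to normal" conclusion that a normative comparison seeks; an ordinary t-test could never support that conclusion, only fail to reject a difference.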
Lies, damned lies and statistics: Clinical importance versus statistical significance in research.
Mellis, Craig
2017-02-28
Correctly performed and interpreted statistics play a crucial role both for those who 'produce' clinical research and for those who 'consume' it. Unfortunately, however, there are many misunderstandings and misinterpretations of statistics by both groups. In particular, there is a widespread lack of appreciation of the severe limitations of p values. This is a particular problem with small sample sizes and low event rates - common features of many published clinical trials. These issues have resulted in increasing numbers of false positive clinical trials (false 'discoveries'), and the well-publicised inability to replicate many of the findings. While chance clearly plays a role in these errors, many more are due to either poorly performed or badly misinterpreted statistics. Consequently, it is essential that whenever p values appear, they be accompanied by both 95% confidence limits and effect sizes. These will enable readers to immediately assess the plausible range of results and whether or not the effect is clinically meaningful.
ERIC Educational Resources Information Center
Primi, Caterina; Donati, Maria Anna; Chiesi, Francesca
2016-01-01
Among the wide range of factors related to the acquisition of statistical knowledge, competence in basic mathematics, including basic probability, has received much attention. In this study, a mediation model was estimated to derive the total, direct, and indirect effects of mathematical competence on statistics achievement taking into account…
Significance probability mapping: the final touch in t-statistic mapping.
Hassainia, F; Petit, D; Montplaisir, J
1994-01-01
Significance Probability Mapping (SPM), based on Student's t-statistic, is widely used for comparing mean brain topography maps of two groups. The map resulting from this process represents the distribution of t-values over the entire scalp. However, t-values by themselves cannot reveal whether or not group differences are significant. Significance levels associated with a few t-values are therefore commonly indicated on map legends to give the reader an idea of the significance levels of t-values. Nevertheless, a precise significance level topography cannot be achieved with these few significance values. We introduce a new kind of map which directly displays significance level topography in order to relieve the reader from converting multiple t-values to their corresponding significance probabilities, and to obtain a good quantification and a better localization of regions with significant differences between groups. As an illustration of this type of map, we present a comparison of EEG activity in Alzheimer's patients and age-matched control subjects for both wakefulness and REM sleep.
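Converting a map of test statistics to a map of significance probabilities is a direct transformation. The sketch below uses a normal approximation to the t distribution (reasonable for the large degrees of freedom of group comparisons) and a hypothetical 3x4 grid standing in for scalp electrode sites.

```python
import math

def t_map_to_p_map(t_grid):
    """Two-sided significance probability for each t-value, using the normal
    approximation (adequate for the large df of typical group comparisons)."""
    return [[math.erfc(abs(t) / math.sqrt(2)) for t in row] for row in t_grid]

# Hypothetical 3x4 grid of t-values standing in for scalp electrode sites.
t_grid = [[0.5, 1.2, 2.3, 3.1],
          [0.1, 1.9, 2.8, 2.0],
          [0.4, 0.9, 1.5, 2.6]]
p_grid = t_map_to_p_map(t_grid)
for row in p_grid:
    print(" ".join(f"{p:.3f}" for p in row))
```

Displaying the p-grid rather than the t-grid is the paper's point: the reader sees the significance topography directly instead of mentally converting a few legend t-values.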
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
ERIC Educational Resources Information Center
Breunig, Nancy A.
Despite the increasing criticism of statistical significance testing by researchers, particularly in the publication of the 1994 American Psychological Association's style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
ERIC Educational Resources Information Center
Monterde-i-Bort, Hector; Frias-Navarro, Dolores; Pascual-Llobell, Juan
2010-01-01
The empirical study we present here deals with a pedagogical issue that has not been thoroughly explored up until now in our field. Previous empirical studies in other sectors have identified the opinions of researchers about this topic, showing that completely unacceptable interpretations have been made of significance tests and other statistical…
ERIC Educational Resources Information Center
Otto, Luther B.
1977-01-01
Girl friends are significant others who influence young men's career aspirations and achievements. Girl friends and same-sex peers evaluate a youth's educational potential using broader criteria than do parents. (Author/MV)
Student Achievement in Undergraduate Statistics: The Potential Value of Allowing Failure
ERIC Educational Resources Information Center
Ferrandino, Joseph A.
2016-01-01
This article details what resulted when I re-designed my undergraduate statistics course to allow failure as a learning strategy and focused on achievement rather than performance. A variety of within and between sample t-tests are utilized to determine the impact of unlimited test and quiz opportunities on student learning on both quizzes and…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
A Review of Post-1994 Literature on Whether Statistical Significance Tests Should Be Banned.
ERIC Educational Resources Information Center
Sullivan, Jeremy R.
This paper summarizes the literature regarding statistical significance testing with an emphasis on: (1) the post-1994 literature in various disciplines; (2) alternatives to statistical significance testing; and (3) literature exploring why researchers have demonstrably failed to be influenced by the 1994 American Psychological Association…
The Historical Growth of Statistical Significance Testing in Psychology--and Its Future Prospects.
ERIC Educational Resources Information Center
Hubbard, Raymond; Ryan, Patricia A.
2000-01-01
Examined the historical growth in the popularity of statistical significance testing using a random sample of data from 12 American Psychological Association journals. Results replicate and extend findings from a study that used only one such journal. Discusses the role of statistical significance testing and the use of replication and…
ERIC Educational Resources Information Center
Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza
2014-01-01
This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
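A simplified version of the proposed procedure can be sketched as follows: compute a distance between the two normalized histograms, then bootstrap both groups from the pooled data to build the null distribution of that distance. The synthetic samples and fixed bin edges below are illustrative assumptions; the actual study resamples satellite cloud-object footprint data.

```python
import math
import random

def hist(xs, edges):
    """Histogram as bin fractions over fixed edges (values outside are dropped)."""
    counts = [0] * (len(edges) - 1)
    for x in xs:
        for i in range(len(counts)):
            if edges[i] <= x < edges[i + 1]:
                counts[i] += 1
                break
    return [c / len(xs) for c in counts]

def euclidean(h1, h2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

def bootstrap_p(xs, ys, edges, n_boot=2000, seed=1):
    """Resample both groups from the pooled data (the shared-distribution null)
    and count how often the resampled distance reaches the observed one."""
    rng = random.Random(seed)
    obs = euclidean(hist(xs, edges), hist(ys, edges))
    pooled = xs + ys
    hits = 0
    for _ in range(n_boot):
        bx = rng.choices(pooled, k=len(xs))
        by = rng.choices(pooled, k=len(ys))
        if euclidean(hist(bx, edges), hist(by, edges)) >= obs:
            hits += 1
    return obs, hits / n_boot

rng = random.Random(0)
group_a = [rng.gauss(0.0, 1.0) for _ in range(150)]  # stand-in for one size category
group_b = [rng.gauss(0.8, 1.0) for _ in range(150)]  # stand-in for another
edges = [-4, -2, -1, 0, 1, 2, 4]
obs, p = bootstrap_p(group_a, group_b, edges)
print(f"Euclidean distance = {obs:.3f}, bootstrap p = {p:.3f}")
```

Swapping `euclidean` for a Jeffries-Matusita or Kuiper distance changes only the test statistic, which is why the paper can compare the three within one bootstrap framework.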
NASA Astrophysics Data System (ADS)
Vermeesch, Pieter
2011-02-01
In my Eos Forum of 24 November 2009 (90(47), 443), I used the chi-square test to reject the null hypothesis that earthquakes occur independent of the weekday to make the point that statistical significance should not be confused with geological significance. Of the five comments on my article, only the one by Sornette and Pisarenko [2011] disputes this conclusion, while the remaining comments take issue with certain aspects of the geophysical case study. In this reply I will address all of these points, after providing some necessary further background about statistical tests. Two types of error can result from a hypothesis test. A Type I error occurs when a true null hypothesis is erroneously rejected by chance. A Type II error occurs when a false null hypothesis is erroneously accepted by chance. By definition, the p value is the probability, under the null hypothesis, of obtaining a test statistic at least as extreme as the one observed. In other words, the smaller the p value, the lower the probability that a Type I error has been made. In light of the exceedingly small p value of the earthquake data set, Tseng and Chen's [2011] assertion that a Type I error has been committed is clearly wrong. How about Type II errors?
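The chi-square goodness-of-fit test against a uniform weekday distribution is straightforward to reproduce. The counts below are invented for illustration and are not the earthquake data from the Eos article; with seven weekdays the test has df = 6, for which the chi-square survival function has a closed form.

```python
import math

def chi2_sf_even_df(x, df):
    """P(X > x) for a chi-square variable with even df, via the closed form
    exp(-x/2) * sum_{i < df/2} (x/2)^i / i!"""
    assert df % 2 == 0 and df > 0
    s, term = 0.0, 1.0
    for i in range(df // 2):
        s += term
        term *= (x / 2) / (i + 1)
    return math.exp(-x / 2) * s

# Hypothetical event counts per weekday (Mon..Sun); NOT the article's data.
observed = [145, 130, 128, 152, 140, 135, 150]
expected = sum(observed) / 7
chi2 = sum((o - expected) ** 2 / expected for o in observed)
p = chi2_sf_even_df(chi2, df=6)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```

Here p is far above .05, so these made-up counts give no evidence of a weekday effect; the article's point is the converse caution, that even an exceedingly small p would not by itself establish geological significance.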
Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P
2013-01-01
We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
Coulson, Melissa; Healey, Michelle; Fidler, Fiona; Cumming, Geoff
2010-01-01
A statistically significant result and a non-significant result may differ little, although significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioral neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs but, for full benefit, such highly desirable statistical reform requires also that researchers interpret CIs without recourse to NHST. PMID:21607077
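The core point, that one significant and one non-significant result can be entirely consistent, is easy to demonstrate with confidence intervals. The two studies below are hypothetical, with near-identical effect estimates; a normal approximation is used throughout.

```python
import math

def ci_and_p(effect, se, z_crit=1.96):
    """95% confidence interval and two-sided p-value (normal approximation)."""
    p = math.erfc(abs(effect / se) / math.sqrt(2))
    return effect - z_crit * se, effect + z_crit * se, p

# Two hypothetical studies with near-identical effect estimates.
for name, effect, se in [("Study A", 0.50, 0.24), ("Study B", 0.45, 0.28)]:
    lo, hi, p = ci_and_p(effect, se)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{name}: effect {effect:.2f}, 95% CI ({lo:.2f}, {hi:.2f}), p = {p:.3f} -> {verdict}")
```

Study A is "significant" and Study B is not, yet the two intervals overlap almost entirely: the estimates are mutually consistent, which is exactly the situation respondents misread when they reasoned via NHST instead of the intervals themselves.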
ERIC Educational Resources Information Center
Thompson, Bruce
This paper evaluates the logic underlying various criticisms of statistical significance testing and makes specific recommendations for scientific and editorial practice that might better increase the knowledge base. Reliance on the traditional hypothesis testing model has led to a major bias against nonsignificant results and to misinterpretation…
ERIC Educational Resources Information Center
Snyder, Patricia; Lawson, Stephen
Magnitude of effect measures (MEMs), when adequately understood and correctly used, are important aids for researchers who do not want to rely solely on tests of statistical significance in substantive result interpretation. The MEM tells how much of the dependent variable can be controlled, predicted, or explained by the independent variables.…
Alphas and Asterisks: The Development of Statistical Significance Testing Standards in Sociology
ERIC Educational Resources Information Center
Leahey, Erin
2005-01-01
In this paper, I trace the development of statistical significance testing standards in sociology by analyzing data from articles published in two prestigious sociology journals between 1935 and 2000. I focus on the role of two key elements in the diffusion literature, contagion and rationality, as well as the role of institutional factors. I…
Statistical Significance of the Trends in Monthly Heavy Precipitation Over the US
Mahajan, Salil; North, Dr. Gerald R.; Saravanan, Dr. R.; Genton, Dr. Marc G.
2012-01-01
Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's tau test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong.
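A non-parametric assessment of trend significance along these lines can be sketched with Kendall's tau and a permutation test (shuffling the time order under the no-trend null). The paper itself uses Monte Carlo parametric and non-parametric bootstrapping; the series below is synthetic.

```python
import random

def kendall_tau(xs):
    """Kendall's tau of a series against its time index (Mann-Kendall style)."""
    n, s = len(xs), 0
    for i in range(n):
        for j in range(i + 1, n):
            if xs[j] > xs[i]:
                s += 1
            elif xs[j] < xs[i]:
                s -= 1
    return 2 * s / (n * (n - 1))

def permutation_p(xs, n_perm=2000, seed=3):
    """Two-sided p for 'no monotone trend': shuffle the time order and count
    how often the shuffled |tau| reaches the observed one."""
    rng = random.Random(seed)
    obs = abs(kendall_tau(xs))
    ys = list(xs)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(ys)
        if abs(kendall_tau(ys)) >= obs:
            hits += 1
    return hits / n_perm

rng = random.Random(11)
# Synthetic 40-year series of heavy-precipitation intensity with a weak upward drift.
series = [50.0 + 0.4 * year + rng.gauss(0, 6) for year in range(40)]
tau = kendall_tau(series)
p = permutation_p(series)
print(f"Kendall tau = {tau:.2f}, permutation p = {p:.3f}")
```

Because the null distribution is built by resampling rather than from a formula, the same machinery works for any trend statistic, which is why the paper's bootstrap and Kendall's tau results agree.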
Weighing the costs of different errors when determining statistical significance during monitoring
Technology Transfer Automated Retrieval System (TEKTRAN)
Selecting appropriate significance levels when constructing confidence intervals and performing statistical analyses with rangeland monitoring data is not a straightforward process. This process is burdened by the conventional selection of “95% confidence” (i.e., Type I error rate, α = 0.05) as the d...
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
ERIC Educational Resources Information Center
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.
ERIC Educational Resources Information Center
Deegear, James
This paper summarizes the literature regarding statistical significance testing, with an emphasis on recent literature in various disciplines and on literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…
ERIC Educational Resources Information Center
Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.
2011-01-01
In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…
ERIC Educational Resources Information Center
Ervin, Nancy S.
How accurately deltas (statistics measuring the difficulty of items) established by pre-test populations reflect deltas obtained from final form populations, and the consequent utility of pre-test deltas for constructing final (operational test) forms to meet developed statistical specifications were studied. Data were examined from five subject…
2010-01-01
Background The null hypothesis significance test (NHST) is the most frequently used statistical method, although its inferential validity has been widely criticized since its introduction. In 1988, the International Committee of Medical Journal Editors (ICMJE) warned against sole reliance on NHST to substantiate study conclusions and suggested supplementary use of confidence intervals (CI). Our objective was to evaluate the extent and quality of the use of NHST and CI, in both English- and Spanish-language biomedical publications between 1995 and 2006, taking into account the ICMJE recommendations, with particular focus on the accuracy of the interpretation of statistical significance and the validity of conclusions. Methods Original articles published in three English and three Spanish biomedical journals in three fields (General Medicine, Clinical Specialties and Epidemiology - Public Health) were considered for this study. Papers published in 1995-1996, 2000-2001, and 2005-2006 were selected through a systematic sampling method. After excluding the purely descriptive and theoretical articles, analytic studies were evaluated for their use of NHST with P-values and/or CI for interpretation of statistical "significance" and "relevance" in study conclusions. Results Among 1,043 original papers, 874 were selected for detailed review. The exclusive use of P-values was less frequent in English-language publications as well as in Public Health journals; overall such use decreased from 41% in 1995-1996 to 21% in 2005-2006. While the use of CI increased over time, the "significance fallacy" (to equate statistical and substantive significance) appeared very often, mainly in journals devoted to clinical specialties (81%). In papers originally written in English and Spanish, 15% and 10%, respectively, mentioned statistical significance in their conclusions. Conclusions Overall, results of our review show some improvements in
Discrete Fourier Transform: statistical effect size and significance of Fourier components.
NASA Astrophysics Data System (ADS)
Crockett, Robin
2016-04-01
A key analytical technique in the context of investigating cyclic/periodic features in time-series (and other sequential data) is the Discrete (Fast) Fourier Transform (DFT/FFT). However, assessment of the statistical effect-size and significance of the Fourier components in the DFT/FFT spectrum can be subjective and variable. This presentation will outline an approach and method for the statistical evaluation of the effect-size and significance of individual Fourier components from their DFT/FFT coefficients. The effect size is determined in terms of the proportion of the variance in the time-series that individual components account for. The statistical significance is determined using a hypothesis-test / p-value approach with respect to a null hypothesis that the time-series has no linear dependence on a given frequency (of a Fourier component). This approach also allows spectrograms to be presented in terms of these statistical parameters. The presentation will use sunspot cycles as an illustrative example.
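The two quantities the presentation describes — per-component variance proportion and a p-value against a no-dependence null — can be sketched roughly as follows. The exponential (white-noise) approximation for periodogram ordinates is an assumption made here for illustration, not necessarily the presenter's exact method, and the test series is invented.

```python
import numpy as np

def fourier_effect_size(x):
    """Per-component effect size and approximate significance.

    Effect size: the proportion of the (mean-removed) series' variance
    carried by each positive-frequency Fourier component.
    Significance: under a Gaussian white-noise null the periodogram
    ordinates are approximately exponential, giving p = exp(-I_k / mean(I)).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    X = np.fft.rfft(xc)
    power = np.abs(X[1:n // 2]) ** 2             # skip DC and Nyquist
    effect = 2 * power / (n * np.sum(xc ** 2))   # variance proportions
    pvals = np.exp(-power / power.mean())        # white-noise approximation
    return effect, pvals

# a clean sinusoid (8 cycles over 256 samples) plus noise:
# one component should dominate the variance budget
rng = np.random.default_rng(1)
t = np.arange(256)
x = np.sin(2 * np.pi * 8 * t / 256) + 0.3 * rng.normal(size=256)
effect, pvals = fourier_effect_size(x)
k = int(np.argmax(effect))   # index into the positive-frequency bins
```

Because the effect sizes are variance proportions, they are directly comparable across components and across series, which is what makes spectrogram-style displays of these parameters possible.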
Evidence for tt̄ production at the Tevatron: Statistical significance and cross section
Koningsberg, J.; CDF Collaboration
1994-09-01
We summarize here the results of the "counting experiments" by the CDF Collaboration in the search for tt̄ production in pp̄ collisions at √s = 1800 GeV at the Tevatron. We analyze their statistical significance by calculating the probability that the observed excess is a fluctuation of the expected backgrounds and, assuming the excess is from top events, extract a measurement of the tt̄ production cross-section.
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo
2016-02-01
Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced makes correct microbial identification challenging because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
Statistical significance of the rich-club phenomenon in complex networks
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Zhou, Wei-Xing
2008-04-01
We propose that the rich-club phenomenon in complex networks should be defined in the spirit of bootstrapping, in which a null model is adopted to assess the statistical significance of the rich-club detected. Our method can serve as a definition of the rich-club phenomenon and is applied to analyze three real networks and three model networks. The results show significant improvement compared with previously reported results. We report a dilemma with an exceptional example, showing that there does not exist an omnipotent definition for the rich-club phenomenon.
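A minimal sketch of the bootstrapping spirit described above: compute the standard rich-club coefficient φ(k) and compare it against a degree-preserving randomization of the network. The toy network and the double-edge-swap null used here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

def rich_club(edges, degrees, k):
    """Rich-club coefficient phi(k): edge density among nodes of
    degree greater than k."""
    rich = {v for v, d in degrees.items() if d > k}
    n = len(rich)
    if n < 2:
        return 0.0
    e = sum(1 for u, v in edges if u in rich and v in rich)
    return 2.0 * e / (n * (n - 1))

def degree_preserving_null(edges, n_swaps=1000):
    """Randomize by double-edge swaps, keeping every node's degree
    fixed -- one simple null model for judging whether an observed
    phi(k) is unusually high."""
    edges = [tuple(e) for e in edges]
    present = set(frozenset(e) for e in edges)
    for _ in range(n_swaps):
        i, j = rng.integers(len(edges), size=2)
        (a, b), (c, d) = edges[i], edges[j]
        if len({a, b, c, d}) < 4:
            continue  # would create a self-loop or is the same edge
        if frozenset((a, d)) in present or frozenset((c, b)) in present:
            continue  # would create a multi-edge
        present -= {frozenset((a, b)), frozenset((c, d))}
        edges[i], edges[j] = (a, d), (c, b)
        present |= {frozenset((a, d)), frozenset((c, b))}
    return edges

# toy network: 4 fully connected hubs, each with 3 private leaves
hubs = [(u, v) for u in range(4) for v in range(u + 1, 4)]
leaves = [(u, 4 + 3 * u + i) for u in range(4) for i in range(3)]
edges = hubs + leaves
degrees = {}
for u, v in edges:
    degrees[u] = degrees.get(u, 0) + 1
    degrees[v] = degrees.get(v, 0) + 1

phi = rich_club(edges, degrees, k=1)          # hubs form a clique
null_edges = degree_preserving_null(edges)
phi_null = rich_club(null_edges, degrees, k=1)
```

Repeating the randomization many times yields a null distribution for φ(k), against which the observed value can be assigned a significance — the bootstrapping-style definition the abstract advocates.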
ERIC Educational Resources Information Center
Hood, Michelle; Creed, Peter A.; Neumann, David L.
2012-01-01
We tested a model of the relationship between attitudes toward statistics and achievement based on Eccles' Expectancy Value Model (1983). Participants (n = 149; 83% female) were second-year Australian university students in a psychology statistics course (mean age = 23.36 years, SD = 7.94 years). We obtained demographic details, past performance,…
NASA Astrophysics Data System (ADS)
Eggert, Silke; Walter, Thomas R.
2009-06-01
The study of volcanic triggering and interaction with the tectonic surroundings has received special attention in recent years, using both direct field observations and historical descriptions of eruptions and earthquake activity. Repeated reports of clustered eruptions and earthquakes may imply that interaction is important in some subregions. However, the subregions likely to suffer such clusters have not been systematically identified, and the processes responsible for the observed interaction remain unclear. We first review previous works about the clustered occurrence of eruptions and earthquakes, and describe selected events. We further elaborate available databases and confirm a statistically significant relationship between volcanic eruptions and earthquakes on the global scale. Moreover, our study implies that closed volcanic systems in particular tend to be activated in association with a tectonic earthquake trigger. We then perform a statistical study at the subregional level, showing that certain subregions are especially predisposed to concurrent eruption-earthquake sequences, whereas such clustering is statistically less significant in other subregions. Based on this study, we argue that individual and selected observations may bias the perceptible weight of coupling. The activity at volcanoes located in the predisposed subregions (e.g., Japan, Indonesia, Melanesia), however, often unexpectedly changes in association with either an imminent or a past earthquake.
Zou, Fei; Fine, Jason P.; Hu, Jianhua; Lin, D. Y.
2004-01-01
Assessing genome-wide statistical significance is an important and difficult problem in multipoint linkage analysis. Due to multiple tests on the same genome, the usual pointwise significance level based on the chi-square approximation is inappropriate. Permutation is widely used to determine genome-wide significance. Theoretical approximations are available for simple experimental crosses. In this article, we propose a resampling procedure to assess the significance of genome-wide QTL mapping for experimental crosses. The proposed method is computationally much less intensive than the permutation procedure (by a factor on the order of 10² or more) and is applicable to complex breeding designs and sophisticated genetic models that cannot be handled by the permutation and theoretical methods. The usefulness of the proposed method is demonstrated through simulation studies and an application to a Drosophila backcross. PMID:15611194
On the statistical significance of surface air temperature trends in the Eurasian Arctic region
NASA Astrophysics Data System (ADS)
Franzke, C.
2012-12-01
This study investigates the statistical significance of the trends of station temperature time series from the European Climate Assessment & Data archive poleward of 60°N. The trends are identified by different methods and their significance is assessed by three different null models of climate noise. All stations show a warming trend but only 17 out of the 109 considered stations have trends which cannot be explained as arising from intrinsic climate fluctuations when tested against any of the three null models. Out of those 17, only one station exhibits a warming trend which is significant against all three null models. The stations with significant warming trends are located mainly in Scandinavia and Iceland.
How to get statistically significant effects in any ERP experiment (and why you shouldn't).
Luck, Steven J; Gaspelin, Nicholas
2017-01-01
ERP experiments generate massive datasets, often containing thousands of values for each participant, even after averaging. The richness of these datasets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant but bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand-averaged data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multifactor statistical analyses. Reanalyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant but bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions.
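The family-wise inflation the authors describe can be illustrated with a quick calculation and simulation. The z-test on pure-noise "windows" and the choice of 14 windows are illustrative assumptions, not the paper's analysis; the point is only that flexibility across many windows and sites drives the probability of at least one bogus effect past 50%.

```python
import numpy as np

rng = np.random.default_rng(4)

def familywise_rate(m, alpha=0.05):
    """Probability of at least one false positive across m
    independent tests at level alpha."""
    return 1 - (1 - alpha) ** m

def simulate(n_experiments=2000, m=14, n_trials=30):
    """Pure-noise 'ERP' data: two conditions, m time windows,
    each window tested with a two-sided z-test at alpha = .05.
    Returns the fraction of experiments with >= 1 significant window."""
    hits = 0
    for _ in range(n_experiments):
        a = rng.normal(size=(m, n_trials)).mean(axis=1)
        b = rng.normal(size=(m, n_trials)).mean(axis=1)
        z = (a - b) / np.sqrt(2.0 / n_trials)
        hits += np.any(np.abs(z) > 1.96)
    return hits / n_experiments

rate = simulate()  # close to familywise_rate(14), i.e. roughly 0.5
```

With just 14 effectively independent window/site combinations, more than half of null experiments produce a "significant" effect somewhere, matching the paper's warning.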
Statistical significance estimation of a signal within the GooFit framework on GPUs
NASA Astrophysics Data System (ADS)
Cristella, Leonardo; Di Florio, Adriano; Pompili, Alexis
2017-03-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores a high-statistics toy Monte Carlo technique has been implemented both in ROOT/RooFit and GooFit frameworks with the purpose to estimate the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which Wilks' theorem may or may not apply, depending on whether its regularity conditions are satisfied.
Tables of Significance Points for the Variance-Weighted Kolmogorov-Smirnov Statistics.
1981-02-19
Technical Report No. 298, by Heinrich Niederhausen, February 19, 1981. Prepared under Contract N00014-76-C-0475 (NR-042-267) for the Office of Naval Research. Only the report's front matter is recoverable from the scanned record.
NASA Astrophysics Data System (ADS)
Eggert, S.; Walter, T. R.
2009-04-01
The study of volcanic triggering and coupling to the tectonic surroundings has received special attention in recent years, using both direct field observations and historical descriptions of eruptions and earthquake activity. Repeated reports of volcano-earthquake interactions in, e.g., Europe and Japan, may imply that clustered occurrence is important in some regions. However, the regions likely to suffer clustered eruption-earthquake activity have not been systematically identified, and the processes responsible for the observed interaction are debated. We first review previous works about the correlation of volcanic eruptions and earthquakes, and describe selected local clustered events. Following an overview of previous statistical studies, we further elaborate the databases of correlated eruptions and earthquakes from a global perspective. Since we can confirm a relationship between volcanic eruptions and earthquakes on the global scale, we then perform a statistical study on the regional level, showing that time and distance between events follow a linear relationship. In the time before an earthquake, a period of volcanic silence often occurs, whereas in the time after, an increase in volcanic activity is evident. Our statistical tests imply that certain regions are especially predisposed to concurrent eruption-earthquake pairs, e.g., Japan, whereas such pairing is statistically less significant in other regions, such as Europe. Based on this study, we argue that individual and selected observations may bias the perceptible weight of coupling. Volcanoes located in the predisposed regions (e.g., Japan, Indonesia, Melanesia), however, often change unexpectedly in association with either an imminent or a past earthquake.
RT-PSM, a real-time program for peptide-spectrum matching with statistical significance.
Wu, Fang-Xiang; Gagné, Pierre; Droit, Arnaud; Poirier, Guy G
2006-01-01
The analysis of complex biological peptide mixtures by tandem mass spectrometry (MS/MS) produces a huge body of collision-induced dissociation (CID) MS/MS spectra. Several methods have been developed for identifying peptide-spectrum matches (PSMs) by assigning MS/MS spectra to peptides in a database. However, most of these methods either do not give the statistical significance of PSMs (e.g., SEQUEST) or employ time-consuming computational methods to estimate the statistical significance (e.g., PeptideProphet). In this paper, we describe a new algorithm, RT-PSM, which can be used to identify PSMs and estimate their accuracy statistically in real time. RT-PSM first computes PSM scores between an MS/MS spectrum and a set of candidate peptides whose masses are within a preset tolerance of the MS/MS precursor ion mass. Then the computed PSM scores of all candidate peptides are employed to fit the expectation value distribution of the scores into a second-degree polynomial function in PSM score. The statistical significance of the best PSM is estimated by extrapolating the fitting polynomial function to the best PSM score. RT-PSM was tested on two pairs of MS/MS spectrum datasets and protein databases to investigate its performance. The MS/MS spectra were acquired using an ion trap mass spectrometer equipped with a nano-electrospray ionization source. The results show that RT-PSM has good sensitivity and specificity. Using a 55,577-entry protein database and running on a standard Pentium-4, 2.8-GHz CPU personal computer, RT-PSM can process peptide spectra on a sequential, one-by-one basis in 0.047 s on average, compared to more than 7 s per spectrum on average for Sequest and X!Tandem, in their current batch-mode processing implementations. RT-PSM is clearly shown to be fast enough for real-time PSM assignment of MS/MS spectra generated every 3 s or so by a 3D ion trap or by a QqTOF instrument.
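The extrapolation idea in the RT-PSM abstract — fit the score distribution of all candidate peptides with a second-degree polynomial and evaluate it at the best score — can be sketched as follows. Fitting the logarithm of the empirical survival count is an assumption made here for illustration; the paper fits the expectation-value distribution directly, and the simulated scores are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def best_hit_evalue(scores):
    """Estimate the E-value of the best-scoring match.

    Fits log(number of candidates scoring >= s) with a quadratic in s
    over all candidates except the best hit, then extrapolates the fit
    to the best score. A small result means the best hit is unlikely
    to arise from the bulk (random-match) score distribution.
    """
    s = np.sort(np.asarray(scores, dtype=float))
    surv = np.arange(len(s), 0, -1)                    # counts >= s[i]
    coef = np.polyfit(s[:-1], np.log(surv[:-1]), 2)    # exclude best hit
    return float(np.exp(np.polyval(coef, s[-1])))

# bulk of candidate scores plus one clearly better best match
scores = np.append(rng.normal(size=500), 6.0)
e = best_hit_evalue(scores)
```

Because the fit reuses the scores already computed for all candidate peptides, no separate decoy search or resampling is needed, which is what makes per-spectrum significance estimation feasible in real time.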
Jefferson, L; Cooper, E; Hewitt, C; Torgerson, T; Cook, L; Tharmanathan, P; Cockayne, S; Torgerson, D
2016-01-01
Objective Time-lag from study completion to publication is a potential source of publication bias in randomised controlled trials. This study sought to update the evidence base by identifying the effect of the statistical significance of research findings on time to publication of trial results. Design Literature searches were carried out in four general medical journals from June 2013 to June 2014 inclusive (BMJ, JAMA, the Lancet and the New England Journal of Medicine). Setting Methodological review of four general medical journals. Participants Original research articles presenting the primary analyses from phase 2, 3 and 4 parallel-group randomised controlled trials were included. Main outcome measures Time from trial completion to publication. Results The median time from trial completion to publication was 431 days (n = 208, interquartile range 278–618). A multivariable adjusted Cox model found no statistically significant difference in time to publication for trials reporting positive or negative results (hazard ratio: 0.86, 95% CI 0.64 to 1.16, p = 0.32). Conclusion In contrast to previous studies, this review did not demonstrate the presence of time-lag bias in time to publication. This may be a result of these articles being published in four high-impact general medical journals that may be more inclined to publish rapidly, whatever the findings. Further research is needed to explore the presence of time-lag bias in lower quality studies and lower impact journals. PMID:27757242
NASA Astrophysics Data System (ADS)
Hu, Rui; Wang, Bin
2001-02-01
Finding statistically significant words in DNA and protein sequences forms the basis for many genetic studies. By applying the maximal entropy principle, we give one systematic way to study the nonrandom occurrence of words in DNA or protein sequences. Comparison with experimental results showed that patterns of regulatory binding sites in Saccharomyces cerevisiae (yeast) genomes tend to occur significantly in the promoter regions. We studied two correlated gene families of yeast. The method successfully extracts the binding sites verified by experiments in each family. Many putative regulatory sites in the upstream regions are proposed. The study also suggested that some regulatory sites are active in both directions, while others show directional preference.
2014-01-01
Background Most work on the topic of activity landscapes has focused on their quantitative description and visual representation, with the aim of aiding navigation of SAR. Recent developments have addressed applications such as quantifying the proportion of activity cliffs, investigating the predictive abilities of activity landscape methods and so on. However, all these publications have worked under the assumption that the activity landscape models are “real” (i.e., statistically significant). Results The current study addresses for the first time, in a quantitative manner, the significance of a landscape or individual cliffs in the landscape. In particular, we question whether the activity landscape derived from observed (experimental) activity data is different from a randomly generated landscape. To address this we used the SALI measure with six different data sets tested against one or more molecular targets. We also assessed the significance of the landscapes for single and multiple representations. Conclusions We find that non-random landscapes are data set and molecular representation dependent. For the data sets and representations used in this work, our results suggest that not all representations lead to non-random landscapes. This indicates that not all molecular representations should be used to a) interpret the SAR and b) combined to generate consensus models. Our results suggest that significance testing of activity landscape models and in particular, activity cliffs, is key, prior to the use of such models. PMID:24694189
Key statistics related to CO₂ emissions: Significant contributing countries
Kellogg, M.A.; Edmonds, J.A.; Scott, M.J.; Pomykala, J.S.
1987-07-01
This country selection task report describes and applies a methodology for identifying a set of countries responsible for significant present and anticipated future emissions of CO₂ and other radiatively important gases (RIGs). The identification of countries responsible for CO₂ and other RIG emissions will help determine to what extent a select number of countries might be capable of influencing future emissions. Once identified, those countries could potentially exercise cooperative collective control of global emissions and thus mitigate the associated adverse effects of those emissions. The methodology developed consists of two approaches: the resource approach and the emissions approach. While conceptually very different, both approaches yield the same fundamental conclusion. The core of any international initiative to control global emissions must include three key countries: the US, USSR, and the People's Republic of China. It was also determined that broader control can be achieved through the inclusion of sixteen additional countries with significant contributions to worldwide emissions.
Crow, C.J.
1985-01-01
Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
ERIC Educational Resources Information Center
Liao, Ying; Lin, Wen-He
2016-01-01
In the era when digitalization is pursued, numbers are the major medium of information performance and statistics is the primary instrument to interpret and analyze numerical information. For this reason, the cultivation of fundamental statistical literacy should be a key in the learning area of mathematics at the stage of compulsory education.…
ERIC Educational Resources Information Center
Callingham, Rosemary; Carmichael, Colin; Watson, Jane M.
2016-01-01
Statistics is an increasingly important component of the mathematics curriculum. "StatSmart" was a project intended to influence middle-years students' learning outcomes in statistics through the provision of appropriate professional learning opportunities and technology to teachers. Participating students in grade 5/6 to grade 9…
Linden, Ariel
2008-04-01
Prior to implementing a disease management (DM) strategy, a needs assessment should be conducted to determine whether sufficient opportunity exists for an intervention to be successful in the given population. A central component of this assessment is a sample size analysis to determine whether the population is of sufficient size to allow the expected program effect to achieve statistical significance. This paper discusses the parameters that comprise the generic sample size formula for independent samples and their interrelationships, followed by modifications for the DM setting. In addition, a table is provided with sample size estimates for various effect sizes. Examples are described in detail along with strategies for overcoming common barriers. Ultimately, conducting these calculations up front will help set appropriate expectations about the ability to demonstrate the success of the intervention.
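The generic independent-samples formula the paper builds on can be written out directly. The default critical values below (two-sided α = 0.05 and 80% power) are conventional choices for illustration, not figures taken from the paper.

```python
import math

def sample_size_per_group(delta, sigma, z_alpha=1.96, z_beta=0.84):
    """Sample size for comparing two independent means:

        n per group = 2 * sigma^2 * (z_{1-alpha/2} + z_{1-beta})^2 / delta^2

    delta: smallest program effect worth detecting (difference in means)
    sigma: common standard deviation of the outcome
    Defaults correspond to two-sided alpha = 0.05 and 80% power.
    """
    n = 2 * sigma ** 2 * (z_alpha + z_beta) ** 2 / delta ** 2
    return math.ceil(n)

# detecting a half-standard-deviation effect needs 63 subjects per group
n_needed = sample_size_per_group(delta=0.5, sigma=1.0)
```

In the disease-management setting the paper describes, one would plug in the expected program effect and the outcome's variability to check whether the eligible population is large enough before launching the intervention.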
A network-based method to assess the statistical significance of mild co-regulation effects.
Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna
2013-01-01
Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis.
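One simple way to score such co-occurrence — in the spirit of SICORE's null-model comparison, though not its actual algorithm — is a hypergeometric tail probability on the number of common neighbours of two same-type nodes in the bipartite graph. The miRNA/protein numbers below are invented for illustration.

```python
from math import comb

def cooccurrence_pvalue(d1, d2, shared, n):
    """P(X >= shared) where X is the overlap of two random neighbour
    sets of sizes d1 and d2 drawn from n possible neighbours
    (a hypergeometric null model).
    """
    total = comb(n, d2)
    tail = sum(comb(d1, k) * comb(n - d1, d2 - k)
               for k in range(shared, min(d1, d2) + 1))
    return tail / total

# two miRNAs each targeting 5 of 50 proteins, sharing 3 targets:
# such an overlap is unlikely by chance (p below 0.005)
p = cooccurrence_pvalue(5, 5, 3, 50)
```

A small p-value flags a pair whose common-neighbour count exceeds chance expectation, which is the kind of statistically significant co-occurrence the method is designed to detect.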
Statistics, Probability, Significance, Likelihood: Words Mean What We Define Them to Mean
ERIC Educational Resources Information Center
Drummond, Gordon B.; Tom, Brian D. M.
2011-01-01
Statisticians use words deliberately and specifically, but not necessarily in the way they are used colloquially. For example, in general parlance "statistics" can mean numerical information, usually data. In contrast, one large statistics textbook defines the term "statistic" to denote "a characteristic of a…
NASA Astrophysics Data System (ADS)
Kellerer-Pirklbauer, Andreas
2016-04-01
Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding regarding the effects of present climate change on distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c. 60 ground temperature (surface and near-surface) monitoring sites which are located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and at longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trend during the observation period, and regional pattern of changes. Calculations and analyses of several different temperature-related parameters were accomplished. At an annual scale a region-wide statistically significant warming during the observation period was revealed by e.g. an increase in mean annual temperature values (mean, maximum) or the significant lowering of the surface frost number (F+). At a seasonal scale no significant trend of any temperature-related parameter was in most cases revealed for spring (MAM) and autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season reveals in general a significant warming as confirmed by several different temperature-related parameters such as e.g. mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis August shows the statistically most robust and strongest warming of all months, although regional differences occur. Despite the fact that the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise by temperature anomalies (e.g. warm winter 2006/07) or substantial variations in the winter
Sassenhagen, Jona; Alday, Phillip M
2016-11-01
Experimental research on behavior and cognition frequently rests on stimulus or subject selection where not all characteristics can be fully controlled, even when attempting strict matching. For example, when contrasting patients to controls, variables such as intelligence or socioeconomic status are often correlated with patient status. Similarly, when presenting word stimuli, variables such as word frequency are often correlated with primary variables of interest. One procedure very commonly employed to control for such nuisance effects is conducting inferential tests on confounding stimulus or subject characteristics. For example, if word length is not significantly different for two stimulus sets, they are considered as matched for word length. Such a test has high error rates and is conceptually misguided. It reflects a common misunderstanding of statistical tests: interpreting significance as referring not to inference about a particular population parameter, but to (1) the sample in question, or (2) the practical relevance of a sample difference (so that a nonsignificant test is taken to indicate evidence for the absence of relevant differences). We show inferential testing for assessing nuisance effects to be inappropriate both pragmatically and philosophically, present a survey showing its high prevalence, and briefly discuss an alternative in the form of regression including nuisance variables.
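The alternative mentioned at the end of the abstract, regression including nuisance variables, can be sketched as follows. This is a minimal illustration on simulated data: the variable names, effect sizes, and use of ordinary least squares are hypothetical choices for demonstration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated experiment: group membership (0/1) is correlated with a
# nuisance variable (e.g., word frequency), and both affect the outcome.
group = rng.integers(0, 2, n)
nuisance = 0.8 * group + rng.normal(size=n)      # confounded with group
outcome = 1.5 * group + 2.0 * nuisance + rng.normal(size=n)

# Instead of testing whether the two groups are "matched" on the nuisance
# variable, include it as a covariate and estimate the group effect
# conditional on it: outcome ~ intercept + group + nuisance.
X = np.column_stack([np.ones(n), group, nuisance])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(beta[1])  # estimate of the group effect, adjusted for the nuisance
```

The adjusted group coefficient recovers the simulated effect regardless of whether the nuisance variable happens to pass a matching test.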
Cheng, Chia-Ying; Huang, Chung-Yuan; Sun, Chuen-Tsai
2008-02-01
A major task for postgenomic systems biology researchers is to systematically catalogue molecules and their interactions within living cells. Advancements in complex-network theory are being made toward uncovering organizing principles that govern cell formation and evolution, but we lack understanding of how molecules and their interactions determine how complex systems function. Molecular bridge motifs include isolated motifs that neither interact nor overlap with others, whereas brick motifs act as network foundations that play a central role in defining global topological organization. To emphasize their structural organizing and evolutionary characteristics, we define bridge motifs as consisting of weak links only and brick motifs as consisting of strong links only, and then propose a method for performing two tasks simultaneously: 1) detecting global statistical features and local connection structures in biological networks and 2) locating functionally and statistically significant network motifs. To further understand the role of biological networks in system contexts, we examine functional and topological differences between bridge and brick motifs for predicting biological network behaviors and functions. After observing brick motif similarities between E. coli and S. cerevisiae, we note that bridge motifs differentiate C. elegans from Drosophila and sea urchin in three types of networks. Similarities (differences) in bridge and brick motifs imply similar (different) key circuit elements in the three organisms. We suggest that motif-content analyses can provide researchers with global and local data for real biological networks and assist in the search for either isolated or functionally and topologically overlapping motifs when investigating and comparing biological system functions and behaviors.
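The motif-detection step that underlies this kind of analysis can be illustrated with a minimal counter for one classic motif, the feed-forward loop (A regulates B, B regulates C, and A also regulates C). This is a generic sketch, not the authors' algorithm; the toy network and node names are hypothetical, and in practice a motif's count would be compared against randomized networks to assess statistical significance.

```python
from itertools import permutations

def count_ffl(edges):
    """Count feed-forward loops (A->B, B->C, A->C) in a directed graph."""
    edge_set = set(edges)
    nodes = {v for e in edges for v in e}
    count = 0
    for a, b, c in permutations(nodes, 3):
        if (a, b) in edge_set and (b, c) in edge_set and (a, c) in edge_set:
            count += 1
    return count

# Toy regulatory network (hypothetical): one feed-forward loop X->Y->Z with
# the shortcut X->Z, plus an unrelated edge Z->W.
edges = [("X", "Y"), ("Y", "Z"), ("X", "Z"), ("Z", "W")]
print(count_ffl(edges))  # 1
```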
ERIC Educational Resources Information Center
Kalaian, Sema A.; Kasim, Rafa M.
2014-01-01
This meta-analytic study focused on the quantitative integration and synthesis of the accumulated pedagogical research in undergraduate statistics education literature. These accumulated research studies compared the academic achievement of students who had been instructed using one of the various forms of small-group learning methods to those who…
ERIC Educational Resources Information Center
Missouri Coordinating Board for Higher Education, Jefferson City.
The statistical summary for 1989-90 higher education in Missouri presents data in the form of 120 tables for 7 categories: (1) the Missouri Student Achievement Study (fiscal year 1989); (2) preparation; (3) enrolled freshmen; (4) access; (5) participation; (6) resources; and (7) completions. Sample tables provide the following information: mean…
Mousseau, Jeffrey D.; Jansen, John R.; Janke, David H.; Plowman, Catherine M.
2003-02-26
Improved waste minimization practices at the Department of Energy's (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) are leading to a 15% reduction in the generation of hazardous and radioactive waste. Bechtel, BWXT Idaho, LLC (BBWI), the prime management and operations contractor at the INEEL, applied the Six Sigma improvement process to the INEEL Waste Minimization Program to review existing processes and define opportunities for improvement. Our Six Sigma analysis team, composed of an executive champion, a process owner, a black belt, a yellow belt, and technical and business team members, used this statistics-based process approach to analyze work processes and produced ten recommendations for improvement. Recommendations ranged from waste generator financial accountability for newly generated waste to enhanced employee recognition programs for waste minimization efforts. These improvements have now been implemented to reduce waste generation rates and are producing positive results.
A novel approach to achieving significant reverberation control in performance halls
NASA Astrophysics Data System (ADS)
Conant, David A.; Chu, William
2005-09-01
Conventional methods for achieving broadband, variable sound absorption in large halls normally include heavy application of sound-absorptive drapery and/or thick fibrous panels, applied near available surfaces below, at, and in volumes above the catwalk plane. Occasionally, direct adjustments to room air volume are also provided to effect double-sloped decays. The novel method described here combines carefully located, broad scattering and absorption in singular architectural elements and was applied to a new, 1200-seat concert hall. A change of 0.70 s RT60 at midfrequency is achieved in a visually dramatic manner while neither materially changing room volume nor introducing often-maligned drapery. The aggregate of reverberation control methodologies employed reduces the unoccupied RT60 at midfrequencies from about 3.2 to 1.7 s in this space programmed principally for music, including pipe organ. Results of MLS measurements, including binaural measurements and binaural recordings of anechoic material, as well as CATT-Acoustic modeling and auralizations, are discussed.
Gehrmann, Thies; Reinders, Marcel J.T.
2015-01-01
Background: As more and more genomes are sequenced, detecting synteny between genomes becomes increasingly important. However, for microorganisms the genomic divergence quickly becomes large, resulting in different codon usage and shuffling of gene order and gene elements such as exons. Results: We present Proteny, a methodology to detect synteny between diverged genomes. It operates on the amino acid sequence level to be insensitive to codon usage adaptations, and it clusters groups of exons disregarding order to handle diversity in genomic ordering between genomes. Furthermore, Proteny assigns significance levels to the syntenic clusters so that they can be selected on statistical grounds. Finally, Proteny provides novel ways to visualize results at different scales, facilitating the exploration and interpretation of syntenic regions. We test the performance of Proteny on a standard ground-truth dataset, and we illustrate the use of Proteny on two closely related genomes (two different strains of Aspergillus niger) and on two distant genomes (two species of Basidiomycota). In comparison to other tools, we find that Proteny finds more true homologies in fewer clusters that contain more genes, i.e., Proteny identifies a more consistent synteny. Further, we show how genome rearrangements, assembly errors, gene duplications and the conservation of specific genes can be easily studied with Proteny. Availability and implementation: Proteny is freely available at the Delft Bioinformatics Lab website http://bioinformatics.tudelft.nl/dbl/software. Contact: t.gehrmann@tudelft.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26116928
Wang, Bo; Shi, Zhanquan; Weber, Georg F; Kennedy, Michael A
2013-10-01
Nuclear magnetic resonance (NMR) spectroscopy-based metabonomics is of growing importance for discovery of human disease biomarkers. Identification and validation of disease biomarkers using statistical significance analysis (SSA) is critical for translation to clinical practice. SSA is performed by assessing a null hypothesis test using a derivative of the Student's t test, e.g., a Welch's t test. Choosing how to correct the significance level for rejecting null hypotheses in the case of multiple testing to maintain a constant family-wise type I error rate is a common problem in such tests. The multiple testing problem arises because the likelihood of falsely rejecting the null hypothesis, i.e., a false positive, grows as the number of tests applied to the same data set increases. Several methods have been introduced to address this problem. Bonferroni correction (BC) assumes all variables are independent and therefore sacrifices sensitivity for detecting true positives in partially dependent data sets. False discovery rate (FDR) methods are more sensitive than BC but uniformly ascribe highest stringency to lowest p value variables. Here, we introduce standard deviation step down (SDSD), which is more sensitive and appropriate than BC for partially dependent data sets. Sensitivity and type I error rate of SDSD can be adjusted based on the degree of variable dependency. SDSD generates fundamentally different profiles of critical p values compared with FDR methods potentially leading to reduced type II error rates. SDSD is increasingly sensitive for more concentrated metabolites. SDSD is demonstrated using NMR-based metabonomics data collected on three different breast cancer cell line extracts.
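The two standard corrections the abstract compares can be sketched in a few lines. This is a generic illustration of textbook Bonferroni and Benjamini-Hochberg FDR procedures with made-up p-values, not data from the study; SDSD itself is defined in the paper.

```python
def bonferroni_reject(pvals, alpha=0.05):
    """Reject H0_i when p_i <= alpha / m (family-wise error control)."""
    m = len(pvals)
    return [p <= alpha / m for p in pvals]

def benjamini_hochberg_reject(pvals, alpha=0.05):
    """Step-up BH procedure: reject the k smallest p-values, where k is the
    largest rank i with p_(i) <= (i/m) * alpha (false discovery rate control)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    reject = [False] * m
    for i in order[:k]:
        reject[i] = True
    return reject

pvals = [0.001, 0.008, 0.012, 0.030, 0.200]   # hypothetical metabolite tests
print(sum(bonferroni_reject(pvals)))          # 2: only p <= 0.05/5 = 0.01
print(sum(benjamini_hochberg_reject(pvals)))  # 4: BH is less stringent
```

Note how BH assigns the most stringent threshold to the smallest p-value and relaxes it with rank, which is exactly the "uniformly ascribe highest stringency to lowest p value variables" behavior the abstract contrasts SDSD against.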
Assessing the Disconnect between Grade Expectation and Achievement in a Business Statistics Course
ERIC Educational Resources Information Center
Berenson, Mark L.; Ramnarayanan, Renu; Oppenheim, Alan
2015-01-01
In an institutional review board--approved study aimed at evaluating differences in learning between a large-sized introductory business statistics course section using courseware assisted examinations compared with small-sized sections using traditional paper-and-pencil examinations, there appeared to be a severe disconnect between the final…
ERIC Educational Resources Information Center
Acee, Taylor Wayne
2009-01-01
The purpose of this dissertation was to investigate the differential effects of goal setting and value reappraisal on female students' self-efficacy beliefs, value perceptions, exam performance and continued interest in statistics. It was hypothesized that the Enhanced Goal Setting Intervention (GS-E) would positively impact students'…
Basic Mathematics Test Predicts Statistics Achievement and Overall First Year Academic Success
ERIC Educational Resources Information Center
Fonteyne, Lot; De Fruyt, Filip; Dewulf, Nele; Duyck, Wouter; Erauw, Kris; Goeminne, Katy; Lammertyn, Jan; Marchant, Thierry; Moerkerke, Beatrijs; Oosterlinck, Tom; Rosseel, Yves
2015-01-01
In the psychology and educational science programs at Ghent University, only 36.1% of the new incoming students in 2011 and 2012 passed all exams. Despite availability of information, many students underestimate the scientific character of social science programs. Statistics courses are a major obstacle in this matter. Not all enrolling students…
ERIC Educational Resources Information Center
Dancer, Diane; Morrison, Kellie; Tarr, Garth
2015-01-01
Peer-assisted study session (PASS) programs have been shown to positively affect students' grades in a majority of studies. This study extends that analysis in two ways: controlling for ability and other factors, with focus on international students, and by presenting results for PASS in business statistics. Ordinary least squares, random effects…
The Flipped Classroom Improves Student Achievement and Course Satisfaction in a Statistics Course
ERIC Educational Resources Information Center
Peterson, Daniel J.
2016-01-01
There are but a handful of experimental or quasi-experimental studies comparing student outcomes from flipped or inverted classrooms to more traditional lecture formats. In the current study, I present cumulative exam performance and student evaluation data from two sections of a statistics course I recently taught: one a traditional lecture (N =…
Barriers to Academic Achievement for Foster Youth: The Story behind the Statistics
ERIC Educational Resources Information Center
Morton, Brenda M.
2015-01-01
The purpose of this qualitative research study was to explore the perceptions of former and current foster youth about the barriers they encountered during their K-12 education, and to learn how they overcame these obstacles and achieved academic success. The study included in-depth interviews of 11 participants, all of whom were current or former…
Ashrafian, Hutan; Toma, Tania; Harling, Leanne; Kerr, Karen; Athanasiou, Thanos; Darzi, Ara
2014-09-01
The global epidemic of obesity continues to escalate. Obesity accounts for an increasing proportion of the international socioeconomic burden of noncommunicable disease. Online social networking services provide an effective medium through which information may be exchanged between obese and overweight patients and their health care providers, potentially contributing to superior weight-loss outcomes. We performed a systematic review and meta-analysis to assess the role of these services in modifying body mass index (BMI). Our analysis of twelve studies found that interventions using social networking services produced a modest but significant 0.64 percent reduction in BMI from baseline for the 941 people who participated in the studies' interventions. We recommend that social networking services that target obesity should be the subject of further clinical trials. Additionally, we recommend that policy makers adopt reforms that promote the use of anti-obesity social networking services, facilitate multistakeholder partnerships in such services, and create a supportive environment to confront obesity and its associated noncommunicable diseases.
Wei, Xiaohua; Blanco, Juan A.
2014-01-01
Subtropical planted forests are rapidly expanding. They are traditionally managed for intensive, short-term goals that often lead to long-term yield decline and reduced carbon sequestration capacity. Here we show how it is possible to increase and sustain carbon stored in subtropical forest plantations if management is switched towards more sustainable forestry. We first conducted a literature review to explore possible management factors that contribute to the potentials in ecosystem C in tropical and subtropical plantations. We found that broadleaf plantations have significantly higher ecosystem C than conifer plantations. In addition, ecosystem C increases with plantation age, and reaches a peak with intermediate stand densities of 1500-2500 trees ha⁻¹. We then used the FORECAST model to simulate the regional implications of switching from traditional to sustainable management regimes, using Chinese fir (Cunninghamia lanceolata) plantations in subtropical China as a case study. We randomly simulated 200 traditional short-rotation pure stands and 200 sustainably-managed mixed Chinese fir - Phoebe bournei plantations, for 120 years. Our results showed that mixed, sustainably-managed plantations have on average 67.5% more ecosystem C than traditional pure conifer plantations. If all pure plantations were gradually transformed into mixed plantations during the next 10 years, carbon stocks could rise in 2050 by 260.22 TgC in east-central China. Assuming similar differences for temperate and boreal plantations, if sustainable forestry practices were applied to all new forest plantation types in China, stored carbon could increase by 1,482.80 TgC in 2050. Such an increase would be equivalent to a yearly sequestration rate of 40.08 TgC yr⁻¹, offsetting 1.9% of China's annual emissions in 2010. More importantly, this C increase can be sustained in the long term through the maintenance of higher amounts of soil organic carbon and the production of timber
Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.
1999-01-01
Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc correctly identified the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8. Phys. Earth Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction. J. Geophys. Res. 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier
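The quoted confidence levels can be reproduced under a simple binomial null model in which each target earthquake falls inside the alarm volume with probability equal to the alarm's normalized space-time fraction. This is a sketch of the standard calculation, not necessarily the authors' exact procedure.

```python
from math import comb

def alarm_significance(hits, total, alarm_fraction):
    """Confidence level 1 - P(X >= hits) for X ~ Binomial(total, alarm_fraction),
    i.e. the chance random guessing would do at least as well as the algorithm."""
    p = sum(comb(total, k) * alarm_fraction**k * (1 - alarm_fraction)**(total - k)
            for k in range(hits, total + 1))
    return 1 - p

# M8: all 5 of 5 M8.0+ events fell inside alarms covering 36% of space-time.
print(round(alarm_significance(5, 5, 0.36), 3))   # 0.994 -> beyond 99%
# MSc: 4 of 5 events fell inside alarms covering 18%.
print(round(alarm_significance(4, 5, 0.18), 3))   # 0.996 -> beyond 99%
```

Both values agree with the "beyond 99%" significance reported in the abstract.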
Elçi, Alper; Polat, Rahime
2011-01-01
The main objective of this study was to statistically evaluate the significance of seasonal groundwater quality change and to assess the spatial distribution of specific groundwater quality parameters. The study area was the Mount Nif karstic aquifer system located to the southeast of the city of Izmir. Groundwater samples were collected at 57 sampling points in the rainy winter and dry summer seasons. Groundwater quality indicators of interest were electrical conductivity (EC), nitrate, chloride, sulfate, sodium, some heavy metals, and arsenic. Maps showing the spatial distributions and temporal changes of these parameters were created to further interpret spatial patterns and seasonal changes in groundwater quality. Furthermore, statistical tests were conducted to determine whether the seasonal changes in each quality parameter were statistically significant. It was evident from these tests that the seasonal changes in most groundwater quality parameters were not statistically significant. However, the increase in EC values and aluminum concentrations from winter to summer was found to be significant. Furthermore, a negative correlation between sampling elevation and groundwater quality was found. It was shown that, with simple statistical testing, important conclusions can be drawn from limited monitoring data. It was concluded that less groundwater recharge in the dry period of the year does not always imply higher concentrations of all groundwater quality parameters, because water circulation times, lithology, quality and extent of recharge, and land use patterns also play an important role in the alteration of groundwater quality.
Neilson, Gavin R; McNally, Jim
2013-03-01
The International Council of Nurses proposes that the shortage of nurses is global in scale and is expected to become much worse in the years ahead. A major factor impacting on the worldwide nursing shortage is the diminishing number of young people choosing nursing as a career (International Council of Nurses, 2008). One important dimension of school pupils' career choice process is their interactions with significant others and the influence of these significant others (Hodkinson and Sparkes, 1997). As Schools/Departments of Nursing endeavour to attract more intellectual school leavers, it is important to examine what advice and opinions significant others are giving regarding nursing as a career choice, and how influential this advice is. This paper is based on interview data from 20 high academic achieving 5th and 6th year school pupils in Scotland, paradigmatic cases from a larger sample, who had considered nursing as a possible career choice within their career preference cluster, but then later disregarded nursing and decided to pursue medicine or another health care profession. The data were particularly striking in revealing the negative influence of significant others on high academic achieving school pupils' choice of nursing as a career. The influence of significant others, specifically parents, guardians, guidance teachers and careers advisors, was very apparent in the data, in that they had a very negative view of nursing as a career choice for high academic achieving school pupils.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
ERIC Educational Resources Information Center
Rogosa, David
1981-01-01
The form of the Johnson-Neyman region of significance is shown to be determined by the statistic for testing the null hypothesis that the population within-group regressions are parallel. Results are obtained for both simultaneous and nonsimultaneous regions of significance. (Author)
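The nonsimultaneous Johnson-Neyman computation reduces to solving a quadratic inequality in the covariate x: the group difference d0 + d1*x is significant wherever its squared value exceeds t_crit^2 times its variance. The fitted values below are hypothetical, and a simultaneous region would require a larger critical value, as the abstract's distinction implies.

```python
from math import sqrt

def johnson_neyman(d0, d1, v00, v01, v11, t_crit):
    """Boundaries of the region where d0 + d1*x is significant, i.e. where
    (d0 + d1*x)^2 >= t_crit^2 * (v00 + 2*v01*x + v11*x^2)."""
    t2 = t_crit**2
    a = d1**2 - t2 * v11
    b = 2 * (d0 * d1 - t2 * v01)
    c = d0**2 - t2 * v00
    disc = b**2 - 4 * a * c
    if disc < 0:
        return None  # no real boundaries
    r1 = (-b - sqrt(disc)) / (2 * a)
    r2 = (-b + sqrt(disc)) / (2 * a)
    return sorted((r1, r2))

# Hypothetical difference in intercepts/slopes with their (co)variances;
# here a = d1^2 - t2*v11 > 0, so the difference is significant for x
# OUTSIDE the interval (lo, hi) and nonsignificant between the boundaries.
lo, hi = johnson_neyman(d0=-2.0, d1=1.0, v00=0.5, v01=-0.05, v11=0.02, t_crit=2.0)
print(lo, hi)
```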
Krumbholz, Aniko; Anielski, Patricia; Gfrerer, Lena; Graw, Matthias; Geyer, Hans; Schänzer, Wilhelm; Dvorak, Jiri; Thieme, Detlef
2014-01-01
Clenbuterol is a well-established β2-agonist, which is prohibited in sports and strictly regulated for use in the livestock industry. During the last few years, clenbuterol-positive results in doping controls and in samples from residents of or travellers from a high-risk country were suspected to be related to the illegal use of clenbuterol for fattening. A sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed to detect low clenbuterol residues in hair with a detection limit of 0.02 pg/mg. A sub-therapeutic application study and a field study with volunteers, who have a high risk of contamination, were performed. For the application study, a total dosage of 30 µg clenbuterol was applied to 20 healthy volunteers on 5 subsequent days. One month after the beginning of the application, clenbuterol was detected in the proximal hair segment (0-1 cm) in concentrations between 0.43 and 4.76 pg/mg. For the second part, samples of 66 Mexican soccer players were analyzed. In 89% of these volunteers, clenbuterol was detectable in their hair at concentrations between 0.02 and 1.90 pg/mg. A comparison of both parts showed no statistical difference between sub-therapeutic application and contamination. In contrast, discrimination from a typical abuse of clenbuterol is apparently possible. These findings allow results of real doping control samples to be evaluated.
ERIC Educational Resources Information Center
Thompson, Bruce; Snyder, Patricia A.
1998-01-01
Investigates two aspects of research analyses in quantitative research studies reported in the 1996 issues of "Journal of Counseling & Development" (JCD). Acceptable methodological practice regarding significance testing and evaluation of score reliability has evolved considerably. Contemporary thinking on these issues is described; practice as…
Petykhov, A B; Maev, I V; Deriabin, V E
2012-01-01
Anthropometry is a technique for obtaining the features needed to characterize changes in the human body in health and in disease. A statistical analysis of anthropometric parameters, such as body mass, body length, waist, hip, shoulder and wrist circumferences, and skinfold thickness (over the triceps, below the scapula, on the chest, on the abdomen and over the biceps), with calculation of indices and an assessment of possible age influence, was carried out for the first time in domestic medicine. Complexes of interrelated anthropometric characteristics were detected. Correlation coefficients (r) were computed, and factor analysis (principal components with subsequent varimax rotation), covariance analysis and discriminant analysis (applying the Kaiser and Wilks criteria and the F-test) were performed. Intergroup variability of body composition was studied for separate characteristics in groups of healthy individuals (135 subjects aged 45.6 +/- 1.2 years; 56.3% men and 43.7% women) and in internal pathology: patients after gastrectomy (121; 57.7 +/- 1.2 years; 52% men and 48% women); after Billroth operation (214; 56.1 +/- 1.0 years; 53% men and 47% women); after enterectomy (103; 44.5 +/- 1.8 years; 53% men and 47% women); and after protein-energy wasting of mixed origin (206; 29.04 +/- 1.6 years; 79% men and 21% women). The analysis identified a group of interrelated characteristics that includes anthropometric parameters of subcutaneous fat deposition (skinfold thickness over the triceps, over the biceps, below the scapula and on the abdomen) and fat body mass. These characteristics are interconnected with age and height and show a more pronounced dependence in women, reflecting the development of the fatty component of the body when assessing body mass index in women (unlike men). The waist-hip circumference index differs irrespective of body composition indicators, which does not allow it to be characterized in terms of truncal or
ERIC Educational Resources Information Center
Oshima, T. C.; Raju, Nambury S.; Nanda, Alice O.
2006-01-01
A new item parameter replication method is proposed for assessing the statistical significance of the noncompensatory differential item functioning (NCDIF) index associated with the differential functioning of items and tests framework. In this new method, a cutoff score for each item is determined by obtaining a (1 − alpha) percentile rank score…
Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen
2017-01-01
Introduction: A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods: In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion: Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate
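The leave-one-out cross-validation used here is a generic procedure and can be sketched with a stand-in classifier. The nearest-centroid rule and the toy two-feature data below are hypothetical illustrations, not the classifiers or measurements from the study; the point is only that each sample is evaluated by a model fitted without it.

```python
def loo_accuracy(samples, labels):
    """Leave-one-out cross-validation of a nearest-centroid classifier:
    each sample is classified using centroids computed from all the others."""
    correct = 0
    for i in range(len(samples)):
        train = [(s, l) for j, (s, l) in enumerate(zip(samples, labels)) if j != i]
        centroids = {}
        for lab in set(labels):
            pts = [s for s, l in train if l == lab]
            centroids[lab] = [sum(col) / len(pts) for col in zip(*pts)]
        dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
        pred = min(centroids, key=lambda lab: dist(samples[i], centroids[lab]))
        correct += pred == labels[i]
    return correct / len(samples)

# Hypothetical two-feature "excretion profiles" for two well-separated groups.
samples = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.3), (0.9, 1.0), (1.1, 0.8), (1.0, 1.1)]
labels = ["A", "A", "A", "B", "B", "B"]
print(loo_accuracy(samples, labels))  # 1.0 on this cleanly separated toy data
```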
Potts, T.T.; Hylko, J.M.; Almond, D.
2007-07-01
A company's overall safety program becomes an important consideration to continue performing work and for procuring future contract awards. When injuries or accidents occur, the employer ultimately loses on two counts - increased medical costs and employee absences. This paper summarizes the human and organizational components that contributed to successful safety programs implemented by WESKEM, LLC's Environmental, Safety, and Health Departments located in Paducah, Kentucky, and Oak Ridge, Tennessee. The philosophy of 'safety, compliance, and then production' and programmatic components implemented at the start of the contracts were qualitatively identified as contributing factors resulting in a significant accumulation of safe work hours and an Experience Modification Rate (EMR) of <1.0. Furthermore, a study by the Associated General Contractors of America quantitatively validated components, already found in the WESKEM, LLC programs, as contributing factors to prevent employee accidents and injuries. Therefore, an investment in the human and organizational components now can pay dividends later by reducing the EMR, which is the key to reducing Workers' Compensation premiums. Also, knowing your employees' demographics and taking an active approach to evaluate and prevent fatigue may help employees balance work and non-work responsibilities. In turn, this approach can assist employers in maintaining a healthy and productive workforce. For these reasons, it is essential that safety needs be considered as the starting point when performing work. (authors)
Meta-analysis using effect size distributions of only statistically significant studies.
van Assen, Marcel A L M; van Aert, Robbie C M; Wicherts, Jelte M
2015-09-01
Publication bias threatens the validity of meta-analytic results and leads to overestimation of the effect size in traditional meta-analysis. This particularly applies to meta-analyses that feature small studies, which are ubiquitous in psychology. Here we develop a new method for meta-analysis that deals with publication bias. This method, p-uniform, enables (a) testing of publication bias, (b) effect size estimation, and (c) testing of the null hypothesis of no effect. No current method for meta-analysis possesses all 3 qualities. Application of p-uniform is straightforward because no additional data on missing studies are needed and no sophisticated assumptions or choices need to be made before applying it. Simulations show that p-uniform generally outperforms the trim-and-fill method and the test of excess significance (TES; Ioannidis & Trikalinos, 2007b) if publication bias exists and population effect size is homogeneous or heterogeneity is slight. For illustration, p-uniform and other publication bias analyses are applied to the meta-analysis of McCall and Carriger (1993) examining the association between infants' habituation to a stimulus and their later cognitive ability (IQ). We conclude that p-uniform is a valuable technique for examining publication bias and estimating population effects in fixed-effect meta-analyses, and as a sensitivity analysis to draw inferences about publication bias.
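The core idea behind p-uniform is that, conditional on statistical significance, p-values evaluated at the true effect size are uniformly distributed. A small simulation illustrates this in a one-sided z-test setting (the true effect, threshold, and counts below are assumptions of this sketch, not the authors' implementation):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu = 1.0                          # true effect on the z-scale (assumption)
z_crit = stats.norm.isf(0.05)     # one-sided 5% significance threshold

# simulate many studies; only the significant ones get "published"
z = rng.normal(mu, 1.0, 20000)
z_sig = z[z > z_crit]

# conditional probability transform evaluated at the TRUE effect:
# q = P(Z > z_obs | Z > z_crit, mean = mu), which is Uniform(0, 1)
q = stats.norm.sf(z_sig - mu) / stats.norm.sf(z_crit - mu)
```

Evaluated at any other candidate effect size the q-values are no longer uniform, which is what p-uniform exploits both to estimate the population effect and to test for publication bias.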
Saha, Ranajit; Pan, Sudip; Chattaraj, Pratim K
2016-11-05
The validity of the maximum hardness principle (MHP) is tested in the cases of 50 chemical reactions, most of which are organic in nature and exhibit anomeric effect. To explore the effect of the level of theory on the validity of MHP in an exothermic reaction, B3LYP/6-311++G(2df,3pd) and LC-BLYP/6-311++G(2df,3pd) (def2-QZVP for iodine and mercury) levels are employed. Different approximations like the geometric mean of hardness and combined hardness are considered in case there are multiple reactants and/or products. It is observed that, based on the geometric mean of hardness, while 82% of the studied reactions obey the MHP at the B3LYP level, 84% of the reactions follow this rule at the LC-BLYP level. Most of the reactions possess the hardest species on the product side. A 50% null hypothesis is rejected at a 1% level of significance.
Wang, Q.; Denton, D.L.; Shukla, R.
2000-01-01
As a follow-up to the recommendations of the September 1995 SETAC Pellston Workshop on Whole Effluent Toxicity (WET) regarding test methods and appropriate endpoints, this paper discusses the applications and statistical properties of using a statistical criterion of minimum significant difference (MSD). The authors examined the upper limits of acceptable MSDs as an acceptance criterion in the case of normally distributed data. The implications of this approach are examined in terms of false negative rate as well as false positive rate. Results indicated that the proposed approach has reasonable statistical properties. Reproductive data from short-term chronic WET tests with Ceriodaphnia dubia were used to demonstrate the applications of the proposed approach. The data were collected by the North Carolina Department of Environment, Health, and Natural Resources (Raleigh, NC, USA) as part of their National Pollutant Discharge Elimination System program.
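A t-based MSD for a single control-versus-treatment comparison with equal replication can be computed as below. The numbers are hypothetical, and in multi-treatment WET designs a Dunnett critical value would replace the plain t quantile:

```python
import math
from scipy import stats

def msd(mse, n_per_group, df_error, alpha=0.05):
    """t-based minimum significant difference for one control-vs-treatment
    comparison with equal group sizes (one-sided, as in WET hypothesis
    tests): the smallest mean difference the test can declare significant."""
    t_crit = stats.t.isf(alpha, df_error)
    return t_crit * math.sqrt(2.0 * mse / n_per_group)

# hypothetical Ceriodaphnia dubia reproduction data: MSE from the ANOVA,
# 10 replicates per concentration, 40 error degrees of freedom
m = msd(mse=25.0, n_per_group=10, df_error=40)
```

Dividing the MSD by the control mean gives the "percent MSD" often used as a test-acceptability ceiling, since a large MSD means the test lacked the power to detect biologically meaningful effects.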
Escoto Ponce de León, M C; Mancilla Díaz, J M; Camacho Ruiz, E J
2008-09-01
The current study used clinical and statistical significance tests to investigate the effects of two forms (didactic or interactive) of a universal prevention program on attitudes about shape and weight, eating behaviors, the influence of body aesthetic models, and self-esteem. Three schools were randomly assigned to one of three conditions: interactive, didactic, or control. Children (61 girls and 59 boys, age 9-11 years) were evaluated at pre-intervention, post-intervention, and 6-month follow-up. Programs comprised eight 90-min sessions. Statistical and clinical significance tests showed more changes in boys and girls with the interactive program versus the didactic intervention and control groups. The findings support the use of interactive programs that highlight identified risk factors and construction of identity based on positive traits distinct from physical appearance.
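Clinical significance in prevention-outcome studies is commonly assessed with the Jacobson-Truax reliable change index; the sketch below uses hypothetical scores and reliability, since the abstract does not state which clinical-significance procedure was applied:

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax RCI: pre-to-post change scaled by the standard error
    of the difference; |RCI| > 1.96 indicates change unlikely to be due to
    measurement error alone."""
    se_measure = sd_pre * math.sqrt(1.0 - reliability)   # SE of measurement
    s_diff = math.sqrt(2.0 * se_measure ** 2)            # SE of a difference
    return (post - pre) / s_diff

# hypothetical attitude-scale scores (lower = improvement)
rci = reliable_change_index(pre=30.0, post=22.0, sd_pre=6.0, reliability=0.85)
```

An RCI beyond ±1.96 flags a reliable individual change, which is then typically combined with a clinical cutoff to classify participants as recovered, improved, unchanged, or deteriorated.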
Iacucci, Ernesto; Zingg, Hans H; Perkins, Theodore J
2012-01-01
High-throughput molecular biology studies, such as microarray assays of gene expression, two-hybrid experiments for detecting protein interactions, or ChIP-Seq experiments for transcription factor binding, often result in an "interesting" set of genes - say, genes that are co-expressed or bound by the same factor. One way of understanding the biological meaning of such a set is to consider what processes or functions, as defined in an ontology, are over-represented (enriched) or under-represented (depleted) among genes in the set. Usually, the significance of enrichment or depletion scores is based on simple statistical models and on the membership of genes in different classifications. We consider the more general problem of computing p-values for arbitrary integer additive statistics, or weighted membership functions. Such membership functions can be used to represent, for example, prior knowledge on the role of certain genes or classifications, differential importance of different classifications or genes to the experimenter, hierarchical relationships between classifications, or different degrees of interestingness or evidence for specific genes. We describe a generic dynamic programming algorithm that can compute exact p-values for arbitrary integer additive statistics. We also describe several optimizations for important special cases, which can provide orders-of-magnitude speed up in the computations. We apply our methods to datasets describing oxidative phosphorylation and parturition and compare p-values based on computations of several different statistics for measuring enrichment. We find major differences between p-values resulting from these statistics, and that some statistics recover "gold standard" annotations of the data better than others. Our work establishes a theoretical and algorithmic basis for far richer notions of enrichment or depletion of gene sets with respect to gene ontologies than has previously been available.
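A minimal version of such a dynamic program might look as follows: it counts, for every subset size and weight sum, the number of gene subsets achieving that sum, so an exact upper-tail p-value follows by division. This is illustrative only; the paper's optimizations for special cases are omitted:

```python
import math

def exact_pvalue(weights, k, observed):
    """Exact P(statistic >= observed), where the statistic is the sum of
    integer weights over a uniformly random k-element gene subset."""
    n = len(weights)
    lo = sum(w for w in weights if w < 0)   # most negative reachable sum
    hi = sum(w for w in weights if w > 0)   # most positive reachable sum
    width = hi - lo
    # table[j][s - lo] = number of j-element subsets with weight sum s
    table = [[0] * (width + 1) for _ in range(k + 1)]
    table[0][-lo] = 1
    for w in weights:
        for j in range(k, 0, -1):           # descend j: each gene used once
            row, prev = table[j], table[j - 1]
            if w >= 0:
                for s in range(width, w - 1, -1):
                    row[s] += prev[s - w]
            else:
                for s in range(0, width + w + 1):
                    row[s] += prev[s - w]
    start = max(0, observed - lo)
    tail = sum(table[k][start:])
    return tail / math.comb(n, k)
```

With 0/1 weights this reduces to the familiar hypergeometric enrichment test; general integer weights let one encode prior knowledge, differential importance, or graded evidence for specific genes, as described above.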
NASA Astrophysics Data System (ADS)
Baluev, Roman V.
2013-11-01
We consider the `multifrequency' periodogram, in which the putative signal is modelled as a sum of two or more sinusoidal harmonics with independent frequencies. It is useful in cases when the data may contain several periodic components, especially when their interaction with each other and with the data sampling patterns might produce misleading results. Although the multifrequency statistic itself was constructed earlier, for example by G. Foster in his CLEANest algorithm, its probabilistic properties (the detection significance levels) are still poorly known and much of what is deemed known is not rigorous. These detection levels are nonetheless important for data analysis. We argue that to prove the simultaneous existence of all n components revealed in a multiperiodic variation, it is mandatory to apply at least 2n - 1 significance tests, among which most involve various multifrequency statistics, and only n tests are single-frequency ones. The main result of this paper is an analytic estimation of the statistical significance of the frequency tuples that the multifrequency periodogram can reveal. Using the theory of extreme values of random fields (the generalized Rice method), we find a useful approximation to the relevant false alarm probability. For the double-frequency periodogram, this approximation is given by the elementary formula (π/16)W²e^(-z)z², where W denotes the normalized width of the settled frequency range, and z is the observed periodogram maximum. We carried out intensive Monte Carlo simulations to show that the practical quality of this approximation is satisfactory. A similar analytic expression for the general multifrequency periodogram is also given, although with less numerical verification.
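The elementary double-frequency approximation quoted above translates directly into code (a sketch; W and z are simply supplied as numbers, and the example values are illustrative):

```python
import math

def fap_double_frequency(z, W):
    """Analytic false-alarm probability approximation for the
    double-frequency periodogram: FAP ~ (pi/16) * W^2 * exp(-z) * z^2,
    where W is the normalized width of the settled frequency range and
    z is the observed periodogram maximum."""
    return (math.pi / 16.0) * W ** 2 * math.exp(-z) * z ** 2

fap = fap_double_frequency(z=30.0, W=1000.0)   # illustrative values
```

As with single-frequency false-alarm levels, the approximation is most useful in the small-FAP regime that extreme-value arguments are designed for; values approaching unity should not be over-interpreted.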
Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.
2009-01-01
In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high-throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency, also tested at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
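A toy version of an enrichment-weighted additive score in the spirit of WFS might look like this. The feature names and counts are hypothetical, and a one-sided Fisher's exact test stands in for whatever enrichment statistic the authors actually used:

```python
import math
from scipy import stats

def feature_weights(features_toxic, features_safe, all_features):
    """Weight each structural feature by the significance of its enrichment
    among toxic compounds (one-sided Fisher's exact test, -log10 p)."""
    n_tox, n_safe = len(features_toxic), len(features_safe)
    weights = {}
    for f in all_features:
        a = sum(f in c for c in features_toxic)      # toxic with feature
        c = sum(f in x for x in features_safe)       # safe with feature
        _, p = stats.fisher_exact([[a, n_tox - a], [c, n_safe - c]],
                                  alternative="greater")
        weights[f] = -math.log10(p)
    return weights

def score(compound_features, weights):
    """Additive toxicity score: sum of enriched-feature weights present."""
    return sum(weights.get(f, 0.0) for f in compound_features)

# toy compound set; structural features represented as strings (hypothetical)
toxic = [{"nitro", "epoxide"}, {"nitro"}, {"epoxide", "halide"}, {"nitro", "halide"}]
safe = [{"hydroxyl"}, {"hydroxyl", "halide"}, set(), {"halide"}]
w = feature_weights(toxic, safe, {"nitro", "epoxide", "halide", "hydroxyl"})
```

The additivity is what makes such a model interpretable: a new compound's score decomposes feature by feature, so one can see exactly which substructures drove a high predicted toxicity.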
NASA Astrophysics Data System (ADS)
Ahmed, Sheehan H.; Brooks, Alyson M.; Christensen, Charlotte R.
2017-04-01
We investigate whether the inclusion of baryonic physics influences the formation of thin, coherently rotating planes of satellites such as those seen around the Milky Way and Andromeda. For four Milky Way-mass simulations, each run both as dark matter-only and with baryons included, we are able to identify a planar configuration that significantly maximizes the number of plane satellite members. The maximum plane member satellites are consistently different between the dark matter-only and baryonic versions of the same run due to the fact that satellites are both more likely to be destroyed and to infall later in the baryonic runs. Hence, studying satellite planes in dark matter-only simulations is misleading, because they will be composed of different satellite members than those that would exist if baryons were included. Additionally, the destruction of satellites in the baryonic runs leads to less radially concentrated satellite distributions, a result that is critical to making planes that are statistically significant compared to a random distribution. Since all planes pass through the centre of the galaxy, it is much harder to create a plane of a given height from a random distribution if the satellites have a low radial concentration. We identify Andromeda's low radial satellite concentration as a key reason why the plane in Andromeda is highly significant. Despite this, when corotation is considered, none of the satellite planes identified for the simulated galaxies are as statistically significant as the observed planes around the Milky Way and Andromeda, even in the baryonic runs.
NASA Astrophysics Data System (ADS)
Casati, Michele
2014-05-01
The assertion that solar activity may play a significant role in triggering large volcanic eruptions has been, and continues to be, discussed by many geophysicists. Numerous scientific papers have established a possible correlation between these events and the electromagnetic coupling between the Earth and the Sun, but none of them has been able to highlight a possible statistically significant relationship between large volcanic eruptions and any of the series, such as geomagnetic activity, solar wind, or sunspot number. In our research, we compare 148 volcanic eruptions with index VEI4 and the 37 major historical volcanic eruptions with index equal to or greater than VEI5, recorded from 1610 to 2012, with the sunspot number. Taking as the threshold value a monthly sunspot number of 46 (recorded during the great VEI6 eruption of Krakatoa, August 1883), we note some possible relationships and conduct a statistical test. • Of the 31 historical large volcanic eruptions with index VEI5+ recorded between 1610 and 1955, 29 were recorded when the SSN<46. The remaining 2 eruptions were recorded not when the SSN<46 but rather during solar maxima: the solar cycle of 1739 and solar cycle No. 14 (the Shikotsu eruption of 1739 and Ksudach in 1907). • Of the 8 historical large volcanic eruptions with index VEI6+ recorded from 1610 to the present, 7 were recorded with SSN<46 and, more specifically, within the three known large solar minima: Maunder (1645-1710), Dalton (1790-1830), and the solar minima that occurred between 1880 and 1920. As the only exception, we note the eruption of Pinatubo in June 1991, recorded at the solar maximum of cycle 22. • Of the 6 major historical volcanic eruptions with index VEI5+ recorded after 1955, 5 were recorded not during periods of low solar activity but rather during the solar maxima of cycles 19, 21, and 22. The significance tests, conducted with the chi-square statistic χ² = 7.782, detect a
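The chi-square computation behind such a test has the following shape; the 2x2 counts here are hypothetical stand-ins, since the abstract does not give the full contingency table that yields χ² = 7.782:

```python
from scipy import stats

# hypothetical 2x2 table: eruption counts split by the SSN < 46 threshold
#                 SSN < 46   SSN >= 46
observed = [[29, 2],     # large eruptions (VEI5+), 1610-1955 (from abstract)
            [85, 63]]    # smaller eruptions: an illustrative, made-up split
chi2, p, dof, expected = stats.chi2_contingency(observed, correction=False)
```

The test compares the observed counts against the counts expected if eruption size and solar activity were independent; a chi-square statistic above the 1-degree-of-freedom critical value of 3.84 rejects independence at the 5% level.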
Yokoyama, Shozo; Takenaka, Naomi
2005-04-01
Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.
Not Available
2008-09-01
This case study describes how the U. S. Steel Minntac plant in Mt. Iron, Minnesota, achieved annual savings of $760,000 and 95,000 MMBtu after receiving a DOE Save Energy Now energy assessment and implementing recommendations to improve the efficiency of its process heating system.
ERIC Educational Resources Information Center
Alghamdi, Fatimah M. A.; Siddiqui, Ozma
2016-01-01
This study investigates an institutionalized remedial approach adopted by an English language institute (ELI) at a Saudi university to support struggling foundation-year students who achieve low grades or fail to pass a certain level of the English language program. The study utilizes semi-structured interviews to…
Wang, Xiaodong; Ratnaweera, Harsha; Holm, Johan Abdullah; Olsbu, Vibeke
2017-02-07
On-line monitoring of chemical oxygen demand (COD) and total phosphorus (TP) remains a barrier to better control of aeration and chemical dosing in wastewater treatment plants. In this study, we applied principal component analysis (PCA) to identify significant variables for COD and TP prediction. Multiple regression was then applied to the variables suggested by PCA to predict influent COD and TP. Moreover, a model of a full-scale wastewater treatment plant with a moving bed bioreactor (MBBR) and a ballasted separation process was developed to simulate treatment performance. The COD and TP values predicted by multiple regression served as model input for dynamic simulation, and the wastewater characteristics of the plant and the MBBR model parameters were used for model calibration. As a result, R² values of predicted versus measured COD and TP are 81.6% and 77.2%, respectively. The model output in terms of sludge production and effluent COD based on predicted input data fitted the measured data well, which makes model predictive control of aeration and coagulant dosing feasible in practice. This study provides a feasible and economical approach to overcoming the monitoring and modelling restrictions that limit model predictive control of wastewater treatment plants.
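The PCA-plus-regression pipeline described above can be sketched with NumPy alone; the sensor variables, loadings, and noise levels below are synthetic stand-ins for the plant's on-line measurements:

```python
import numpy as np

def pca_scores(X, n_comp):
    """Scores of the first n_comp principal components of standardized X."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[:n_comp].T

rng = np.random.default_rng(2)
n = 200
latent = rng.normal(size=(n, 2))                    # underlying plant state
loadings = np.array([[1.0, 0.8, 0.2, 0.5],
                     [0.1, 0.6, 1.0, 0.7]])
sensors = latent @ loadings + rng.normal(0, 0.1, (n, 4))  # on-line signals
cod = 120 + 15 * latent[:, 0] + 10 * latent[:, 1] + rng.normal(0, 3, n)

scores = pca_scores(sensors, 2)                 # PCA keeps dominant variation
A = np.column_stack([np.ones(n), scores])       # design matrix with intercept
beta, *_ = np.linalg.lstsq(A, cod, rcond=None)  # multiple linear regression
r2 = 1 - np.sum((cod - A @ beta) ** 2) / np.sum((cod - cod.mean()) ** 2)
```

Regressing on a few principal-component scores instead of all raw sensor channels suppresses the collinearity typical of on-line water-quality signals, which is the practical motivation for combining the two methods.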
NASA Astrophysics Data System (ADS)
Wang, H. J.; Shi, W. L.; Chen, X. H.
2006-05-01
The West Development Policy being implemented in China is causing significant land use and land cover (LULC) changes in West China. With the up-to-date satellite database of the Global Land Cover Characteristics Database (GLCCD) that characterizes the lower boundary conditions, the regional climate model RIEMS-TEA is used to simulate possible impacts of the significant LULC variation. The model was run for five continuous three-month periods from 1 June to 1 September of 1993, 1994, 1995, 1996, and 1997, and the results of the five groups are examined by means of a Student's t-test to identify the statistical significance of regional climate variation. The main results are: (1) The regional climate is affected by the LULC variation because the equilibrium of water and heat transfer in the air-vegetation interface is changed. (2) The integrated impact of the LULC variation on regional climate is not limited to West China where the LULC varies, but extends to some areas in the model domain where the LULC does not vary at all. (3) The East Asian monsoon system and its vertical structure are adjusted by the large-scale LULC variation in western China, where the consequences are the enhancement of westward water vapor transfer from the east and the relevant increase of wet-hydrostatic energy in the middle-upper atmospheric layers. (4) The ecological engineering in West China affects significantly the regional climate in Northwest China, North China and the middle-lower reaches of the Yangtze River; there are obvious effects in South, Northeast, and Southwest China, but minor effects in Tibet.
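The significance screening described above amounts to a paired Student's t-test across the five simulated summers. A minimal sketch with made-up area-mean values:

```python
from scipy import stats

# hypothetical area-mean summer rainfall (mm/day) for five simulated years
control = [4.1, 3.8, 4.4, 4.0, 3.9]        # original land cover
changed = [3.6, 3.3, 4.0, 3.5, 3.5]        # modified land cover
t, p = stats.ttest_rel(control, changed)   # paired across the same years
```

Pairing by year removes the large interannual variability shared by both runs, so even with only five summers a consistent land-cover signal can reach significance.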
Best, R; Harrell, A; Geesey, C; Libby, B; Wijesooriya, K
2014-06-15
Purpose: The purpose of this study is to inter-compare and find statistically significant differences between flattened-field fixed-beam (FB) IMRT and flattening-filter-free (FFF) volumetric modulated arc therapy (VMAT) for stereotactic body radiation therapy (SBRT). Methods: SBRT plans using FB IMRT and FFF VMAT were generated for fifteen SBRT lung patients using 6 MV beams. For each patient, both IMRT and VMAT plans were created for comparison. Plans were generated utilizing RTOG 0915 (peripheral, 10 patients) and RTOG 0813 (medial, 5 patients) lung protocols. Target dose, critical structure dose, and treatment time were compared and tested for statistical significance. Parameters of interest included prescription isodose surface coverage, target dose heterogeneity, high dose spillage (location and volume), low dose spillage (location and volume), lung dose spillage, and critical structure maximum- and volumetric-dose limits. Results: For all criteria, we found equivalent or higher conformality with VMAT plans as well as reduced critical structure doses. Several differences passed a Student's t-test of significance: VMAT reduced the high dose spillage, evaluated with conformality index (CI), by an average of 9.4%±15.1% (p=0.030) compared to IMRT. VMAT plans reduced the lung volume receiving 20 Gy by 16.2%±15.0% (p=0.016) compared with IMRT. For the RTOG 0915 peripheral lesions, the volumes of lung receiving 12.4 Gy and 11.6 Gy were reduced by 27.0%±13.8% and 27.5%±12.6% (for both, p<0.001) in VMAT plans. Of the 26 protocol pass/fail criteria, VMAT plans were able to achieve an average of 0.2±0.7 (p=0.026) more constraints than the IMRT plans. Conclusions: FFF VMAT has dosimetric advantages over fixed-beam IMRT for lung SBRT. Significant advantages included increased dose conformity, and reduced organs-at-risk doses. The overall improvements in terms of protocol pass/fail criteria were more modest and will require more patient data to establish difference
ERIC Educational Resources Information Center
Nguyen, ThuyUyen H.; Charity, Ian; Robson, Andrew
2016-01-01
This study investigates students' perceptions of computer-based learning environments, their attitude towards business statistics, and their academic achievement in higher education. Guided by learning environments concepts and attitudinal theory, a theoretical model was proposed with two instruments, one for measuring the learning environment and…
NASA Astrophysics Data System (ADS)
Govindan, R. B.; Al-Shargabi, Tareq; Andescavage, Nickie N.; Metzler, Marina; Lenin, R. B.; Plessis, Adré du
2017-01-01
Phase differences of two signals in perfect synchrony exhibit a narrow band distribution, whereas the phase differences of two asynchronous signals exhibit a uniform distribution. We assess the statistical significance of the phase synchronization between two signals by using a signed rank test to compare the distribution of their phase differences to the theoretically expected uniform distribution for two asynchronous signals. Using numerical simulation of a second-order autoregressive (AR2) process, we show that the proposed approach correctly identifies the coupling between the AR2 process and the driving white noise. We also identify the optimal p-value that distinguishes coupled scenarios from uncoupled ones. To identify the limiting cases, we study the phase synchronization between two independent white noises as a function of the bandwidth of the filter in a second simulation. We identify the frequency bandwidth below which the proposed approach fails and suggest using a data-driven approach for those scenarios. Finally, we demonstrate the application of this approach to study the coupling between beat-to-beat cardiac intervals and continuous blood pressure obtained from critically ill infants to characterize the baroreflex function.
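The essence of the test is comparing an observed phase-difference distribution against the uniform distribution expected for asynchronous signals. The sketch below uses directly simulated phase differences rather than filtered signals, and a Kolmogorov-Smirnov test as a simple stand-in for the paper's signed-rank procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 2000

# synchronized pair: phase differences cluster near zero (wrapped Gaussian)
dphi_sync = np.angle(np.exp(1j * rng.normal(0.0, 0.3, n)))
# asynchronous pair: phase differences uniform on [-pi, pi)
dphi_async = rng.uniform(-np.pi, np.pi, n)

# compare each empirical distribution with the uniform distribution
# theoretically expected for asynchronous signals
uniform_cdf = stats.uniform(-np.pi, 2 * np.pi).cdf
p_sync = stats.kstest(dphi_sync, uniform_cdf).pvalue
p_async = stats.kstest(dphi_async, uniform_cdf).pvalue
```

A tiny p-value for the synchronized pair rejects the asynchronous (uniform) null, while the asynchronous pair remains consistent with it.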
ERIC Educational Resources Information Center
Benton, Tom; White, Kerensa
2007-01-01
This report summarises key findings from the statistical aspects of the research conducted to assess the impact of the pilot on the attainment of bilingual pupils in participating primary schools. Research was conducted by a team at the National Foundation for Educational Research (NFER) between 2004 and 2006 on behalf of the Department for…
NASA Technical Reports Server (NTRS)
Friedlander, Alan L.; Harry, David P., III
1960-01-01
An exploratory analysis of vehicle guidance during the approach to a target planet is presented. The objective of the guidance maneuver is to guide the vehicle to a specific perigee distance with a high degree of accuracy and minimum corrective velocity expenditure. The guidance maneuver is simulated by considering the random sampling of real measurements with significant error and reducing this information to prescribe appropriate corrective action. The instrumentation system assumed includes optical and/or infrared devices to indicate range and a reference angle in the trajectory plane. Statistical results are obtained by Monte-Carlo techniques and are shown as the expectation of guidance accuracy and velocity-increment requirements. Results are nondimensional and applicable to any planet within limits of two-body assumptions. The problem of determining how many corrections to make and when to make them is a consequence of the conflicting requirements of accurate trajectory determination and propulsion. Optimum values were found for a vehicle approaching a planet along a parabolic trajectory with an initial perigee distance of 5 radii and a target perigee of 1.02 radii. In this example measurement errors were less than 1 minute of arc. Results indicate that four corrections applied in the vicinity of 50, 16, 15, and 1.5 radii, respectively, yield minimum velocity-increment requirements. Thrust devices capable of producing a large variation of velocity-increment size are required. For a vehicle approaching the earth, miss distances within 32 miles are obtained with 90-percent probability. Total velocity increments used in guidance are less than 3300 feet per second with 90-percent probability. It is noted that the above representative results are valid only for the particular guidance scheme hypothesized in this analysis. A parametric study is presented which indicates the effects of measurement error size, initial perigee, and initial energy on the guidance
ERIC Educational Resources Information Center
Jacko, Edward J.; Huck, Schuyler W.
The Alpert-Haber Achievement Anxiety Test was developed to measure the extent to which individuals experience test anxiety. In at least two published studies, the authors claim to have used the test when in fact the response format was changed from that used in the original instrument and the "buffer" items were omitted. To investigate…
Ioannidis, John P. A.
2017-01-01
A typical rule that has been used for the endorsement of new medications by the Food and Drug Administration is to have two trials, each convincing on its own, demonstrating effectiveness. “Convincing” may be subjectively interpreted, but the use of p-values and the focus on statistical significance (in particular with p < .05 being deemed significant) is pervasive in clinical research. Therefore, in this paper, we calculate with simulations what it means to have exactly two trials, each with p < .05, in terms of the actual strength of evidence quantified by Bayes factors. Our results show that different cases where two trials have a p-value below .05 have wildly differing Bayes factors. Bayes factors of at least 20 in favor of the alternative hypothesis are not necessarily achieved and they fail to be reached in a large proportion of cases, in particular when the true effect size is small (0.2 standard deviations) or zero. In a non-trivial number of cases, evidence actually points to the null hypothesis, in particular when the true effect size is zero, when the number of trials is large, and when the number of participants in both groups is low. We recommend use of Bayes factors as a routine tool to assess endorsement of new medications, because Bayes factors consistently quantify strength of evidence. Use of p-values may lead to paradoxical and spurious decision-making regarding the use of new medications. PMID:28273140
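A stripped-down version of such a simulation, for a known-variance z-test and a point alternative of 0.2 standard deviations (all settings hypothetical, and a simple likelihood-ratio Bayes factor rather than the paper's exact choice):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 50                       # participants per arm in each trial (assumed)
se = np.sqrt(2.0 / n)        # SE of the standardized mean difference
mu1 = 0.2 / se               # z-scale mean under H1: true effect 0.2 SD
true_effect = 0.0            # simulate a truly null medication

bfs = []
for _ in range(20000):
    z = rng.normal(true_effect / se, 1.0, size=2)    # two independent trials
    if (2 * stats.norm.sf(np.abs(z)) < 0.05).all():  # both reach p < .05
        # point-alternative likelihood-ratio Bayes factor
        # N(z; mu1, 1) / N(z; 0, 1), multiplied across the two trials
        bfs.append(np.prod(np.exp(z * mu1 - mu1 ** 2 / 2.0)))

frac_weak = np.mean([bf < 20 for bf in bfs])   # "two significant trials"
                                               # that still lack BF >= 20
```

With these settings most pairs of nominally convincing trials fail to reach a Bayes factor of 20, and pairs whose effects point in opposite directions actually favor the null, echoing the paper's conclusion.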
Fisher, Aaron; Anderson, G Brooke; Peng, Roger; Leek, Jeff
2014-01-01
Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI [45.1%-49.7%]) of statistically significant relationships, and 74.6% (95% CI [72.5%-76.6%]) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest: (1) that evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users, (2) data analysts can be trained to improve detection of statistically significant results with practice, but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values which is available here: http://glimmer.rstudio.com/afisher/EDA/.
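The disconnect between visual impression and statistical significance is easy to reproduce: the p-value attached to a fixed sample correlation depends strongly on n, so a cloud that looks like pure noise can be highly significant. A small sketch using the standard t-transform of Pearson's r (the r and n values are illustrative):

```python
import math
from scipy import stats

def p_of_r(r, n):
    """Two-sided p-value for a sample Pearson correlation r with n points."""
    t = r * math.sqrt((n - 2) / (1.0 - r ** 2))
    return 2 * stats.t.sf(abs(t), n - 2)

p_small = p_of_r(0.1, 20)     # r = 0.1 looks like noise, and is n.s. here
p_large = p_of_r(0.1, 2000)   # identical-looking scatter, highly significant
```

This is exactly the small-effect regime where the study found subjects' intuitions to be least reliable.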
Greenhalgh, T.
1997-01-01
It is possible to be seriously misled by taking the statistical competence (and/or the intellectual honesty) of authors for granted. Some common errors committed (deliberately or inadvertently) by the authors of papers are given in the final box. PMID:9277611
Kurtz, S.E.; Fields, D.E.
1983-10-01
This report describes a version of the TERPED/P computer code that is very useful for small data sets. A new algorithm for determining the Kolmogorov-Smirnov (KS) statistics is used to extend program applicability. The TERPED/P code facilitates the analysis of experimental data and assists the user in determining its probability distribution function. Graphical and numerical tests are performed interactively in accordance with the user's assumption of normally or log-normally distributed data. Statistical analysis options include computation of the chi-square statistic and the KS one-sample test statistic and the corresponding significance levels. Cumulative probability plots of the user's data are generated either via a local graphics terminal, a local line printer or character-oriented terminal, or a remote high-resolution graphics device such as the FR80 film plotter or the Calcomp paper plotter. Several useful computer methodologies suffer from limitations of their implementations of the KS nonparametric test. This test is one of the more powerful analysis tools for examining the validity of an assumption about the probability distribution of a set of data. KS algorithms are found in other analysis codes, including the Statistical Analysis Subroutine (SAS) package and earlier versions of TERPED. The inability of these algorithms to generate significance levels for sample sizes less than 50 has limited their usefulness. The release of the TERPED code described herein contains algorithms that compute the KS statistic and significance level for data sets of as few as three points, if the user wishes. Values computed for the KS statistic are within 3% of the correct value for all data set sizes.
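In modern tooling this small-sample capability is routine: SciPy's one-sample KS test selects an exact null distribution for tiny n. A sketch with just three observations against a hypothesized normal distribution (values illustrative):

```python
import numpy as np
from scipy import stats

data = np.array([9.8, 10.4, 10.1])        # only three observations
# one-sample KS test against N(mean=10, sd=0.5); for samples this small
# SciPy computes the exact small-sample null distribution of the statistic
res = stats.kstest(data, stats.norm(loc=10.0, scale=0.5).cdf)
```

Here the hypothesized distribution fits the three points well, so the KS statistic is small and the exact significance level gives no grounds to reject the assumed distribution.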
Smylie, Janet; Firestone, Michelle
2015-01-01
Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is a double standard however with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges including misclassification errors and non-response bias systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations. PMID:26793283
Liu, Wei; Ding, Jinhui
2016-05-25
The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
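The complete-case starting point of such an analysis can be sketched as follows. This is only an illustrative completers-only comparison with invented outcome data and a plain Welch t-test, not the authors' full GCC procedure, which additionally bounds the impact of the non-completers:

```python
# Completers-only (complete-case) comparison of two trial arms; the outcome
# values are invented, and the test shown is an ordinary Welch t-test, not
# the generalized complete-case (GCC) analysis proposed by the authors.
from scipy import stats

test_drug_completers  = [5.1, 6.3, 4.8, 7.0, 5.9]
comparator_completers = [4.0, 4.6, 5.2, 3.9, 4.4]

t_stat, p_value = stats.ttest_ind(test_drug_completers,
                                  comparator_completers,
                                  equal_var=False)
print(t_stat, p_value)
```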
Nhu, Nguyen Van; Singh, Mahendra; Leonhard, Kai
2008-05-08
We have computed molecular descriptors for sizes, shapes, charge distributions, and dispersion interactions for 67 compounds using quantum chemical ab initio and density functional theory methods. For the same compounds, we have fitted the three perturbed-chain polar statistical associating fluid theory (PCP-SAFT) equation of state (EOS) parameters to experimental data and have performed a statistical analysis for relations between the descriptors and the EOS parameters. On this basis, an analysis of the physical significance of the parameters, the limits of the present descriptors, and the PCP-SAFT EOS has been performed. The result is a method that can be used to estimate the vapor pressure curve including the normal boiling point, the liquid volume, the enthalpy of vaporization, the critical data, mixture properties, and so on. When only two of the three parameters are predicted and one is adjusted to experimental normal boiling point data, excellent predictions of all investigated pure compound and mixture properties are obtained. We are convinced that the methodology presented in this work will lead to new EOS applications as well as improved EOS models whose predictive performance is likely to surpass that of most present quantum chemically based, quantitative structure-property relationship, and group contribution methods for a broad range of chemical substances.
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M
2015-03-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method.
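Although the article implements its method in SPSS, the core idea of comparing a baseline phase with an intervention phase can be sketched as a dummy-coded regression; the symptom scores below are invented for illustration:

```python
# Phase-comparison regression for univariate single-case data: phase = 0
# codes the baseline (A) observations, phase = 1 the intervention (B)
# observations; the weekly symptom scores are hypothetical.
import numpy as np

baseline     = np.array([22., 24., 23., 25., 24.])
intervention = np.array([20., 18., 15., 14., 12.])

y     = np.concatenate([baseline, intervention])
phase = np.concatenate([np.zeros(baseline.size), np.ones(intervention.size)])
X     = np.column_stack([np.ones_like(y), phase])

# OLS: beta[0] is the baseline mean, beta[1] the mean change under treatment
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # beta[1] < 0 indicates symptom improvement
```

A full analysis would also model serial dependence between sessions, which this two-parameter sketch ignores.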
Hirata, Maki; Nakajima, Nobuyuki; Saito, Kosuke; Hashimoto, Hiroyuki; Soeda, Shuichi; Uchiyama, Yoshiyasu; Watanabe, Masahiko
2016-01-01
Losses in vital functions of the somatic motor and sensory nervous system are induced by severe long-gap peripheral nerve transection injury. In such cases, autologous nerve grafts are the gold standard treatment, despite the unavoidable sacrifice of other healthy functions, and the prognosis is not always favorable. Here, we use human skeletal muscle-derived stem cells (Sk-SCs) to reconstitute function after long nerve-gap injury. Muscle samples were obtained from the amputated legs of 9 patients following unforeseen accidents. The Sk-SCs were isolated using conditioned collagenase solution and sorted as CD34+/45- (Sk-34) and CD34-/45-/29+ (Sk-DN/29+) cells. Cells were separately cultured/expanded under optimal conditions for 2 weeks, then injected into the athymic nude mouse sciatic nerve long-gap model (7-mm) bridging an acellular conduit. After 8–12 weeks, active cell engraftment was observed only in the Sk-34 cell transplanted group, showing preferential differentiation into Schwann cells and perineurial/endoneurial cells, as well as formation of the myelin sheath and perineurium/endoneurium surrounding regenerated axons, resulting in 87% numerical recovery. Differentiation into vascular cell lineages (pericytes and endothelial cells) was also observed. A significant tetanic tension recovery (over 90%) of downstream muscles following electrical stimulation of the sciatic nerve (at the upper portion of the gap) was also achieved. In contrast, Sk-DN/29+ cells were completely eliminated during the first 4 weeks, but relatively higher numerical (83% vs. 41% in axons) and functional (80% vs. 60% in tetanus) recovery than control was observed. Notably, a significant increase in the formation of vascular networks in the conduit during the early stage (first 2 weeks) of recovery was observed in both groups, with the expression of key factors (at mRNA and protein levels), suggesting paracrine effects on angiogenesis. These results suggested that the human Sk
Aouinti, Safa; Malouche, Dhafer; Giudicelli, Véronique; Kossida, Sofia; Lefranc, Marie-Paule
2015-01-01
The adaptive immune responses of humans and of other jawed vertebrate species (gnathostomata) are characterized by the B and T cells and their specific antigen receptors, the immunoglobulins (IG) or antibodies and the T cell receptors (TR) (up to 2·10¹² different IG and TR per individual). IMGT, the international ImMunoGeneTics information system (http://www.imgt.org), was created in 1989 by Marie-Paule Lefranc (Montpellier University and CNRS) to manage the huge and complex diversity of these antigen receptors. IMGT, built on the IMGT-ONTOLOGY concepts of identification (keywords), description (labels), classification (gene and allele nomenclature) and numerotation (IMGT unique numbering), is at the origin of immunoinformatics, a science at the interface between immunogenetics and bioinformatics. IMGT/HighV-QUEST, the first web portal, and so far the only one, for next generation sequencing (NGS) analysis of IG and TR, is the paradigm for immune repertoire standardized outputs and immunoprofiles of the adaptive immune responses. It provides the identification of the variable (V), diversity (D) and joining (J) genes and alleles, analysis of the V-(D)-J junction and complementarity determining region 3 (CDR3), and the characterization of the ‘IMGT clonotype (AA)’ (AA for amino acid) diversity and expression. IMGT/HighV-QUEST compares outputs of different batches, up to one million nucleotide sequences for the statistical module. These high throughput IG and TR repertoire immunoprofiles are of prime importance in vaccination, cancer, infectious diseases, autoimmunity and lymphoproliferative disorders; however, their comparative statistical analysis still remains a challenge. We present a standardized statistical procedure to analyze IMGT/HighV-QUEST outputs for the evaluation of the significance of the IMGT clonotype (AA) diversity differences in proportions, per gene of a given group, between NGS IG and TR repertoire immunoprofiles. The procedure is generic and
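The per-gene comparison of proportions can be illustrated with a basic two-proportion z-test. This is a hedged sketch with invented counts, not the published procedure, which also has to address multiple testing across genes:

```python
# Two-proportion z-test comparing the share of clonotypes assigned to one
# gene between two repertoires; the counts below are invented.
import math

def two_prop_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2 (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # = 2 * (1 - Phi(|z|))
    return z, p_value

# clonotypes for a given gene, out of all clonotypes in each repertoire
z, p_value = two_prop_z(120, 5000, 80, 5200)
print(z, p_value)
```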
NASA Astrophysics Data System (ADS)
Løvsletten, Ola; Rypdal, Martin; Rypdal, Kristoffer; Fredriksen, Hege-Beate
2015-04-01
We explore the statistics of instrumental surface temperature records on 5°× 5°, 2°× 2°, and equal-area grids. In particular, we compute the significance of deterministic trends against two parsimonious null models: auto-regressive processes of order 1, AR(1), and fractional Gaussian noises (fGn's). Both of these null models contain a memory parameter which quantifies the temporal climate variability, with white noise nested in both classes of models. Estimates of the persistence parameters show significant positive serial correlation for most grid cells, with higher persistence over oceans compared to land areas. This shows that, in a trend detection framework, we need to take into account larger spurious trends than what follows from the frequently used white noise assumption. Tested against the fGn null hypothesis, we find that ~ 68% (~ 47%) of the time series have significant trends at the 5% (1%) significance level. If we assume an AR(1) null hypothesis instead, then ~ 94% (~ 88%) of the time series have significant trends at the 5% (1%) significance level. For both null models, the locations where we do not find significant trends are mostly the ENSO regions and the North Atlantic. We try to discriminate between the two null models by means of likelihood ratios. If, at each grid point, we choose the null model preferred by the model selection test, we find that ~ 82% (~ 73%) of the time series have significant trends at the 5% (1%) significance level. We conclude that there is emerging evidence of significant warming trends also at regional scales, although with a much lower signal-to-noise ratio compared to global mean temperatures. Another finding is that many temperature records are consistent with error models for internal variability that exhibit long-range dependence, whereas the temperature fluctuations of the tropical oceans are strongly influenced by the ENSO, and therefore seemingly more consistent with random processes with short
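The idea of testing a trend against an AR(1) null can be sketched with a small Monte Carlo experiment; the persistence parameter and the synthetic "observed" series below are invented for illustration, not the paper's estimates:

```python
# Monte-Carlo trend test against an AR(1) null: compare the OLS trend of
# an observed series with the trend distribution of pure AR(1) noise.
import numpy as np

rng = np.random.default_rng(0)

def ols_trend(y):
    """Least-squares slope of y against time."""
    return np.polyfit(np.arange(y.size), y, 1)[0]

def ar1(n, phi, rng):
    """Simulate n steps of an AR(1) process with unit-variance innovations."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.standard_normal()
    return x

n, phi = 120, 0.6
observed = 0.02 * np.arange(n) + ar1(n, phi, rng)  # trend + AR(1) noise

null_trends = np.array([ols_trend(ar1(n, phi, rng)) for _ in range(2000)])
p_value = np.mean(np.abs(null_trends) >= abs(ols_trend(observed)))
print(p_value)  # fraction of null trends at least as large as observed
```

Replacing the AR(1) simulator with an fGn generator gives the corresponding long-range-dependent null; in practice phi (or the Hurst exponent) would first be estimated from the detrended record.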
NASA Astrophysics Data System (ADS)
Chen, I. W.
Since 1990, the author, a British Chinese consultant, has studied and followed the significant achievements accomplished by "non-mainstream" seismologists in earthquake prediction in China since 1970. The scientific systems used include: (1) Astronomy-seismology: the relationship between special positions of certain planets (especially the moon and another planet) relative to the seismically active areas on the earth and the occurrence times of major damaging earthquakes in these areas; the relationship between the dates of magnetic storms on the earth caused by solar flares on the sun and the occurrence dates of major damaging earthquakes on the earth; as well as certain cyclical relationships between the occurrence dates of major historical earthquakes in related areas on the earth. (2) Precursor analysis: with self-developed sensors and instruments, different from conventional seismological instruments, numerous precursors, abnormality signs, and earthquake-imminent signals were recorded. In most cases, these precursors cannot be detected by conventional seismological sensors/instruments. Through exploratory practice and theoretical studies, various relationships between different characteristics of the precursors and the occurrence time, epicenter location and magnitude of the developing earthquake were identified and can be calculated. Through approaches quite different from conventional methods, successful predictions of quite a large number of earthquakes have been achieved, including earthquakes that occurred in mainland China, Taiwan and Japan. (3) Earthquake-imminent affirmative confirmation: with a special instrument, the background of the imminent state of earthquakes can be identified, and a universal earthquake-imminent signal is further identified. It can be used to confirm whether an earlier predicted earthquake is entering its imminent state, whether it will definitely occur, or whether an earlier prediction can be released. (4) 5km, 7km and
ERIC Educational Resources Information Center
Lee, Steven K.
2002-01-01
Surveys of 105 U.S.-born, Chinese-American and Korean-American students in southern California high schools found that those who adapted to the mainstream culture while maintaining their heritage language and culture had higher academic achievement than those who wholly adopted mainstream values and lifestyles. (Contains 21 references.) (SV)
ERIC Educational Resources Information Center
Levesque, Karen; Wun, Jolene; Green, Caitlin
2010-01-01
The definition of CTE (career/technical education) used by the National Center for Education Statistics (NCES) includes, at the high school level, family and consumer sciences education, general labor market preparation, and occupational education (Bradby and Hoachlander 1999; Bradby and Hudson 2007). Most researchers focus on occupational…
ERIC Educational Resources Information Center
Block, Robert M.
2012-01-01
The use of open-book tests, closed-book tests, and notecards on tests in an introductory statistics course is described in this article. A review of the literature shows that open-book assessments are universally recognized to reduce anxiety. The literature is mixed however on whether deeper learning or better preparation occurs with open-book…
ERIC Educational Resources Information Center
Niculescu, Alexandra C.; Templelaar, Dirk; Leppink, Jimmie; Dailey-Hebert, Amber; Segers, Mien; Gijselaers, Wim
2015-01-01
Introduction: This study examined the predictive value of four learning-related emotions--Enjoyment, Anxiety, Boredom and Hopelessness for achievement outcomes in the first year of study at university. Method: We used a large sample (N = 2337) of first year university students enrolled over three consecutive academic years in a mathematics and…
Minor changes in the indicator used to measure fine PM, which cause only modest changes in mass concentrations, can lead to dramatic changes in the statistical relationship of fine PM mass with cardiovascular mortality. An epidemiologic study in Phoenix (Mar et al., 2000), augme...
Musavian, Hanieh S; Butt, Tariq M; Larsen, Annette Baltzer; Krebs, Niels
2015-02-01
Food contact surfaces require rigorous sanitation procedures for decontamination, although these methods very often fail to efficiently clean and disinfect surfaces that are visibly contaminated with food residues and possible biofilms. In this study, the results of a short treatment (1 to 2 s) of combined steam (95°C) and ultrasound (SonoSteam) of industrial fish and meat transportation boxes and live-chicken transportation crates naturally contaminated with food and fecal residues were investigated. Aerobic counts of 5.0 to 6.0 log CFU/24 cm² and an Enterobacteriaceae spp. level of 2.0 log CFU/24 cm² were found on the surfaces prior to the treatment. After 1 s of treatment, the aerobic counts were significantly (P < 0.0001) reduced, and within 2 s, reductions below the detection limit (<10 CFU) were reached. Enterobacteriaceae spp. were reduced to a level below the detection limit with only 1 s of treatment. Two seconds of steam-ultrasound treatment was also applied to two different types of plastic modular conveyor belts with hinge pins and one type of flat flexible rubber belt, all visibly contaminated with food residues. The aerobic counts of 3.0 to 5.0 log CFU/50 cm² were significantly (P < 0.05) reduced, while Enterobacteriaceae spp. were reduced to a level below the detection limit. Industrial meat knives were contaminated with aerobic counts of 6.0 log CFU/5 cm² on the handle and 5.2 log CFU/14 cm² on the steel. The level of Enterobacteriaceae spp. contamination was approximately 2.5 log CFU on the handle and steel. Two seconds of steam-ultrasound treatment reduced the aerobic counts and Enterobacteriaceae spp. to levels below the detection limit on both the handle and the steel. This study shows that the steam-ultrasound treatment may be an effective replacement for disinfection processes and that it can be used for continuous disinfection at fast process lines. However, the treatment may not be able to replace efficient cleaning processes used to remove high
Significant achievements in the planetary geology program
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1984-01-01
Recent developments in planetology research are summarized. Important developments are summarized in topics ranging from solar system evolution, comparative planetology, and geologic processes active on other planetary bodies, to techniques and instrument development for exploration.
ERIC Educational Resources Information Center
Vanneman, Alan; Hamilton, Linda; Anderson, Janet Baldwin; Rahman, Taslima
2009-01-01
Mathematics and reading scores on the National Assessment of Educational Progress (NAEP) have increased among students attending elementary and secondary schools since the first time the assessment was administered. These score increases have been observed both for Black and White students; statistically significant score differences between the…
Yücel, Meryem A.; Selb, Juliette; Aasted, Christopher M.; Petkov, Mike P.; Becerra, Lino; Borsook, David; Boas, David A.
2015-01-01
Abstract. Autonomic nervous system response is known to be highly task-dependent. The sensitivity of near-infrared spectroscopy (NIRS) measurements to superficial layers, particularly to the scalp, makes it highly susceptible to systemic physiological changes. Thus, one critical step in NIRS data processing is to remove the contribution of superficial layers to the NIRS signal and to obtain the actual brain response. This can be achieved using short separation channels that are sensitive only to the hemodynamics in the scalp. We investigated the contribution of hemodynamic fluctuations due to autonomic nervous system activation during various tasks. Our results provide clear demonstrations of the critical role of using short separation channels in NIRS measurements to disentangle differing autonomic responses from the brain activation signal of interest. PMID:26835480
Abe, T; Tsuiki, T; Murai, K; Sasamori, S
1990-12-01
A statistical study was made of 41 cases of denture foreign bodies in the air and upper food passages treated in our department during the past 21 years. (1) Males were more frequently affected; the ratio of males to females was about 2 to 1. (2) Of the 41 dentures, 2, 2 and 37 were lodged in the air passages, hypopharynx and esophagus, respectively. (3) There were 5 complete mandibular dentures among the 41 cases. (4) The causes of the denture foreign bodies were attributed to problems with the denture itself in 29 cases, with the patient himself in 2 cases, and with both in 10 cases. (5) Of the 39 problematic dentures, 16 showed breakage such as plate fracture or clasp deformity, but the other 23 showed no breakage; in this latter group, poor retention of the denture was ascribed to faulty fabrication or design. (6) Of the 12 patients with impaired physical function, 5 had suffered from cerebrovascular disease and 3 from geriatric dementia. (7) Denture foreign bodies in aged patients with physical hypofunction have tended to increase in recent years. (8) Of the 39 dentures for which removal by esophagoscopy was attempted, 18 were removed with difficulty; these were detachable partial dentures with one artificial tooth and two-arm clasps lodged at the first and/or second isthmus of the esophagus. Although one denture was removed successfully only on the third attempt, no case required external esophagotomy. (9) Duplicated denture models were made in 20 cases prior to the procedure, and we confirm that these models play an important role in the safer removal of denture foreign bodies.
Explaining Math Achievement: Personality, Motivation, and Trust
ERIC Educational Resources Information Center
Kilic-Bebek, Ebru
2009-01-01
This study investigated the statistical significance of student trust next to the well-tested constructs of personality and motivation to determine whether trust is a significant predictor of course achievement in college math courses. Participants were 175 students who were taking undergraduate math courses in an urban public university. The…
ERIC Educational Resources Information Center
Nevill, Dorothy D.
1975-01-01
Three techniques are outlined for use by higher education institutions to achieve salary equity: salary prediction (using various statistical procedures), counterparting (comparing salaries of persons of similar rank), and grievance procedures. (JT)
NASA Technical Reports Server (NTRS)
Wilson, Robert M.
2001-01-01
Since 1750, the number of cataclysmic volcanic eruptions (volcanic explosivity index (VEI)>=4) per decade spans 2-11, with 96 percent located in the tropics and extra-tropical Northern Hemisphere. A two-point moving average of the volcanic time series has higher values since the 1860's than before, being 8.00 in the 1910's (the highest value) and 6.50 in the 1980's, the highest since the 1910's peak. Because of the usual behavior of the first difference of the two-point moving averages, one infers that its value for the 1990's will measure approximately 6.50 +/- 1, implying that approximately 7 +/- 4 cataclysmic volcanic eruptions should be expected during the present decade (2000-2009). Because cataclysmic volcanic eruptions (especially those having VEI>=5) nearly always have been associated with short-term episodes of global cooling, the occurrence of even one might confuse our ability to assess the effects of global warming. Poisson probability distributions reveal that the probability of one or more events with a VEI>=4 within the next ten years is >99 percent. It is approximately 49 percent for an event with a VEI>=5, and 18 percent for an event with a VEI>=6. Hence, the likelihood that a climatically significant volcanic eruption will occur within the next ten years appears reasonably high.
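The Poisson reasoning above is easy to reproduce: with an expected rate of lambda events per decade, the probability of at least one event in the next decade is 1 - exp(-lambda). Using the abstract's expectation of roughly 7 VEI>=4 eruptions per decade:

```python
# Probability of at least one Poisson event in one decade, given the
# expected number of events per decade.
import math

def prob_at_least_one(rate_per_decade):
    return 1.0 - math.exp(-rate_per_decade)

print(prob_at_least_one(7.0))  # > 0.99, matching the VEI>=4 figure above
```

By the same formula, the quoted ~49% and ~18% probabilities for VEI>=5 and VEI>=6 events correspond to implied rates of roughly 0.67 and 0.20 such eruptions per decade.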
Areepattamannil, Shaljan
2014-01-01
This study examined the relationships between academic motivation-intrinsic motivation, extrinsic motivation, amotivation-and mathematics achievement among 363 Indian adolescents in India and 355 Indian immigrant adolescents in Canada. Results of hierarchical multiple regression analyses showed that intrinsic motivation, extrinsic motivation, and amotivation were not statistically significantly related to mathematics achievement among Indian adolescents in India. In contrast, both intrinsic motivation and extrinsic motivation were statistically significantly related to mathematics achievement among Indian immigrant adolescents in Canada. While intrinsic motivation was a statistically significant positive predictor of mathematics achievement among Indian immigrant adolescents in Canada, extrinsic motivation was a statistically significant negative predictor of mathematics achievement among Indian immigrant adolescents in Canada. Amotivation was not statistically significantly related to mathematics achievement among Indian immigrant adolescents in Canada. Implications of the findings for pedagogy and practice are discussed.
Williams, Scott G.; Buyyounouski, Mark K.; Pickles, Tom; Kestin, Larry; Martinez, Alvaro; Hanlon, Alexandra L.; Duchesne, Gillian M.
2008-03-15
Purpose: To define and incorporate the impact of the percentage of positive biopsy cores (PPC) into a predictive model of prostate cancer radiotherapy biochemical outcome. Methods and Materials: The data of 3264 men with clinically localized prostate cancer treated with external beam radiotherapy at four institutions were retrospectively analyzed. Standard prognostic and treatment factors plus the number of biopsy cores collected and the number positive for malignancy by transrectal ultrasound-guided biopsy were available. The primary endpoint was biochemical failure (bF, Phoenix definition). Multivariate proportional hazards analyses were performed and expressed as a nomogram and the model's predictive ability assessed using the concordance index (c-index). Results: The cohort consisted of 21% low-, 51% intermediate-, and 28% high-risk cancer patients, and 30% had androgen deprivation with radiotherapy. The median PPC was 50% (interquartile range [IQR] 29-67%), and median follow-up was 51 months (IQR 29-71 months). Percentage of positive biopsy cores displayed an independent association with the risk of bF (p = 0.01), as did age, prostate-specific antigen value, Gleason score, clinical stage, androgen deprivation duration, and radiotherapy dose (p < 0.001 for all). Including PPC increased the c-index from 0.72 to 0.73 in the overall model. The influence of PPC varied significantly with radiotherapy dose and clinical stage (p = 0.02 for both interactions), with doses <66 Gy and palpable tumors showing the strongest relationship between PPC and bF. Intermediate-risk patients were poorly discriminated regardless of PPC inclusion (c-index 0.65 for both models). Conclusions: Outcome models incorporating PPC show only minor additional ability to predict biochemical failure beyond those containing standard prognostic factors.
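The concordance index used above to compare the models can be computed directly; the following is a minimal Harrell's c sketch on invented data, not the authors' nomogram:

```python
# Harrell's concordance index (c-index) for a risk score against
# right-censored failure times; the four patients below are invented.
import itertools

def c_index(risk, time, event):
    """Fraction of usable pairs ordered correctly (ties count half)."""
    concordant = ties = usable = 0
    for i, j in itertools.combinations(range(len(risk)), 2):
        if time[i] == time[j]:
            continue  # tied times: skipped in this minimal version
        first = i if time[i] < time[j] else j  # subject with earlier time
        if not event[first]:
            continue  # earlier subject censored: pair not usable
        other = j if first == i else i
        usable += 1
        if risk[first] > risk[other]:
            concordant += 1
        elif risk[first] == risk[other]:
            ties += 1
    return (concordant + 0.5 * ties) / usable

risk  = [2.0, 1.5, 0.7, 0.3]   # higher score = predicted earlier failure
time  = [12, 20, 30, 44]       # months to biochemical failure or censoring
event = [1, 1, 0, 0]           # 1 = failure observed, 0 = censored
print(c_index(risk, time, event))
```

A c-index of 0.5 is chance-level discrimination, which puts the reported gain from 0.72 to 0.73 in context as a modest improvement.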
Science Achievement of Secondary Agricultural Education Students
ERIC Educational Resources Information Center
Clark, Sara Vicky
2012-01-01
The purposes of this quantitative descriptive and correlational study were to describe the science achievements of secondary agricultural education students and determine if the number of agricultural education courses passed, FFA involvement, and SAE participation would statistically significantly improve students' performance on science…
School Expenditures and Student Achievement: Evidence for the United States
ERIC Educational Resources Information Center
Ram, Rati
2004-01-01
Using state-level panel data, this study estimates a simple achievement function in the fixed-effects format to explore further the nexus between school expenditure and student achievement in the United States. Five main points are noted. First, the effect of per-pupil expenditure is positive and carries high statistical significance in some…
NASA Astrophysics Data System (ADS)
Khan, Shahjahan
Often scientific information on various data generating processes is presented in the form of numerical and categorical data. Except on some very rare occasions, such data generally represent a small part of the population, or selected outcomes of a data generating process. Although valuable and useful information is lurking in the array of scientific data, it is generally unavailable to users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mess of raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques to make valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes estimation of population parameters as well as performing tests of hypotheses on the parameters. However, prediction of future responses and determining the prediction distributions are also part of statistical inference. Both Classical (or Frequentist) and Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. Non-parametric and robust methods are also used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only on an ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model
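The contrast between the two approaches can be made concrete for a binomial proportion; the counts below are invented, and the Wald interval and uniform prior are just the simplest textbook choices:

```python
# Classical vs Bayesian inference for a proportion: a 95% Wald confidence
# interval from the sample alone, vs the posterior mean under a uniform
# Beta(1, 1) prior. The data (27 successes in 50 trials) are invented.
import math

successes, n = 27, 50
p_hat = successes / n

# Classical (Frequentist): interval based only on the sample data
se = math.sqrt(p_hat * (1 - p_hat) / n)
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: Beta(1, 1) prior + data -> Beta(28, 24) posterior
posterior_mean = (successes + 1) / (n + 2)

print(ci, posterior_mean)
```

With a flat prior the two answers nearly coincide; an informative prior would pull the posterior mean away from the sample proportion, which is the practical difference the chapter describes.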
Grossling, Bernardo F.
1975-01-01
Exploratory drilling is still in incipient or youthful stages in those areas of the world where the bulk of the potential petroleum resources is yet to be discovered. Methods of assessing resources from projections based on historical production and reserve data are limited to mature areas. For most of the world's petroleum-prospective areas, a more speculative situation calls for a critical review of resource-assessment methodology. The language of mathematical statistics is required to define more rigorously the appraisal of petroleum resources. Basically, two approaches have been used to appraise the amounts of undiscovered mineral resources in a geologic province: (1) projection models, which use statistical data on the past outcome of exploration and development in the province; and (2) estimation models of the overall resources of the province, which use certain known parameters of the province together with the outcome of exploration and development in analogous provinces. These two approaches often lead to widely different estimates. Some of the controversy that arises results from a confusion of the probabilistic significance of the quantities yielded by each of the two approaches. Also, the inherent limitations of analytic projection models, such as those using the logistic and Gompertz functions, have often been ignored. The resource-assessment problem should be recast in terms that provide for consideration of the probability of existence of the resource and of the probability of discovery of a deposit. Then the two above-mentioned models occupy the two ends of the probability range. The new approach accounts for (1) what can be expected with reasonably high certainty by mere projections of what has been accomplished in the past; (2) the inherent biases of decision-makers and resource estimators; (3) upper bounds that can be set up as goals for exploration; and (4) the uncertainties in geologic conditions in a search for minerals. Actual outcomes can then
Teaching Statistics without Sadistics.
ERIC Educational Resources Information Center
Forte, James A.
1995-01-01
Five steps designed to take anxiety out of statistics for social work students are outlined. First, statistics anxiety is identified as an educational problem. Second, instructional objectives and procedures to achieve them are presented and methods and tools for evaluating the course are explored. Strategies for, and obstacles to, making…
Hunt, N C; Ghosh, K M; Blain, A P; Rushton, S P; Longstaff, L M; Deehan, D J
2015-05-01
The aim of this study was to compare the maximum laxity conferred by the cruciate-retaining (CR) and posterior-stabilised (PS) Triathlon single-radius total knee arthroplasty (TKA) for anterior drawer, varus-valgus opening and rotation in eight cadaver knees through a defined arc of flexion (0º to 110º). The null hypothesis was that the limits of laxity of CR- and PS-TKAs are not significantly different. The investigation was undertaken in eight loaded cadaver knees undergoing subjective stress testing using a measurement rig. First, the native knee was tested prior to preparation for CR-TKA and subsequently for PS-TKA implantation. Surgical navigation was used to track maximal displacements/rotations at 0º, 30º, 60º, 90º and 110° of flexion. Mixed-effects modelling was used to define the behaviour of the TKAs. The laxity measured for the CR- and PS-TKAs revealed no statistically significant differences over the studied flexion arc for the two versions of TKA. Compared with the native knee, both TKAs exhibited slightly increased anterior drawer and decreased varus-valgus and internal-external rotational laxities. We believe further study is required to define the clinical states for which the additional constraint offered by a PS-TKA implant may be beneficial.
NASA Astrophysics Data System (ADS)
Temme, F. P.
1992-12-01
Realisation of the invariance properties of the p ⩽ 2 number partitional inventory components of the 20-fold spin algebra associated with [A]20 nuclear spin clusters under SU2 × L20 allows the mappings {[λ] → Γ} to be derived. In addition, recent general inner tensor product expressions under Ln, for n even (odd), also facilitate the evaluation of many higher [λ] (L20; p = 3) correlative mappings onto SU3 ↓ SO(3) × L20 ↓ TA5 subduced symmetry from SU2 duality, thus providing results that determine the nature of adapted NMR bases for both dodecahedrane and its d20 analogue. The significance of this work lies in the pertinence of nuclear spin statistics to both selective MQ-NMR and to other spectroscopic aspects of cage clusters, e.g., [13C]n, n = 20, 60, fullerenes. Mappings onto Ln irrep sets of specific p ⩽ 3 number partitions arise in the combinatorial treatment of {Mi ti} Rota fields, defining scalar invariants in the context of Cayley algebra. Inclusion of the Ln group in the specific Racah chain for NMR symmetry gives rise to significant further physical insight.
ERIC Educational Resources Information Center
Hemphill, F. Cadelle; Vanneman, Alan
2011-01-01
This report provides detailed information on the size of the achievement gaps between Hispanic and White public school students at the national and state levels and describes how those achievement gaps have changed over time. Additional information about race/ethnicity in NAEP is given in Appendix A. Most of the data in this report is derived from…
Shi, Runhua; McLarty, Jerry W
2009-10-01
In this article, we introduced basic concepts of statistics, types of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as Poisson distributions for rare events, exponential distributions, F distributions, and logistic distributions. More information can be found in many statistics books and publications.
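A minimal Python sketch of the descriptive statistics the article introduces, using made-up sample values, might look like this:

```python
import statistics as st

# Descriptive statistics for a small sample (the data are made up)
data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4]

mean = st.mean(data)                 # central tendency
median = st.median(data)
sd = st.stdev(data)                  # sample standard deviation (n - 1 denominator)
data_range = max(data) - min(data)   # simplest measure of spread
print(mean, median, sd, data_range)
```

These are the sample-based counterparts of the population quantities an inferential analysis would estimate.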
Miyauchi, T; Hagimoto, H; Saito, T; Endo, K; Ishii, M; Yamaguchi, T; Kajiwara, A; Matsushita, M
1989-01-01
EEG power amplitude and power ratio data obtained from 15 (3 men and 12 women) patients with Alzheimer's disease (AD) and 8 (2 men and 6 women) with senile dementia of Alzheimer type (SDAT) were compared with similar data from 40 age- and sex-matched normal controls. Compared with the healthy controls, both patient groups demonstrated increased EEG background slowing, which was more pronounced in AD than in SDAT. Moreover, each group showed characteristic findings on EEG topography and t-statistic significance probability mapping (SPM). The differences between AD patients and their controls indicated marked slowing with reductions in alpha 2, beta 1 and beta 2 activity. The SPMs of power ratio in the theta and alpha 2 bands showed the most prominent significance in the right posterior-temporal region, while those in the delta and beta bands did so in the frontal region. Severe AD indicated only frontal delta slowing compared to mild AD. The differences between SDAT patients and their controls indicated only mild slowing in the delta and theta bands. The SPM of power amplitude showed occipital slowing, whereas the SPM of power ratio showed slowing in the frontal region. Judging from both topographic findings, these were considered to denote a diffuse slowing tendency. In summary, these results suggest that in AD, cortical damage, reflected in EEG slowing with reductions in the alpha 2 and beta bands, develops rapidly and is followed by subcortical (non-specific thalamic) changes with frontal delta activity on SPM. On the other hand, in SDAT, diffuse cortico-subcortical damage with diffuse slowing on EEG topography develops gradually.
ERIC Educational Resources Information Center
Callamaras, Peter
1983-01-01
This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.
Significance Analysis of Prognostic Signatures
Beck, Andrew H.; Knoblauch, Nicholas W.; Hefti, Marco M.; Kaplan, Jennifer; Schnitt, Stuart J.; Culhane, Aedin C.; Schroeder, Markus S.; Risch, Thomas; Quackenbush, John; Haibe-Kains, Benjamin
2013-01-01
A major goal in translational cancer research is to identify biological signatures driving cancer progression and metastasis. A common technique applied in genomics research is to cluster patients using gene expression data from a candidate prognostic gene set, and if the resulting clusters show statistically significant outcome stratification, to associate the gene set with prognosis, suggesting its biological and clinical importance. Recent work has questioned the validity of this approach by showing in several breast cancer data sets that “random” gene sets tend to cluster patients into prognostically variable subgroups. This work suggests that new rigorous statistical methods are needed to identify biologically informative prognostic gene sets. To address this problem, we developed Significance Analysis of Prognostic Signatures (SAPS) which integrates standard prognostic tests with a new prognostic significance test based on stratifying patients into prognostic subtypes with random gene sets. SAPS ensures that a significant gene set is not only able to stratify patients into prognostically variable groups, but is also enriched for genes showing strong univariate associations with patient prognosis, and performs significantly better than random gene sets. We use SAPS to perform a large meta-analysis (the largest completed to date) of prognostic pathways in breast and ovarian cancer and their molecular subtypes. Our analyses show that only a small subset of the gene sets found statistically significant using standard measures achieve significance by SAPS. We identify new prognostic signatures in breast and ovarian cancer and their corresponding molecular subtypes, and we show that prognostic signatures in ER negative breast cancer are more similar to prognostic signatures in ovarian cancer than to prognostic signatures in ER positive breast cancer. SAPS is a powerful new method for deriving robust prognostic biological signatures from clinically annotated
LED champing: statistically blessed?
Wang, Zhuo
2015-06-10
LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even with widely distributed LEDs to begin with. From a statistical point of view, the distributions of the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control for mass production.
Predicting achievement of first semester university science students
NASA Astrophysics Data System (ADS)
Gibbs, Al
1991-12-01
This paper reports on 11 measures used as predictors of students' achievement in their first semester subjects. The students were enrolled in the same four core subjects of a university general science course. Although a number of statistically significant correlations were found, only one predictor variable, HSC aggregate mark, correlated significantly with each of the achievement variables. One predictor variable entered four of the achievement regression equations, while two variables entered the fifth, accounting for 34 to 54% of the variance.
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Inverse statistics and information content
NASA Astrophysics Data System (ADS)
Ebadi, H.; Bolgorian, Meysam; Jafari, G. R.
2010-12-01
Inverse statistics analysis studies the distribution of investment horizons needed to achieve a predefined level of return. The maximum of this distribution determines the most likely horizon for gaining a specific return. There exists a significant difference between the inverse statistics of financial market data and those of a fractional Brownian motion (fBm) as an uncorrelated time series, which provides a suitable criterion for measuring the information content in financial data. In this paper we perform this analysis for the DJIA and S&P500 as two developed markets and the Tehran price index (TEPIX) as an emerging market. We also compare these probability distributions with the fBm probability distribution to detect whether the behavior of the stocks is the same as that of an fBm.
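The first-passage idea behind inverse statistics can be sketched in a few lines of Python. The code below runs on synthetic geometric-Brownian-motion prices as a stand-in for an index; the return level and all parameter values are illustrative assumptions, not figures from the paper:

```python
import numpy as np

def inverse_statistics(prices, rho=0.05):
    """For each starting day, record the first horizon (in days) at which
    the log-return reaches the target level rho (first passage time)."""
    log_p = np.log(prices)
    horizons = []
    for i in range(len(log_p) - 1):
        future = log_p[i + 1:] - log_p[i]    # returns over all horizons from day i
        hit = np.nonzero(future >= rho)[0]
        if hit.size:                          # some starts never reach rho
            horizons.append(hit[0] + 1)
    return np.array(horizons)

# Synthetic prices: geometric Brownian motion with a small positive drift
rng = np.random.default_rng(0)
prices = np.exp(np.cumsum(rng.normal(0.0005, 0.01, 5000)))

tau = inverse_statistics(prices, rho=0.05)
print(tau.mean(), np.bincount(tau).argmax())  # mean horizon and most likely horizon
```

The histogram of `tau` is the distribution the paper studies; its mode is the "optimal investment horizon" for the chosen return level.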
ERIC Educational Resources Information Center
Chatterji, Madhabi
2006-01-01
This study estimated reading achievement gaps in different ethnic, gender, and socioeconomic groups of 1st graders in the U.S. compared with specific reference groups and identified statistically significant correlates and moderators of early reading achievement. A subset of 2,296 students nested in 184 schools from the Early Childhood…
Student academic achievement in college chemistry
NASA Astrophysics Data System (ADS)
Tabibzadeh, Kiana S.
General Chemistry is required for a variety of baccalaureate degrees, including all medical-related fields, engineering, and science majors. Depending on the institution, the prerequisite requirement for college-level General Chemistry varies. The success rate for this course is low. The purpose of this study is to examine the factors influencing student academic achievement and retention in General Chemistry at the college level. In this study, student achievement is defined by those students who earned grades of "C" or better. The dissertation contains in-depth studies on the influence of Intermediate Algebra as a prerequisite, compared to Fundamental Chemistry, on student academic achievement and student retention in college General Chemistry. In addition, the study examined the extent and manner in which student self-efficacy influences student academic achievement in college-level General Chemistry. The sample for this part of the study is 144 students enrolled in first-semester college-level General Chemistry. Student surveys determined student self-efficacy level. The statistical analyses of the study demonstrated that Fundamental Chemistry is a better prerequisite for student academic achievement and student retention. The study also found that student self-efficacy has no influence on student academic achievement. The significance of this study will be to provide data for the purpose of establishing a uniform and most suitable prerequisite for college-level General Chemistry. Finally, the variables identified to influence student academic achievement and enhance student retention will support educators' mission to maximize students' ability to complete their educational goals at institutions of higher education.
NASA Technical Reports Server (NTRS)
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Teaching Statistics in Integration with Psychology
ERIC Educational Resources Information Center
Wiberg, Marie
2009-01-01
The aim was to revise a statistics course in order to get the students motivated to learn statistics and to integrate statistics more throughout a psychology course. Further, we wish to make students become more interested in statistics and to help them see the importance of using statistics in psychology research. To achieve this goal, several…
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares is presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
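The matrix-notation treatment of linear least squares described above can be illustrated with a short Python sketch. The data are simulated, and the variance-covariance computation follows the standard textbook formulas rather than the article's exact presentation:

```python
import numpy as np

# Linear least squares in matrix form: beta = (X^T X)^{-1} X^T y,
# with variance-covariance matrix s^2 (X^T X)^{-1}.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 25)
y = 2.0 + 0.5 * x + rng.normal(0, 0.2, x.size)   # y = a + b x + noise

X = np.column_stack([np.ones_like(x), x])        # design matrix [1, x]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # fitted [a, b]

resid = y - X @ beta
s2 = resid @ resid / (len(y) - X.shape[1])       # residual variance
cov = s2 * np.linalg.inv(X.T @ X)                # variance-covariance matrix
se = np.sqrt(np.diag(cov))                       # parameter standard errors
print(beta, se)
```

The same variance-covariance matrix is what the article uses for propagating error and for experiment design in the calorimetry example.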
... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...
ERIC Educational Resources Information Center
Huberty, Carl J.
An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…
ERIC Educational Resources Information Center
Greer, Wil
2013-01-01
This study identified the variables associated with data-driven instruction (DDI) that are perceived to best predict student achievement. Of the DDI variables discussed in the literature, 51 had a sufficient research base to warrant statistical analysis. Of these, 26 were statistically significant. Multiple regression and an…
The faulty statistics of complementary alternative medicine (CAM).
Pandolfi, Maurizio; Carreras, Giulia
2014-09-01
The authors illustrate the difficulties involved in obtaining a valid statistical significance in clinical studies, especially when the prior probability of the hypothesis under scrutiny is low. Since the prior probability of a research hypothesis is directly related to its scientific plausibility, the commonly used frequentist statistics, which does not take this probability into account, is particularly unsuitable for studies exploring matters in varying degrees disconnected from science, such as complementary alternative medicine (CAM) interventions. Any statistical significance obtained in this field should be considered with great caution and may be better applied to more plausible hypotheses (like the placebo effect) than the one examined, which is usually the specific efficacy of the intervention. Since achieving meaningful statistical significance is an essential step in the validation of medical interventions, CAM practices, producing only outcomes inherently resistant to statistical validation, appear not to belong to modern evidence-based medicine.
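The authors' point about prior probability can be made concrete with Bayes' theorem: the probability that a "significant" result reflects a true effect depends strongly on the prior plausibility of the hypothesis. A minimal Python sketch, with illustrative values for power and alpha that are assumptions rather than figures from the article:

```python
# Posterior probability that a statistically significant finding reflects
# a true effect, given the prior plausibility of the hypothesis.
def posterior_true(prior, power=0.8, alpha=0.05):
    # Bayes' theorem: P(H | significant)
    return (power * prior) / (power * prior + alpha * (1 - prior))

# Plausible hypothesis vs. implausible (CAM-like) hypothesis
for prior in (0.5, 0.1, 0.01):
    print(f"prior={prior:.2f}  P(true | p<0.05)={posterior_true(prior):.2f}")
```

With a prior of 0.5 a significant result is quite convincing, but with a prior of 0.01 most "significant" findings are false positives, which is the core of the authors' argument.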
Defending the Logic of Significance Testing: A Response to Gorard
ERIC Educational Resources Information Center
Neale, Dave
2015-01-01
Recently, Stephen Gorard has outlined strong objections to the use of significance testing in social research. He has argued, first, that as the samples used in social research are almost always non-random it is not possible to use inferential statistical techniques and, second, that even if a truly random sample were achieved, the logic behind…
Students' Achievement in Human Circulatory System Unit: The Effect of Reasoning Ability and Gender.
ERIC Educational Resources Information Center
Sungur, Semra; Tekkaya, Ceren
2003-01-01
Investigates the effect of gender and reasoning ability on the human circulatory system concepts achievement and attitude toward biology. Reports a statistically significant mean difference between concrete and formal students with regard to achievement and attitude toward biology. (Contains 24 references.) (Author/YDS)
ERIC Educational Resources Information Center
Edge, D. Michael
2011-01-01
This non-experimental study attempted to determine how the different prescribed mathematics tracks offered at a comprehensive technical high school influenced the mathematics performance of low-achieving students on standardized assessments of mathematics achievement. The goal was to provide an analysis of any statistically significant differences…
Liem, Arief Darmanegara; Nie, Youyan
2008-10-01
This study examined how values related to achievement goals and to individual-oriented and social-oriented achievement motivations among secondary school students in China (N = 355) and Indonesia (N = 356). Statistical comparisons showed that the Chinese students endorsed self-direction and hedonism values, individual-oriented achievement motivation, and mastery-approach goals more strongly than the Indonesian students. Conversely, the Indonesian students endorsed security, conformity, tradition, universalism and achievement values, social-oriented achievement motivation, and performance-approach and mastery-avoidance goals more strongly than their Chinese counterparts. Values explained a significant amount of the variance in almost all of the dimensions of motivation. Etic and emic relationships between values and achievement motivations were found.
ERIC Educational Resources Information Center
Chicot, Katie; Holmes, Hilary
2012-01-01
The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…
Significant achievements in the planetary geology program, 1981
NASA Technical Reports Server (NTRS)
Mouginis-Mark, P. J.
1982-01-01
Recent developments in planetology research are summarized. Important developments are summarized in topics ranging from solar system evolution, comparative planetology, and geologic processes, to techniques and instrument development for future exploration.
Significant achievements in the planetary geology program, 1980
NASA Technical Reports Server (NTRS)
Holt, H. E. (Editor)
1980-01-01
Recent developments in planetology research as reported at the 1980 NASA Planetology Program Principal Investigators meeting are summarized. Important developments are summarized in topics ranging from solar system evolution and comparative planetology to geologic processes active on other planetary bodies.
Significant achievements in the planetary geology program, 1981
NASA Technical Reports Server (NTRS)
Holt, H. E. (Editor)
1981-01-01
Recent developments in planetology research as reported at the 1981 NASA Planetary Geology Principal Investigators meeting are summarized. The evolution of the solar system, comparative planetology, and geologic processes active on other planets are considered. Galilean satellites and small bodies, Venus, geochemistry and regoliths, volcanic and aeolian processes and landforms, fluvial and periglacial processes, and planetary impact cratering, remote sensing, and cartography are discussed.
Significant achievements in the planetary program, 1976 - 1977
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1978-01-01
Recent developments in planetology research as reported at the 1977 NASA Planetology Program Principal Investigators meeting are summarized. Important developments are summarized in topics ranging from solar system evolution, comparative planetology, and geologic processes, to techniques and instrument development for future exploration.
Rendón-Macías, Mario Enrique; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe
2016-01-01
Descriptive statistics is the branch of statistics that gives recommendations on how to summarize research data clearly and simply in tables, figures, charts, or graphs. Before performing a descriptive analysis, it is paramount to state the goal or goals of the study and to identify the measurement scales of the different variables recorded in the study. Tables and charts aim to provide timely information on the results of an investigation. Graphs show trends and can be histograms, pie charts, "box and whiskers" plots, line graphs, or scatter plots. Images serve as examples to reinforce concepts or facts. The choice of a chart, graph, or image must be based on the study objectives. Usually it is not recommended to use more than seven of them in an article, depending on its length.
Order Statistics and Nonparametric Statistics.
2014-09-26
Topics investigated include the following: Probability that a fuze will fire; moving order statistics; distribution theory and properties of the…problem posed by an Army Scientist: A fuze will fire when at least n-1 (or n-2) of n detonators function within time span t. What is the probability of
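The fuze question reduces to a binomial tail probability: the chance that at least k of n independent detonators function. A minimal Python sketch, assuming each detonator functions within the time span with probability p (the numeric values are illustrative, not from the report):

```python
from math import comb

def p_at_least(n, k, p):
    """Probability that at least k of n independent detonators function,
    each with probability p (binomial upper tail)."""
    return sum(comb(n, j) * p**j * (1 - p) ** (n - j) for j in range(k, n + 1))

# Fuze fires when at least n-1 of n detonators function within time span t
n, p = 5, 0.9
print(p_at_least(n, n - 1, p))   # P(at least 4 of 5 function) = 0.91854
```

Relaxing the requirement to n-2 of n raises the firing probability further, which is the trade-off the report's problem explores.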
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pestana, Dinis
2013-01-01
Statistics is a privileged tool in building knowledge from information, since its purpose is to extract, from the limited information in a sample, conclusions about the whole population. The pervasive use of statistical software (which always provides an answer, whether the question is adequate or not), and the abuse of statistics to confer a scientific flavour on so much bad science, have had a pernicious effect, fostering some disbelief in statistical research. Were Lord Rutherford alive today, it is almost certain that he would not condemn the use of statistics in research, as he did at the dawn of the 20th century. But he would indeed urge everyone to use statistics quantum satis, since using bad data, too many data, and statistics to enquire into irrelevant questions is a source of bad science, namely because with too many data we can establish the statistical significance of irrelevant results. This is an important point that addicts of evidence-based medicine should be aware of, since the meta-analysis of too many data will inevitably establish senseless results.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
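The diagnostic-test metrics the review discusses can be computed directly from a 2×2 table; the counts below are hypothetical, chosen only to illustrate the definitions:

```python
# Hypothetical 2x2 table:            disease+   disease-
#            test positive   (tp) 90      (fp) 30
#            test negative   (fn) 10      (tn) 170
tp, fp, fn, tn = 90, 30, 10, 170

sensitivity = tp / (tp + fn)               # P(test+ | disease+)
specificity = tn / (tn + fp)               # P(test- | disease-)
accuracy = (tp + tn) / (tp + fp + fn + tn)
lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
print(sensitivity, specificity, accuracy, lr_pos, lr_neg)
```

A positive likelihood ratio of 6 means a positive result multiplies the pre-test odds of disease by 6, which is how these quantities are used at the bedside.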
Achievement in physical education and self-concept of Greek students in grades 5 and 6.
Kanioglou, Aggelos
2008-08-01
This study investigated the relationship between rated achievement in physical education (low, medium, high) and specific domains of self-concept on the Self-Perception Profile for Children of 303 students in Grades 5 and 6 (M age = 10.9 yr., SD = 0.7). Analysis of variance yielded statistically significant differences between achievement groups on all domains of self-concept except Behavioral Conduct. Students with lower rated achievement in physical education had lower scores on self-concept domains.
Graphic presentation of the simplest statistical tests
NASA Astrophysics Data System (ADS)
Georgiev, Tsvetan B.
This paper presents graphically the well-known tests concerning a change in the population mean and standard deviation, the comparison of population means and standard deviations, and the significance of correlation and regression coefficients. The critical bounds and criteria for variability with statistical guarantee P = 95% and P = 99% are presented as dependences on the number of data points n. The graphs further give fast visual solutions of the direct problem (estimation of the confidence interval for specified P and n), as well as of the inverse problem (estimation of the n necessary for achieving a desired statistical guarantee of the result). The aim of the work is to present the simplest statistical tests in comprehensible and convenient graphs that will always be at hand. The graphs may be useful in the investigation of time series in astronomy, geophysics, ecology, etc., as well as in education.
NASA Astrophysics Data System (ADS)
Montgomery, Jennifer Dawn
The purpose of the study was to examine the relationship between vocabulary knowledge and the reading and science achievement of fifth-grade students. Models were developed and tested using multiple linear regression (MLR) to determine whether vocabulary knowledge is a statistically significant predictor of reading and science. A model was tested for reading achievement, and a model was tested for science achievement. Other independent variables in the models included socioeconomic status, ethnicity, gender, status as an English-language learner, status as a special education student, classification as gifted/talented, history of retention, and migrant status. Archival data from fifth-grade students in a large, urban public school district were used in the analyses. Both models were found to be statistically significant (p < .001). Findings indicated that reading vocabulary was a statistically significant predictor for both reading achievement (B = .571, p < .001) and science achievement (B = .241, p < .001). The significance of vocabulary to reading achievement confirmed past research. The role of reading vocabulary in science achievement revealed a significant, if modest, relationship. In addition, findings pointed out the significance of variables such as history of retention, gender, and status as an English-language learner. Conclusions from the study, pedagogical implications, and recommendations for future research are discussed.
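The multiple-linear-regression (MLR) modelling described in this abstract can be sketched with synthetic data; the predictor names and effect sizes below are illustrative, not the study's archival data:

```python
import numpy as np

# MLR sketch: does a vocabulary score predict reading achievement after
# adding another covariate? (Synthetic data with known coefficients.)
rng = np.random.default_rng(0)
n = 500
vocab = rng.normal(size=n)    # reading vocabulary (standardized)
ses = rng.normal(size=n)      # socioeconomic status (standardized)
reading = 0.57 * vocab + 0.20 * ses + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), vocab, ses])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, reading, rcond=None)
print(beta.round(2))   # estimates near the true [0, 0.57, 0.20]
```

With n = 500 the least-squares estimates recover the generating coefficients closely, which is the sense in which a predictor like vocabulary is judged "statistically significant" once its estimate is large relative to its standard error.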
NASA Astrophysics Data System (ADS)
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better
Graded Achievement, Tested Achievement, and Validity
ERIC Educational Resources Information Center
Brookhart, Susan M.
2015-01-01
Twenty-eight studies of grades, over a century, were reviewed using the argument-based approach to validity suggested by Kane as a theoretical framework. The review draws conclusions about the meaning of graded achievement, its relation to tested achievement, and changes in the construct of graded achievement over time. "Graded…
Asperger Syndrome and Academic Achievement.
ERIC Educational Resources Information Center
Griswold, Deborah E.; Barnhill, Gena P.; Myles, Brenda Smith; Hagiwara, Taku; Simpson, Richard L.
2002-01-01
A study focused on identifying the academic characteristics of 21 children and youth who have Asperger syndrome. Students had an extraordinary range of academic achievement scores, extending from significantly above average to far below grade level. Lowest achievement scores were shown for numerical operations, listening comprehension, and written…
Whither Statistics Education Research?
ERIC Educational Resources Information Center
Watson, Jane
2016-01-01
This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…
ERIC Educational Resources Information Center
Harris, Cydnie Ellen Smith
2012-01-01
The effect of the leadership style of the secondary school principal on student achievement in select public schools in Louisiana was examined in this study. The null hypothesis was that there was no statistically significant relationship between principal leadership style and student academic achievement. The researcher submitted the LEAD-Self…
Statistics of Statisticians: Critical Mass of Statistics and Operational Research Groups
NASA Astrophysics Data System (ADS)
Kenna, Ralph; Berche, Bertrand
Using a recently developed model, inspired by mean field theory in statistical physics, and data from the UK's Research Assessment Exercise, we analyse the relationship between the qualities of statistics and operational research groups and the quantities of researchers in them. Similar to other academic disciplines, we provide evidence for a linear dependency of quality on quantity up to an upper critical mass, which is interpreted as the average maximum number of colleagues with whom a researcher can communicate meaningfully within a research group. The model also predicts a lower critical mass, which research groups should strive to achieve to avoid extinction. For statistics and operational research, the lower critical mass is estimated to be 9 ± 3. The upper critical mass, beyond which research quality does not significantly depend on group size, is 17 ± 6.
Ma, X; Xu, Jiangmin; Xu, Jiangming
2004-04-01
Using data from the Longitudinal Study of American Youth (LSAY), we aimed to determine the causal ordering between mathematics anxiety and mathematics achievement. Results of structural equation modelling showed that, across the entire junior and senior high school, prior low mathematics achievement significantly related to later high mathematics anxiety, but prior high mathematics anxiety hardly related to later low mathematics achievement. Mathematics achievement was more reliably stable from year to year than mathematics anxiety. There were statistically significant gender differences in the causal ordering between mathematics anxiety and mathematics achievement. Prior low mathematics achievement significantly related to later high mathematics anxiety for boys across the entire junior and senior high school but for girls at critical transition points only. Mathematics anxiety was more reliably stable from year to year among girls than among boys.
Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks
2015-03-16
Evaluated on LFR benchmark graphs relative to the method proposed by Perry et al. [6]. Distribution A: Approved for public release. Specifically, let (X, Y) denote any two observed triangles; then for a Bernoulli(p) graph, E(X) = E(Y) = p^3 (1). Let A be the observed adjacency matrix and consider the null hypothesis H0: the number of triangles in A is consistent with a Bernoulli graph with probability p.
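The Bernoulli(p) null model above can be sketched directly: under H0, each of the C(n,3) vertex triples closes into a triangle with probability p^3, so the expected triangle count is C(n,3)·p^3. A small simulation (parameters illustrative, not from the report):

```python
import numpy as np
from math import comb

def triangle_count(adj):
    # trace(A^3) counts each triangle 6 times (3 starting vertices x 2 directions)
    return int(np.trace(adj @ adj @ adj)) // 6

# Sample a Bernoulli(p) undirected graph and compare the observed triangle
# count with its null expectation C(n,3) * p^3.
rng = np.random.default_rng(1)
n, p = 60, 0.2
upper = np.triu(rng.random((n, n)) < p, 1)
adj = (upper | upper.T).astype(int)

observed = triangle_count(adj)
expected = comb(n, 3) * p ** 3
print(observed, round(expected, 2))
```

A community-detection test of this kind flags a subgraph as significant when its observed triangle count sits far in the tail of this null distribution.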
Intervention for Maltreating Fathers: Statistically and Clinically Significant Change
ERIC Educational Resources Information Center
Scott, Katreena L.; Lishak, Vicky
2012-01-01
Objective: Fathers are seldom the focus of efforts to address child maltreatment and little is currently known about the effectiveness of intervention for this population. To address this gap, we examined the efficacy of a community-based group treatment program for fathers who had abused or neglected their children or exposed their children to…
Worry, Intolerance of Uncertainty, and Statistics Anxiety
ERIC Educational Resources Information Center
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…
Nonlinear Statistical Modeling of Speech
NASA Astrophysics Data System (ADS)
Srinivasan, S.; Ma, T.; May, D.; Lazarou, G.; Picone, J.
2009-12-01
Contemporary approaches to speech and speaker recognition decompose the problem into four components: feature extraction, acoustic modeling, language modeling and search. Statistical signal processing is an integral part of each of these components, and Bayes Rule is used to merge these components into a single optimal choice. Acoustic models typically use hidden Markov models based on Gaussian mixture models for state output probabilities. This popular approach suffers from an inherent assumption of linearity in speech signal dynamics. Language models often employ a variety of maximum entropy techniques, but can employ many of the same statistical techniques used for acoustic models. In this paper, we focus on introducing nonlinear statistical models to the feature extraction and acoustic modeling problems as a first step towards speech and speaker recognition systems based on notions of chaos and strange attractors. Our goal in this work is to improve the generalization and robustness properties of a speech recognition system. Three nonlinear invariants are proposed for feature extraction: Lyapunov exponents, correlation fractal dimension, and correlation entropy. We demonstrate an 11% relative improvement on speech recorded under noise-free conditions, but show a comparable degradation occurs for mismatched training conditions on noisy speech. We conjecture that the degradation is due to difficulties in estimating invariants reliably from noisy data. To circumvent these problems, we introduce two dynamic models to the acoustic modeling problem: (1) a linear dynamic model (LDM) that uses a state space-like formulation to explicitly model the evolution of hidden states using an autoregressive process, and (2) a data-dependent mixture of autoregressive (MixAR) models. Results show that LDM and MixAR models can achieve comparable performance with HMM systems while using significantly fewer parameters. Currently we are developing Bayesian parameter estimation and
Relationship between affect and achievement in science and mathematics in Malaysia and Singapore
NASA Astrophysics Data System (ADS)
Thoe Ng, Khar; Fah Lay, Yoon; Areepattamannil, Shaljan; Treagust, David F.; Chandrasegaran, A. L.
2012-11-01
Background: The Trends in International Mathematics and Science Study (TIMSS) assesses the quality of the teaching and learning of science and mathematics among Grades 4 and 8 students across participating countries. Purpose: This study explored the relationship between positive affect towards science and mathematics and achievement in science and mathematics among Malaysian and Singaporean Grade 8 students. Sample: In total, 4466 Malaysian students and 4599 Singaporean students from Grade 8 who participated in TIMSS 2007 were involved in this study. Design and method: Students' achievement scores on eight items in the survey instrument that were reported in TIMSS 2007 were used as the dependent variable in the analysis. Students' scores on four items in the TIMSS 2007 survey instrument pertaining to students' affect towards science and mathematics, together with students' gender, language spoken at home and parental education, were used as the independent variables. Results: Positive affect towards science and mathematics indicated statistically significant predictive effects on achievement in the two subjects for both Malaysian and Singaporean Grade 8 students. There were statistically significant predictive effects on mathematics achievement for students' gender, language spoken at home and parental education for both Malaysian and Singaporean students, with R² = 0.18 and 0.21, respectively. However, only parental education showed statistically significant predictive effects on science achievement for both countries. For Singapore, language spoken at home also demonstrated statistically significant predictive effects on science achievement, whereas gender did not. For Malaysia, neither gender nor language spoken at home had statistically significant predictive effects on science achievement. Conclusions: It is important for educators to consider implementing self-concept enhancement intervention programmes by incorporating 'affect' components of academic
Suite versus composite statistics
Balsillie, J.H.; Tanner, W.F.
1999-01-01
Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
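The suite/composite contrast can be sketched with toy samples (values illustrative, not the paper's granulometric data): a suite measure is computed per sample and then averaged, while a composite measure pools all observations first.

```python
import statistics

# Three equal-sized samples, each a distribution in its own right.
samples = [
    [2.0, 2.1, 1.9, 2.2],
    [3.0, 3.2, 2.8, 3.1],
    [4.1, 3.9, 4.0, 4.2],
]
pooled = [x for s in samples for x in s]

# First moments: suite mean (mean of sample means) vs. composite mean.
suite_mean = statistics.mean(statistics.mean(s) for s in samples)
composite_mean = statistics.mean(pooled)

# Second moments: the composite SD absorbs between-sample spread,
# so it exceeds the suite SD (the average within-sample SD).
suite_sd = statistics.mean(statistics.pstdev(s) for s in samples)
composite_sd = statistics.pstdev(pooled)

print(round(suite_mean, 3), round(composite_mean, 3))
print(round(suite_sd, 3), round(composite_sd, 3))
```

This is the law of total variance at work: pooled variance = mean within-sample variance + variance of the sample means, which is why composite standard deviations are always at least as large as suite standard deviations.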
Dienemann, Jacqueline
2002-01-01
This article examines one outcome of leadership: productive achievement. Without achievement one is judged to not truly be a leader. Thus, the ideal leader must be a visionary, a critical thinker, an expert, a communicator, a mentor, and an achiever of organizational goals. This article explores the organizational context that supports achievement, measures of quality nursing care, fiscal accountability, leadership development, rewards and punishments, and the educational content and teaching strategies to prepare graduates to be achievers.
Statistics Anxiety among Postgraduate Students
ERIC Educational Resources Information Center
Koh, Denise; Zawi, Mohd Khairi
2014-01-01
Most postgraduate programmes, that have research components, require students to take at least one course of research statistics. Not all postgraduate programmes are science based, there are a significant number of postgraduate students who are from the social sciences that will be taking statistics courses, as they try to complete their…
Statistics and mathematics anxiety in social science students: some interesting parallels.
Zeidner, M
1991-11-01
This study illuminates some interesting parallels between statistics anxiety and mathematics anxiety in social science students. Parallel to what is confirmed for mathematics anxiety, two factors were observed to underlie statistics anxiety scores, namely, statistics test anxiety and content anxiety. The study revealed modest though significant correlations between student attributes and the two confirmed dimensions of statistics anxiety. Furthermore, parallel to the inverse correlation reported for mathematics anxiety and maths course performance, statistics anxiety correlated negatively with high school matriculation scores in maths as well as self-perceptions of maths abilities. These data lend support to the hypothesis that aversive prior experiences with mathematics, prior poor achievement in maths, and a low sense of maths self-efficacy are meaningful antecedent correlates of statistics anxiety and thus lend some credence to the "deficit" interpretation of statistics anxiety.
ERIC Educational Resources Information Center
Pike, Gary R.; Kuh, George D.; Massa-McKinley, Ryan C.
2008-01-01
This study examined the relationships among first-year students' employment, engagement, and academic achievement using data from the 2004 National Survey of Student Engagement. A statistically significant negative relationship was found between working more than 20 hours per week and grades, even after controlling for students' characteristics…
ERIC Educational Resources Information Center
Wolfgang, Jeff Drayton
2013-01-01
National educational achievement statistics show that academic underachievement is a significant problem for all students in the United States and for culturally diverse students in particular. The relationship of attachment and its interaction with traumatic stress has been proposed as an alternative explanation for the persistent…
The Influence of Ability Grouping on Math Achievement in a Rural Middle School
ERIC Educational Resources Information Center
Pritchard, Robert R.
2012-01-01
The researcher examined the academic performance of low-tracked students (n = 156) using standardized math test scores to determine whether there is a statistically significant difference in achievement depending on academic environment, tracked or nontracked. An analysis of variance (ANOVA) was calculated, using a paired samples t-test for a…
High School Teachers' Perceptions of School Change and Its Implications for Student Achievement
ERIC Educational Resources Information Center
Mitchell, Anthony J.
2010-01-01
This study attempted to determine whether there was a statistically significant relationship between perceptions of educational change and student achievement. The independent variables that represented perceptions of educational change included: community pressure for change, faculty anxiety to change, faculty openness to change, principal anxiety to…
ERIC Educational Resources Information Center
Polly, Drew; Wang, Chuang; Lambert, Richard; Martin, Christie; McGee, Jennifer Richardson; Pugalee, David; Lehew, Amy
2017-01-01
This study investigates the impacts of a year-long professional development program on Kindergarten teachers' beliefs and practices and the association of these changes with student achievement in mathematics measured by curriculum-based instruments. Although teacher content knowledge was not statistically significantly different before and after…
A SUMMARY OF STUDIES IN ACHIEVEMENT OF VOCATIONAL AGRICULTURE GRADUATES IN COLLEGE.
ERIC Educational Resources Information Center
MCCLELLAND, JOHN B.
TWENTY-SEVEN STUDIES ARE INCLUDED IN THIS SYNTHESIS OF RESEARCH ON THE APPROPRIATENESS OF HIGH SCHOOL VOCATIONAL AGRICULTURE STUDENTS GOING ON TO AGRICULTURAL COLLEGES. MOST OF THE STUDIES INVOLVED STATISTICAL SIGNIFICANCE TREATMENT. THE STUDIES ARE ORGANIZED INTO SECTIONS--(1) COMPREHENSIVE, (2) ACHIEVEMENT IN LEADERSHIP ACTIVITIES, (3)…
Highly Qualified Teacher Status and the Reading Achievement of Students with Disabilities
ERIC Educational Resources Information Center
Robinson, Helene
2011-01-01
The purpose of this study was to examine teacher qualification factors believed to affect reading achievement of students with disabilities in intensive reading classes after controlling for certain student and teacher demographics using ANCOVA. Results indicated that there was no statistically significant difference between the reading…
ERIC Educational Resources Information Center
Roberts, Timothy Gerald
Statistically significant differences were not found between the treatment and non-treatment groups in a study designed to investigate the effectiveness of the Auditory Discrimination in Depth (A.D.D.) Program. The treatment group involved thirty-nine normally achieving and educationally handicapped students who were given the A.D.D. Program…
ERIC Educational Resources Information Center
What Works Clearinghouse, 2013
2013-01-01
This study examined whether attending a Knowledge is Power Program (KIPP) middle school improved students' reading, math, social studies, and science achievement for up to 4 years following enrollment. The study reported that students attending KIPP middle schools scored statistically significantly higher than matched students on all of the state…
Comparing Science Achievement Constructs: Targeted and Achieved
ERIC Educational Resources Information Center
Ferrara, Steve; Duncan, Teresa
2011-01-01
This article illustrates how test specifications based solely on academic content standards, without attention to other cognitive skills and item response demands, can fall short of their targeted constructs. First, the authors inductively describe the science achievement construct represented by a statewide sixth-grade science proficiency test.…
ERIC Educational Resources Information Center
Anderson, Sharon; Medrich, Elliott; Fowler, Donna
2007-01-01
From the halls of Congress to the local elementary school, conversations on education reform have tossed around the term "achievement gap" as though people all know precisely what that means. As it's commonly used, "achievement gap" refers to the differences in scores on state or national achievement tests between various…
Environmental Health Practice: Statistically Based Performance Measurement
Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.
2007-01-01
Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
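The paper's significance machinery, a Fisher exact test on baseline vs. post-intervention compliance counts with a Bonferroni correction for the four performance categories, can be sketched as follows (the 2x2 counts are illustrative, not the study's data):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided p-value for the 2x2 table [[a, b], [c, d]] under the
    hypergeometric null (fixed margins)."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def table_prob(x):  # probability of a table with cell (1,1) == x
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = table_prob(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Two-sided: sum over all tables at most as probable as the observed one.
    return sum(p for p in (table_prob(x) for x in range(lo, hi + 1))
               if p <= p_obs + 1e-12)

# Hypothetical counts: compliant/noncompliant at baseline vs. post-intervention.
p = fisher_exact_two_sided(20, 5, 8, 17)
m = 4                                    # number of performance categories
p_bonferroni = min(1.0, p * m)           # simple Bonferroni adjustment
print(p < 0.05, p_bonferroni < 0.05)
```

The Bonferroni step simply inflates each p-value by the number of comparisons, so a result must clear a stricter bar (alpha/m per test) to remain significant, which is the "modified Bonferroni adjustment" logic reported above.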
Nursing student attitudes toward statistics.
Mathew, Lizy; Aktan, Nadine M
2014-04-01
Nursing is guided by evidence-based practice. To understand and apply research to practice, nurses must be knowledgeable in statistics; therefore, it is crucial to promote a positive attitude toward statistics among nursing students. The purpose of this quantitative cross-sectional study was to assess differences in attitudes toward statistics among undergraduate nursing, graduate nursing, and undergraduate non-nursing students. The Survey of Attitudes Toward Statistics Scale-36 (SATS-36) was used to measure student attitudes, with higher scores denoting more positive attitudes. The convenience sample was composed of 175 students from a public university in the northeastern United States. Statistically significant relationships were found among some of the key demographic variables. Graduate nursing students had a significantly lower score on the SATS-36, compared with baccalaureate nursing and non-nursing students. Therefore, an innovative nursing curriculum that incorporates knowledge of student attitudes and key demographic variables may result in favorable outcomes.
Statistics Poker: Reinforcing Basic Statistical Concepts
ERIC Educational Resources Information Center
Leech, Nancy L.
2008-01-01
Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…
Predict! Teaching Statistics Using Informational Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
Childhood Obesity and Cognitive Achievement.
Black, Nicole; Johnston, David W; Peeters, Anna
2015-09-01
Obese children tend to perform worse academically than normal-weight children. If poor cognitive achievement is truly a consequence of childhood obesity, this relationship has significant policy implications. Therefore, an important question is to what extent can this correlation be explained by other factors that jointly determine obesity and cognitive achievement in childhood? To answer this question, we exploit a rich longitudinal dataset of Australian children, which is linked to national assessments in math and literacy. Using a range of estimators, we find that obesity and body mass index are negatively related to cognitive achievement for boys but not girls. This effect cannot be explained by sociodemographic factors, past cognitive achievement or unobserved time-invariant characteristics and is robust to different measures of adiposity. Given the enormous importance of early human capital development for future well-being and prosperity, this negative effect for boys is concerning and warrants further investigation.
The Wide Range Achievement Test and the Peabody Individual Achievement Test: A Comparative Study.
ERIC Educational Resources Information Center
Harmer, William R.; Williams, Fern
1978-01-01
The article presents a statistical and descriptive comparison, with emphasis on math subtests, of the Wide Range Achievement Test and the Peabody Individual Achievement Test, based on scores obtained from clients (in grades 1-12) at a university-affiliated learning disabilities center. (SBH)
Neuroendocrine Tumor: Statistics
Adrenal Gland Tumors: Statistics
The Surprisingly Modest Relationship between SES and Educational Achievement
ERIC Educational Resources Information Center
Harwell, Michael; Maeda, Yukiko; Bishop, Kyoungwon; Xie, Aolin
2017-01-01
Measures of socioeconomic status (SES) are routinely used in analyses of achievement data to increase statistical power, statistically control for the effects of SES, and enhance causality arguments under the premise that the SES-achievement relationship is moderate to strong. Empirical evidence characterizing the strength of the SES-achievement…
2007-05-01
The latest version of the NHS Institute for Innovation and Improvement's 'no delays achiever', a web based tool created to help NHS organisations achieve the 18-week target for GP referrals to first treatment, is available at www.nodelaysachiever.nhs.uk.
Vicarious Achievement Orientation.
ERIC Educational Resources Information Center
Leavitt, Harold J.; And Others
This study tests hypotheses about achievement orientation, particularly vicarious achievement. Undergraduate students (N=437) completed multiple-choice questionnaires, indicating likely responses of one person to the success of another. The sex of succeeder and observer, closeness of relationship, and setting (medical school or graduate school of…
Heritability of Creative Achievement
ERIC Educational Resources Information Center
Piffer, Davide; Hur, Yoon-Mi
2014-01-01
Although creative achievement is a subject of much attention to lay people, the origin of individual differences in creative accomplishments remain poorly understood. This study examined genetic and environmental influences on creative achievement in an adult sample of 338 twins (mean age = 26.3 years; SD = 6.6 years). Twins completed the Creative…
Confronting the Achievement Gap
ERIC Educational Resources Information Center
Gardner, David
2007-01-01
This article talks about the large achievement gap between children of color and their white peers. The reasons for the achievement gap are varied. First, many urban minorities come from a background of poverty. One of the detrimental effects of growing up in poverty is receiving inadequate nourishment at a time when bodies and brains are rapidly…
ERIC Educational Resources Information Center
Fletcher, Mike; And Others
1992-01-01
This collection of seven articles examines achievement-based resourcing (ABR), the concept that the funding of educational institutions should be linked to their success in promoting student achievement, with a focus on the application of ABR to postsecondary education in the United Kingdom. The articles include: (1) "Introduction" (Mick…
States Address Achievement Gaps.
ERIC Educational Resources Information Center
Christie, Kathy
2002-01-01
Summarizes 2 state initiatives to address the achievement gap: North Carolina's report by the Advisory Commission on Raising Achievement and Closing Gaps, containing an 11-point strategy, and Kentucky's legislation putting in place 10 specific processes. The North Carolina report is available at www.dpi.state.nc.us.closingthegap; Kentucky's…
STATISTICAL ANALYSIS, REPORTS, PROBABILITY, INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION
Cognitive predictors of reading and math achievement among gifted referrals.
Rowe, Ellen W; Miller, Cristin; Ebenstein, Lauren A; Thompson, Dawna F
2012-09-01
This study investigated the predictive power of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) Full Scale IQ (FSIQ), the General Ability Index (GAI), and the WISC-IV index score composites on subsequent reading and math standardized test scores among high-achieving students. The sample consisted of 84 elementary-age students who received an individual cognitive assessment with the WISC-IV in the previous year as part of the application process for gifted and talented programming through their schools. Although there were no significant differences among the mean WISC-IV index scores, 77% of the individual students evidenced statistically significant WISC-IV index score variability. Thus, intraindividual test score variability appears to be the norm among high-achieving students. In spite of this variability, regression analyses indicated that the FSIQ predicted reading comprehension and mathematics achievement better than, or as well as, the GAI or individual scores for verbal comprehension and perceptual reasoning. None of the cognitive variables correlated significantly with achievement scores for Word Reading or Pseudoword Decoding, but the FSIQ, GAI, Verbal Comprehension, and Perceptual Reasoning scores predicted reading comprehension. Limitations and directions for future research are discussed.
Dementia Caregiver Intervention Research: In Search of Clinical Significance
Schulz, Richard; O’Brien, Alison; Czaja, Sara; Ory, Marcia; Norris, Rachel; Martire, Lynn M.; Belle, Steven H.; Burgio, Lou; Gitlin, Laura; Coon, David; Burns, Robert; Gallagher-Thompson, Dolores; Stevens, Alan
2008-01-01
Purpose: We reviewed intervention studies that reported dementia caregiver outcomes published since 1996, including psychosocial interventions for caregivers and environmental and pharmacological interventions for care recipients. Our goal was to focus on issues of clinical significance in caregiver intervention research in order to move the field toward a greater emphasis on achieving reliable and clinically meaningful outcomes. Design and Methods: MEDLINE, PsycINFO, and Cumulative Index to Nursing & Allied Health databases from 1996 through 2001 were searched to identify articles and book chapters mapping to two medical subject headings: caregivers and either dementia or Alzheimer’s disease. Articles were evaluated on two dimensions: outcomes in four domains thought to be important to the individual or society, and the magnitude of reported effects for these outcomes, in order to determine whether they were large enough to be clinically meaningful. Results: Although many studies have reported small to moderate statistically significant effects on a broad range of outcomes, only a small proportion of these studies achieved clinically meaningful outcomes. Nevertheless, caregiving intervention studies have increasingly shown promise of affecting important public health outcomes in areas such as service utilization, including delayed institutionalization; psychiatric symptomatology, including the successful treatment of major and minor depression; and providing services that are highly valued by caregivers. Implications: Assessment of clinical significance in addition to statistical significance is needed in this research area. Specific recommendations on design, measurement, and conceptual issues are made to enhance the clinical significance of future research. PMID:12351794
Achievability for telerobotic systems
NASA Astrophysics Data System (ADS)
Kress, Reid L.; Draper, John V.; Hamel, William R.
2001-02-01
Methods are needed to improve the capabilities of autonomous robots to perform tasks that are difficult for contemporary robots, and to identify those tasks that robots cannot perform. Additionally, in the realm of remote handling, methods are needed to assess which tasks and/or subtasks are candidates for automation. We are developing a new approach to understanding the capability of autonomous robotic systems. This approach uses formalized methods for determining the achievability of tasks for robots, that is, the likelihood that an autonomous robot or telerobot can successfully complete a particular task. Any autonomous system may be represented in achievability space by the volume describing that system's capabilities within the 3-axis space delineated by perception, cognition, and action. This volume may be thought of as a probability density, with achievability decreasing as the distance from the centroid of the volume increases. Similarly, any task may be represented within achievability space. However, because tasks have more narrowly defined requirements for perception, cognition, and action, each may be represented as a point (or, more accurately, as a small sphere) within achievability space. Analysis of achievability can serve to identify, a priori, the survivability of robotic systems and the likelihood of mission success; it can be used to plan a mission or portions of a mission; it can be used to modify a mission plan to accommodate unpredicted occurrences; it can also serve to identify needs for modifications to robotic systems or tasks to improve achievability.
Stevens, Joseph J; Schulte, Ann C; Elliott, Stephen N; Nese, Joseph F T; Tindal, Gerald
2015-02-01
This study estimated mathematics achievement growth trajectories in a statewide sample of 92,045 students with and without disabilities over Grades 3 to 7. Students with disabilities (SWDs) were identified in seven exceptionality categories. Students without disabilities (SWoDs) were categorized as General Education (GE) or Academically/Intellectually Gifted (AIG). Students in all groups showed significant growth that decelerated over grades as well as significant variability in achievement by student group, both at the initial assessment in Grade 3 and in rates of growth over time. Race/ethnicity, gender, parental education, free/reduced lunch status, and English language proficiency were also significant predictors of achievement. Effect size estimates showed substantial year-to-year growth that decreased over grades. Sizeable achievement gaps that were relatively stable over grades were observed between SWoDs and students in specific exceptionality categories. Our study also demonstrated the importance of statistically controlling for variation related to student demographic characteristics. Additional research is needed that expands on these results with the same and additional exceptionality groups.
Wild, M.; Rouhani, S.
1995-02-01
A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random, and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, and lead to more accurate and realistic remedial design criteria.
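The spatial-structure idea described here, that closely spaced samples resemble each other more than distant ones, is usually summarized by an empirical semivariogram. The sketch below uses an invented synthetic field and arbitrary distance bins; it illustrates the technique generically, not any specific EPA procedure or dataset.

```python
import math
import random

# Synthetic "site": random sample locations on a 100 x 100 grid, with a
# smooth invented contamination field plus measurement noise.
random.seed(1)
points = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(200)]
values = [math.sin(x / 20) + math.cos(y / 25) + random.gauss(0, 0.1)
          for x, y in points]

def semivariogram(points, values, bins, width):
    """Average half squared difference of value pairs, grouped by distance bin."""
    gamma = [0.0] * bins
    counts = [0] * bins
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            b = int(d // width)
            if b < bins:
                gamma[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [g / c if c else float("nan") for g, c in zip(gamma, counts)]

gam = semivariogram(points, values, bins=5, width=10.0)
print(gam)  # dissimilarity typically rises with distance for a spatially structured field
```

For a spatially structured field, the first bin (nearby pairs) shows markedly smaller values than the last, which is exactly the information a kriging-based sampling plan exploits.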
[Comment on] Statistical discrimination
NASA Astrophysics Data System (ADS)
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
Statistical Reference Datasets
National Institute of Standards and Technology Data Gateway
Statistical Reference Datasets (Web, free access) The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
ERIC Educational Resources Information Center
Butz, Stephen D.
2012-01-01
This research examined the education system at high-poverty schools that had significantly higher student achievement levels as compared to similar schools with lower student achievement levels. A multischool qualitative case study was conducted of the educational systems where there was a significant difference in the scores achieved on the…
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
Culture and Achievement Motivation
ERIC Educational Resources Information Center
Maehr, Martin L.
1974-01-01
A framework is suggested for the cross-cultural study of motivation that stresses the importance of contextual conditions in eliciting achievement motivation and emphasizes cultural relativity in the definition of the concept. (EH)
ERIC Educational Resources Information Center
Ragasa, Carmelita Y.
2008-01-01
The objective of the study is to determine if there is a significant difference in the effects of the treatment and control groups on achievement as well as on attitude as measured by the posttest. A class of 38 sophomore college students in the basic statistics taught with the use of computer-assisted instruction and another class of 15 students…
Achieve inventory reduction and improve customer service?
Moody, M C
2000-05-01
Is it really possible to achieve significant reductions in your manufacturing inventories while improving customer service? If you really want to achieve significant inventory reductions, focus on the root causes and develop countermeasures and a work plan to execute your countermeasures. Include measurements for recording your progress, and deploy your countermeasures until they are no longer required, or until new ones are needed.
Sustaining School Achievement in California's Elementary Schools after State Monitoring
ERIC Educational Resources Information Center
McCabe, Molly
2010-01-01
This study examined the Academic Performance Index (API) and Adequate Yearly Progress (AYP) achievement trends between 2004 and 2006 of 58 California public elementary schools after exiting state monitoring and investigated practices for sustaining consistent achievement growth. Statistical methods were used to analyze statewide achievement trends…
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Before big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.
Science literacy in high school students: A comparison of achievement in two curriculum approaches
NASA Astrophysics Data System (ADS)
McAlister, Diane C.
2009-12-01
Academic achievement as measured by the Florida Comprehensive Assessment Test (FCAT) in science for 367 students in two science curriculum options, integrated science and the traditional subject-specific courses, in one central Florida high school was compared. A multivariate analysis of covariance (MANCOVA) of science curriculum choice was conducted for three variables: total FCAT score, earth science subscore, and scientific thinking subscore. Covariates of academic ability, as defined by grade point average (GPA), and academic focus, as defined by postsecondary plans, were considered for use. Analysis of statistically significant results was completed through analysis of covariance (ANCOVA). While statistically significant results were found in favor of the traditional curriculum group, additional statistical analysis of the curriculum groups for differences in socioeconomic status (SES), gender, and instructional level led to a logistic regression to explore the ability of these variables, GPA, and total FCAT score to predict curriculum group membership. GPA, level of instruction, and FCAT score were found to be statistically significant predictors. Final conclusions to the study indicated a significant difference in scientific literacy for the two groups in favor of the traditional curriculum. However, logistic regression results indicated that due to significant differences in SES, gender, GPA, and level of instruction for the groups, the differences in academic achievement were probably due to factors other than curriculum design. Limitations of the study and suggestions for further research were presented.
(Errors in statistical tests)³.
Phillips, Carl V; MacLehose, Richard F; Kaufman, Jay S
2008-07-14
In 2004, Garcia-Berthou and Alcaraz published "Incongruence between test statistics and P values in medical papers," a critique of statistical errors that received a tremendous amount of attention. One of their observations was that the final reported digit of p-values in articles published in the journal Nature departed substantially from the uniform distribution that they suggested should be expected. In 2006, Jeng critiqued that critique, observing that the statistical analysis of those terminal digits had been based on comparing the actual distribution to a uniform continuous distribution, when digits obviously are discretely distributed. Jeng corrected the calculation and reported statistics that did not so clearly support the claim of a digit preference. However delightful it may be to read a critique of statistical errors in a critique of statistical errors, we nevertheless found several aspects of the whole exchange to be quite troubling, prompting our own meta-critique of the analysis. The previous discussion emphasized statistical significance testing. But there are various reasons to expect departure from the uniform distribution in terminal digits of p-values, so that simply rejecting the null hypothesis is not terribly informative. Much more importantly, Jeng found that the original p-value of 0.043 should have been 0.086, and suggested this represented an important difference because it was on the other side of 0.05. Among the most widely reiterated (though often ignored) tenets of modern quantitative research methods is that we should not treat statistical significance as a bright line test of whether we have observed a phenomenon. Moreover, it sends the wrong message about the role of statistics to suggest that a result should be dismissed because of limited statistical precision when it is so easy to gather more data.In response to these limitations, we gathered more data to improve the statistical precision, and analyzed the actual pattern of the
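The digit-preference analysis at the heart of this exchange can be sketched directly. As Jeng noted, the correct reference distribution for terminal digits is the discrete uniform over 0-9; the digit list below is invented purely for illustration, not drawn from the actual journal data.

```python
from collections import Counter

# Hypothetical final digits harvested from reported p-values (illustrative only).
digits = [3, 7, 0, 5, 5, 1, 9, 2, 5, 0, 4, 8, 5, 6, 5, 0, 3, 5, 7, 5]

counts = Counter(digits)
n = len(digits)
expected = n / 10  # uniform discrete distribution over digits 0-9

# Chi-square goodness-of-fit statistic against the discrete uniform,
# the comparison Jeng substituted for the original continuous-uniform test.
stat = sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

# Critical value for df = 9 at alpha = 0.05 is 16.92.
print(f"chi2 = {stat:.2f}; reject uniformity at 0.05: {stat > 16.92}")
```

As the surrounding discussion cautions, a rejection (or not) of uniformity here is a blunt instrument; the test statistic sits near the critical value for this toy sample, which is precisely the bright-line situation the authors warn against over-interpreting.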
Ranald Macdonald and statistical inference.
Smith, Philip T
2009-05-01
Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing.
Manganese exposure from drinking water and children's academic achievement.
Khan, Khalid; Wasserman, Gail A; Liu, Xinhua; Ahmed, Ershad; Parvez, Faruque; Slavkovich, Vesna; Levy, Diane; Mey, Jacob; van Geen, Alexander; Graziano, Joseph H; Factor-Litvak, Pam
2012-01-01
Drinking water manganese (WMn) is a potential threat to children's health due to its associations with a wide range of outcomes including cognitive, behavioral and neuropsychological effects. Although adverse effects of Mn on children's cognitive function indicate a possible impact on their academic achievement, little evidence on this issue is available. Moreover, little is known regarding potential interactions between exposure to Mn and other metals, especially water arsenic (WAs). In Araihazar, a rural area of Bangladesh, we conducted a cross-sectional study of 840 children to investigate associations between WMn and WAs and academic achievement in mathematics and languages among elementary school-children, aged 8-11 years. Data on As and Mn exposure were collected from the participants at the baseline of an ongoing longitudinal study of school-based educational intervention. Annual scores of the study children in languages (Bangla and English) and mathematics were obtained from the academic achievement records of the elementary schools. WMn above the WHO standard of 400 μg/L was associated with a 6.4% loss (95% CI = -12.3 to -0.5) in mathematics achievement test scores, adjusted for WAs and other sociodemographic variables. We did not find any statistically significant associations between WMn and academic achievement in either language. Neither WAs nor urinary As was significantly related to any of the three academic achievement scores. Our finding suggests that a large number of children in rural Bangladesh may experience deficits in mathematics due to high concentrations of Mn in drinking water.
SALT and Spelling Achievement.
ERIC Educational Resources Information Center
Nelson, Joan
A study investigated the effects of suggestopedic accelerative learning and teaching (SALT) on the spelling achievement, attitudes toward school, and memory skills of fourth-grade students. Subjects were 20 male and 28 female students from two self-contained classrooms at Kennedy Elementary School in Rexburg, Idaho. The control classroom and the…
ERIC Educational Resources Information Center
Ohrn, Deborah Gore, Ed.
1993-01-01
This issue of the Goldfinch highlights some of Iowa's 20th century women of achievement. These women have devoted their lives to working for human rights, education, equality, and individual rights. They come from the worlds of politics, art, music, education, sports, business, entertainment, and social work. They represent Native Americans,…
Schools Achieving Gender Equity.
ERIC Educational Resources Information Center
Revis, Emma
This guide is designed to assist teachers presenting the Schools Achieving Gender Equity (SAGE) curriculum for vocational education students, which was developed to align gender equity concepts with the Kentucky Education Reform Act (KERA). Included in the guide are lesson plans for classes on the following topics: legal issues of gender equity,…
Achieving Peace through Education.
ERIC Educational Resources Information Center
Clarken, Rodney H.
While it is generally agreed that peace is desirable, there are barriers to achieving a peaceful world. These barriers are classified into three major areas: (1) an erroneous view of human nature; (2) injustice; and (3) fear of world unity. In a discussion of these barriers, it is noted that although the consciousness and conscience of the world…
Explorations in achievement motivation
NASA Technical Reports Server (NTRS)
Helmreich, Robert L.
1982-01-01
Recent research on the nature of achievement motivation is reviewed. A three-factor model of intrinsic motives is presented and related to various criteria of performance, job satisfaction and leisure activities. The relationships between intrinsic and extrinsic motives are discussed. Needed areas for future research are described.
Increasing Male Academic Achievement
ERIC Educational Resources Information Center
Jackson, Barbara Talbert
2008-01-01
The No Child Left Behind legislation has brought greater attention to the academic performance of American youth. Its emphasis on student achievement requires a closer analysis of assessment data by school districts. To address the findings, educators must seek strategies to remedy failing results. In a mid-Atlantic district of the Unites States,…
Appraising Reading Achievement.
ERIC Educational Resources Information Center
Ediger, Marlow
To determine quality sequence in pupil progress, evaluation approaches need to be used which guide the teacher to assist learners to attain optimally. Teachers must use a variety of procedures to appraise student achievement in reading, because no one approach is adequate. Appraisal approaches might include: (1) observation and subsequent…
Cognitive Processes and Achievement.
ERIC Educational Resources Information Center
Hunt, Dennis; Randhawa, Bikkar S.
For a group of 165 fourth- and fifth-grade students, four achievement test scores were correlated with success on nine tests designed to measure three cognitive functions: sustained attention, successive processing, and simultaneous processing. This experiment was designed in accordance with Luria's model of the three functional units of the…
Graders' Mathematics Achievement
ERIC Educational Resources Information Center
Bond, John B.; Ellis, Arthur K.
2013-01-01
The purpose of this experimental study was to investigate the effects of metacognitive reflective assessment instruction on student achievement in mathematics. The study compared the performance of 141 students who practiced reflective assessment strategies with students who did not. A posttest-only control group design was employed, and results…
ERIC Educational Resources Information Center
Hartley, Tricia
2009-01-01
National learning and skills policy aims both to build economic prosperity and to achieve social justice. Participation in higher education (HE) has the potential to contribute substantially to both aims. That is why the Campaign for Learning has supported the ambition to increase the proportion of the working-age population with a Level 4…
Improving Educational Achievement.
ERIC Educational Resources Information Center
New York University Education Quarterly, 1979
1979-01-01
This is a slightly abridged version of the report of the National Academy of Education panel, convened at the request of HEW Secretary Joseph Califano and Assistant Secretary for Education Mary F. Berry, to study recent declines in student achievement and methods of educational improvement. (SJL)
ERIC Educational Resources Information Center
Rogers, Ibram
2009-01-01
When Gabrielle Carpenter became a guidance counselor in Northern Virginia nine years ago, she focused on the academic achievement gap and furiously tried to close it. At first, she was compelled by tremendous professional interest. However, after seeing her son lose his zeal for school, Carpenter joined forces with other parents to form an…
Achievement in Problem Solving
ERIC Educational Resources Information Center
Friebele, David
2010-01-01
This Action Research Project is meant to investigate the effects of incorporating research-based instructional strategies into instruction and their subsequent effect on student achievement in the area of problem-solving. The two specific strategies utilized are the integration of manipulatives and increased social interaction on a regular basis.…
Essays on Educational Achievement
ERIC Educational Resources Information Center
Ampaabeng, Samuel Kofi
2013-01-01
This dissertation examines the determinants of student outcomes--achievement, attainment, occupational choices and earnings--in three different contexts. The first two chapters focus on Ghana while the final chapter focuses on the US state of Massachusetts. In the first chapter, I exploit the incidence of famine and malnutrition that resulted to…
ERIC Educational Resources Information Center
Walberg, Herbert J.
2010-01-01
For the last half century, higher spending and many modern reforms have failed to raise the achievement of students in the United States to the levels of other economically advanced countries. A possible explanation, says Herbert Walberg, is that much current education theory is ill informed about scientific psychology, often drawing on fads and…
ERIC Educational Resources Information Center
Bracey, Gerald W.
2008-01-01
In his "Wall Street Journal" op-ed on the 25th anniversary of "A Nation At Risk," former assistant secretary of education Chester E. Finn Jr. applauded the report for turning U.S. education away from equality and toward achievement. It was not surprising, then, that in mid-2008, Finn arranged a conference to examine the…
Mathematical and statistical analysis
NASA Technical Reports Server (NTRS)
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Experiment in Elementary Statistics
ERIC Educational Resources Information Center
Fernando, P. C. B.
1976-01-01
Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)
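An exercise of the kind this record describes can be simulated in a few lines. The sketch below (parameters invented for illustration) draws Gaussian samples and checks the sample mean, sample standard deviation, and the empirical 68% rule.

```python
import random
import statistics

# Simulated lab exercise: verify properties of the Gaussian distribution
# empirically. Mean, sigma, and sample size here are arbitrary choices.
random.seed(42)
mu, sigma, n = 100.0, 15.0, 100_000
samples = [random.gauss(mu, sigma) for _ in range(n)]

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
within_1sd = sum(abs(x - mu) < sigma for x in samples) / n

print(f"sample mean = {mean:.2f}, sample sd = {sd:.2f}")
print(f"fraction within one sigma = {within_1sd:.3f}")  # theory: about 0.683
```

With a large sample, the empirical mean, standard deviation, and one-sigma coverage all land close to their theoretical values, which is the point such a laboratory exercise is designed to make tangible.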
ERIC Educational Resources Information Center
Matthews, Sharon Elizabeth
2010-01-01
This study investigated the extent to which there were statistically significant relationships between school administrators' systemic implementation of student voice work and student perceptions (i.e. achievement, motivation, attachment and school climate) and PLAN performance. Student voice was defined as students being equal partners in school…
ERIC Educational Resources Information Center
Alvarez-Nunez, Tanya Mae
2012-01-01
Scope and Method of Study: This quantitative, non-experimental study sought to determine if a statistically significant difference existed in student achievement on the PSE exam in Belizean primary schools for students who have teachers with varying levels of self-efficacy (high, medium and low). The Teacher Efficacy Scale (TES), which captures…
NASA Astrophysics Data System (ADS)
Martin, Lynn A.
The purpose of this study was to examine the relationship between teachers' self-reported preparedness for teaching science content and their instructional practices and the science achievement of eighth grade science students in the United States as demonstrated by TIMSS 2007. Six hundred eighty-seven eighth grade science teachers in the United States, representing 7,377 students, responded to the TIMSS 2007 questionnaire about their instructional preparedness and their instructional practices. Quantitative data were reported. Through correlation analysis, the researcher found that statistically significant positive relationships emerged between eighth grade science teachers' main area of study and their self-reported beliefs about their preparedness to teach that same content area. Another correlation analysis found that a statistically significant negative relationship existed between teachers' self-reported use of inquiry-based instruction and preparedness to teach chemistry, physics, and earth science. A third correlation analysis discovered that a statistically significant positive relationship existed between physics preparedness and student science achievement. Finally, a correlation analysis found that a statistically significant positive relationship existed between science teachers' self-reported implementation of inquiry-based instructional practices and student achievement. These findings support the conclusion that teachers who feel prepared to teach science content and who implement more inquiry-based instruction and less didactic instruction produce higher-achieving science students. As science teachers obtain the appropriate knowledge in science content and pedagogy, they will feel prepared and will implement inquiry-based instruction in science classrooms.
ERIC Educational Resources Information Center
Revis, Kathy G.
2010-01-01
The purpose of this study was to discover if there was a statistically significant relationship between the self-reported instructional leadership practices of North Carolina superintendents and the achievement of their students with limited English proficiency (LEP) and students with disabilities (SWDs) as measured by the percent of students who…
ERIC Educational Resources Information Center
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Teaching Statistics Using SAS.
ERIC Educational Resources Information Center
Mandeville, Garrett K.
The Statistical Analysis System (SAS) is presented as the single most appropriate statistical package to use as an aid in teaching statistics. A brief review of literature in which SAS is compared to SPSS, BMDP, and other packages is followed by six examples which demonstrate features unique to SAS which have pedagogical utility. Of particular…
Minnesota Health Statistics 1988.
ERIC Educational Resources Information Center
Minnesota State Dept. of Health, St. Paul.
This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…
Statistical Methods for Astronomy
NASA Astrophysics Data System (ADS)
Feigelson, Eric D.; Babu, G. Jogesh
Statistical methodology, with deep roots in probability theory, provides quantitative procedures for extracting scientific knowledge from astronomical data and for testing astrophysical theory. In recent decades, statistics has enormously increased in scope and sophistication. After a historical perspective, this review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests, and point estimation. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are outlined. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered.
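The bootstrap mentioned in this review lends itself to a compact illustration. The sketch below (Python, purely illustrative; the `bootstrap_ci` helper and the data are invented for the example, not taken from the review) computes a percentile-bootstrap confidence interval for a median, a statistic whose sampling distribution is not known in closed form:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.median, n_boot=2000, alpha=0.05, seed=42):
    """Percentile-bootstrap confidence interval for an arbitrary statistic."""
    rng = random.Random(seed)
    n = len(data)
    # Resample the data with replacement and recompute the statistic each time.
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Example: interval for the median of a small, skewed sample.
sample = [1.2, 1.9, 2.1, 2.4, 2.8, 3.1, 3.9, 5.6, 7.2, 11.0]
low, high = bootstrap_ci(sample)
```

The same function works unchanged for any statistic (mean, trimmed mean, correlation of paired data), which is exactly the appeal of resampling methods.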
Faculty achievement tracking tool.
Pettus, Sarah; Reifschneider, Ellen; Burruss, Nancy
2009-03-01
Faculty development and scholarship is an expectation of nurse educators. Accrediting institutions, such as the Commission on Collegiate Nursing Education, the National League for Nursing Accrediting Commission, and the Higher Learning Commission, all have criteria regarding faculty achievement. A faculty achievement tracking tool (FATT) was developed to facilitate documentation of accreditation criteria attainment. Based on criteria from accrediting organizations, the roles that are addressed include scholarship, service, and practice. Definitions and benchmarks for the faculty as an aggregate are included. Undergoing reviews from different accrediting organizations, the FATT has been used once for accreditation of the undergraduate program and once for accreditation of the graduate program. The FATT is easy to use and has become an excellent adjunct for the preparation for accreditation reports. In addition, the FATT may be used for yearly evaluations, advancement, and merit.
1997-06-13
Project ACHIEVE was a math/science academic enhancement program aimed at first year high school Hispanic American students. Four high schools -- two in El Paso, Texas and two in Bakersfield, California -- participated in this Department of Energy-funded program during the spring and summer of 1996. Over 50 students, many of whom felt they were facing a nightmare future, were given the opportunity to work closely with personal computers and software, sophisticated calculators, and computer-based laboratories -- an experience which their regular academic curriculum did not provide. Math and science projects, exercises, and experiments were completed that emphasized independent and creative applications of scientific and mathematical theories to real world problems. The most important outcome was the exposure Project ACHIEVE provided to students concerning the college and technical-field career possibilities available to them.
Gender Issues in Labour Statistics.
ERIC Educational Resources Information Center
Greenwood, Adriana Mata
1999-01-01
Presents the main features needed for labor statistics to reflect the respective situations for women and men in the labor market. Identifies topics to be covered and detail needed for significant distinctions to emerge. Explains how the choice of measurement method and data presentation can influence the final result. (Author/JOW)
ERIC Educational Resources Information Center
Accordino, Denise B.; Accordino, Michael P.; Slaney, Robert B.
2000-01-01
Examines the relationship of perfectionism with measures of achievement and achievement motivation and mental health aspects of depression and self-esteem in high school students (N=123). Results indicate that students' personal standards were significant predictors of academic achievement and academic motivation. Also reveals that as students'…
Evaluation of Multi-parameter Test Statistics for Multiple Imputation.
Liu, Yu; Enders, Craig K
2017-03-22
In ordinary least squares regression, researchers often are interested in knowing whether a set of parameters is different from zero. With complete data, this could be achieved using the gain in prediction test, hierarchical multiple regression, or an omnibus F test. However, in substantive research scenarios, missing data often exist. In the context of multiple imputation, one of the current state-of-the-art missing data strategies, there are several analogous multi-parameter tests of the joint significance of a set of parameters, and these multi-parameter test statistics can be referenced to various distributions to make statistical inferences. However, little is known about the performance of these tests, and virtually no study has compared their Type I error rates and statistical power in scenarios typical of behavioral science data (e.g., small to moderate samples). This paper uses Monte Carlo simulation techniques to examine the performance of these multi-parameter test statistics for multiple imputation under a variety of realistic conditions. We provide a number of practical recommendations for substantive researchers based on the simulation results, and illustrate the calculation of these test statistics with an empirical example.
NASA Astrophysics Data System (ADS)
YangKim, SungHee
This research was a correlational study of the relationships among self-regulation, students' nonacademic internet browsing, and academic achievement in an undergraduate computer literacy class. Nonacademic internet browsing during class can be a distraction from academic studies. There has been little research on the role of self-regulation of nonacademic internet browsing in influencing academic achievement. Undergraduate computer literacy classes served as the sample (n = 39) for measuring these variables. Data were collected during three class periods in two sections of the computer literacy course taught by one instructor. The data consisted of a demographic survey, selected and modified items from the GVU 10th WWW User Survey Questionnaire, selected items from the Motivated Strategies for Learning Questionnaire, and measures of internet use. There were low correlations between self-regulation and academic grades (r = .18, p > .05) and between self-regulation and internet use (r = -.14, p > .05). None of these correlations were statistically significant. There was also no statistically significant correlation between internet use and academic achievement (r = -.23, p > .05). Self-regulation was highly correlated with self-efficacy (r = .53, p < .05). Total internet access was highly correlated with nonacademic internet browsing (r = .96, p < .01). Although not statistically significant, the consistently negative correlations of nonacademic internet use with both self-regulation and achievement indicate that the internet may present an attractive distraction from achievement, possibly due to lack of self-regulation. Embedding instruction in self-regulation in the computer literacy course is discussed as a way to enhance self-regulated internet use. Further study of the interaction of self-regulated internet use and academic achievement is recommended.
Parenting Style and Parental Involvement: Relations with Adolescent Achievement.
ERIC Educational Resources Information Center
Paulson, Sharon E.
1994-01-01
Eighty ninth-grade students completed questionnaires regarding their parents' demandingness, responsiveness, school involvement, and commitment to achievement. Boys' reports of both maternal and paternal parenting significantly predicted their achievement, with parental values toward achievement significantly predicting achievement in boys above…
Relationship between intravenous use and achieving initial cocaine abstinence.
Budney, A J; Higgins, S T; Bickel, W; Kent, L
1993-04-01
This study assessed whether route of cocaine administration (intravenous vs. intranasal) influences cocaine abstinence during the first 6 weeks of outpatient treatment. Fifty-nine persons received behavioral treatment or standard drug counselling in an outpatient clinic. Based on information collected at intake, intravenous users had fewer years of education, were employed in less skilled jobs, were less likely to be married, reported more negative consequences from cocaine use, reported using more cocaine per occasion and spent more money on cocaine per week than intranasal users. Intravenous and intranasal users did not differ significantly in the average duration of continuous cocaine abstinence (mean = 2.6 vs. mean = 3.3 weeks achieved during 6 weeks of treatment). The duration of abstinence between intravenous and intranasal users was equal in the behavioral treatment (mean = 4.2). In standard treatment the average duration was less among intravenous than intranasal users (mean = 0.9 vs. mean = 2.4), but that difference did not achieve statistical significance. Hepatitis and employment instability were associated with shorter periods of cocaine abstinence among intravenous users, whereas employment instability, lower job skill level, drug use severity and reports of memory loss were associated with shorter periods of cocaine abstinence among intranasal users. These results indicate that i.v. cocaine users can achieve a period of initial abstinence in an outpatient setting comparable to the duration of typical inpatient hospitalizations, although special types of outpatient treatment may be necessary to obtain a positive outcome.
Chapter 4: Adult Descriptions of Public Achievement
ERIC Educational Resources Information Center
Roholt, Ross VeLure; Hildreth, R. W.; Baizerman, Michael
2007-01-01
Adult volunteers who work as experiential educators in Public Achievement (PA) told us about their experiences, and we contrasted these with the stated aims of Public Achievement, young peoples' experiences, and what it is like to be an adult volunteer. PA coaches reflected that there was a significant disjuncture between the official aims of PA…
High-Throughput Quantification of Phenotype Heterogeneity Using Statistical Features
Chaddad, Ahmad; Tanougast, Camel
2015-01-01
Statistical features are widely used in radiology for tumor heterogeneity assessment using the magnetic resonance (MR) imaging technique. In this paper, feature selection based on a decision tree is examined to determine the relevant subset of glioblastoma (GBM) phenotypes in the statistical domain. To discriminate between the active tumor (vAT) and edema/invasion (vE) phenotypes, we selected the significant features using analysis of variance (ANOVA) with p value < 0.01. Then, we implemented the decision tree to define the optimal subset of features for the phenotype classifier. Naïve Bayes (NB), support vector machine (SVM), and decision tree (DT) classifiers were considered to evaluate the performance of the feature-based scheme in terms of its capability to discriminate vAT from vE. All nine features were statistically significant for classifying vAT versus vE with p value < 0.01. Feature selection based on the decision tree showed the best performance in a comparative study against the full feature set. Among the selected features, Kurtosis and Skewness achieved the highest classifier accuracy, 58.33–75.00%, and AUC, 73.88–92.50%. This study demonstrated the ability of statistical features to provide a quantitative, individualized measurement for glioblastoma patients and to assess phenotype progression. PMID:26640485
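The ANOVA screening step described above amounts to comparing between-group and within-group variability for each feature. A minimal Python sketch with hypothetical feature values (the numbers are invented, not the paper's data):

```python
import statistics

def one_way_anova_F(*groups):
    """One-way ANOVA F statistic: mean square between groups divided by
    mean square within groups. Large F means the feature separates groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = statistics.fmean([v for g in groups for v in g])
    ss_between = sum(len(g) * (statistics.fmean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - statistics.fmean(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical kurtosis values for the two phenotypes.
vAT = [2.0, 2.2, 2.1, 2.3]
vE  = [3.0, 3.2, 3.1, 2.9]
F = one_way_anova_F(vAT, vE)
```

In a screening pipeline this F (or its p value) is computed per feature, and only features clearing the threshold are passed to the classifier.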
Statistical atlas based extrapolation of CT data
NASA Astrophysics Data System (ADS)
Chintalapani, Gouthami; Murphy, Ryan; Armiger, Robert S.; Lepisto, Jyri; Otake, Yoshito; Sugano, Nobuhiko; Taylor, Russell H.; Armand, Mehran
2010-02-01
We present a framework to estimate the missing anatomical details from a partial CT scan with the help of statistical shape models. The motivating application is periacetabular osteotomy (PAO), a technique for treating developmental hip dysplasia, an abnormal condition of the hip socket that, if untreated, may lead to osteoarthritis. The common goals of PAO are to reduce pain, joint subluxation and improve contact pressure distribution by increasing the coverage of the femoral head by the hip socket. While current diagnosis and planning is based on radiological measurements, because of significant structural variations in dysplastic hips, a computer-assisted geometrical and biomechanical planning based on CT data is desirable to help the surgeon achieve optimal joint realignments. Most of the patients undergoing PAO are young females, hence it is usually desirable to minimize the radiation dose by scanning only the joint portion of the hip anatomy. These partial scans, however, do not provide enough information for biomechanical analysis due to missing iliac region. A statistical shape model of full pelvis anatomy is constructed from a database of CT scans. The partial volume is first aligned with the statistical atlas using an iterative affine registration, followed by a deformable registration step and the missing information is inferred from the atlas. The atlas inferences are further enhanced by the use of X-ray images of the patient, which are very common in an osteotomy procedure. The proposed method is validated with a leave-one-out analysis method. Osteotomy cuts are simulated and the effect of atlas predicted models on the actual procedure is evaluated.
Zheng, Chunmei; Gaumer Erickson, Amy; Kingston, Neal M; Noonan, Patricia M
2014-01-01
Research suggests that self-determination skills are positively correlated with factors that have been shown to improve academic achievement, but the direct relationship among self-determination, self-concept, and academic achievement is not fully understood. This study offers an empirical explanation of how self-determination and self-concept affect academic achievement for adolescents with learning disabilities after taking into consideration the covariates of gender, income, and urbanicity. In a nationally representative sample (N = 560), the proposed model closely fit the data, with all proposed path coefficients being statistically significant. The results indicated that there were significant correlations among the three latent variables (i.e., self-determination, self-concept, and academic achievement), with self-determination being a potential predictor of academic achievement for students with learning disabilities.
Statistical distribution sampling
NASA Technical Reports Server (NTRS)
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
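Determining the distribution of a statistic by sampling, the subject of this report, can be sketched in a few lines of Python (an illustrative toy, not the report's method): draw many samples from a population, compute the statistic for each, and study the resulting empirical distribution.

```python
import random
import statistics

def sampling_distribution(stat, population, n, reps=5000, seed=1):
    """Approximate the sampling distribution of `stat` by repeatedly
    drawing samples of size n (with replacement) from a population."""
    rng = random.Random(seed)
    return [stat([rng.choice(population) for _ in range(n)]) for _ in range(reps)]

# The variance of the sample mean should come out near sigma^2 / n.
pop = [float(v) for v in range(100)]   # discrete uniform on 0..99, sigma^2 = 833.25
dist = sampling_distribution(statistics.fmean, pop, n=25)
var_of_mean = statistics.pvariance(dist)  # expect roughly 833.25 / 25 = 33.3
```

The same harness works for statistics with no textbook distribution, which is the point of determining distributions by sampling.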
Bradburne, John; Patton, Tisha C.
2001-02-25
When Fluor Fernald took over the management of the Fernald Environmental Management Project in 1992, the estimated closure date of the site was more than 25 years into the future. Fluor Fernald, in conjunction with DOE-Fernald, introduced the Accelerated Cleanup Plan, which was designed to substantially shorten that schedule and save taxpayers more than $3 billion. The management of Fluor Fernald believes there are three fundamental concerns that must be addressed by any contractor hoping to achieve closure of a site within the DOE complex. They are relationship management, resource management and contract management. Relationship management refers to the interaction between the site and local residents, regulators, union leadership, the workforce at large, the media, and any other interested stakeholder groups. Resource management is of course related to the effective administration of the site knowledge base and the skills of the workforce, the attraction and retention of qualified and competent technical personnel, and the best recognition and use of appropriate new technologies. Perhaps most importantly, resource management must also include a plan for survival in a flat-funding environment. Lastly, creative and disciplined contract management will be essential to effecting the closure of any DOE site. Fluor Fernald, together with DOE-Fernald, is breaking new ground in the closure arena, and "business as usual" has become a thing of the past. How Fluor Fernald has managed its work at the site over the last eight years, and how it will manage the new site closure contract in the future, will be an integral part of achieving successful closure at Fernald.
Thermodynamic Limit in Statistical Physics
NASA Astrophysics Data System (ADS)
Kuzemsky, A. L.
2014-03-01
The thermodynamic limit in statistical thermodynamics of many-particle systems is an important but often overlooked issue in the various applied studies of condensed matter physics. To settle this issue, we review tersely the past and present disposition of thermodynamic limiting procedure in the structure of the contemporary statistical mechanics and our current understanding of this problem. We pick out the ingenious approach by Bogoliubov, who developed a general formalism for establishing the limiting distribution functions in the form of formal series in powers of the density. In that study, he outlined the method of justification of the thermodynamic limit when he derived the generalized Boltzmann equations. To enrich and to weave our discussion, we take this opportunity to give a brief survey of the closely related problems, such as the equipartition of energy and the equivalence and nonequivalence of statistical ensembles. The validity of the equipartition of energy permits one to decide what are the boundaries of applicability of statistical mechanics. The major aim of this work is to provide a better qualitative understanding of the physical significance of the thermodynamic limit in modern statistical physics of the infinite and "small" many-particle systems.
Statistical Mechanics of Zooplankton.
Hinow, Peter; Nihongi, Ai; Strickler, J Rudi
2015-01-01
Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar "microscopic" quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the "ecological temperature" of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean's swimming behavior.
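The proposed "ecological temperature", the average squared velocity of tracked animals, is straightforward to compute from position data. A Python sketch with invented 2-D tracks (these are hypothetical numbers, not the authors' Daphnia observations):

```python
import statistics

def ecological_temperature(positions, dt):
    """Mean squared speed of a tracked animal: the 'ecological temperature'
    proposed above. `positions` is a list of (x, y) samples every dt seconds."""
    speeds_sq = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        speeds_sq.append(vx * vx + vy * vy)
    return statistics.fmean(speeds_sq)

# Hypothetical tracks: the infested animal covers more ground per step.
healthy  = [(0.0, 0.0), (0.1, 0.0), (0.1, 0.1), (0.2, 0.1)]
infested = [(0.0, 0.0), (0.4, 0.1), (0.6, 0.5), (1.0, 0.6)]
T_healthy  = ecological_temperature(healthy, dt=1.0)
T_infested = ecological_temperature(infested, dt=1.0)
```

Comparing such values across conditions (light versus darkness, cold versus warm water) is the discriminator the study describes.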
Predictors of cultural capital on science academic achievement at the 8th grade level
NASA Astrophysics Data System (ADS)
Misner, Johnathan Scott
The purpose of the study was to determine if students' cultural capital is a significant predictor of 8th grade science achievement test scores in urban locales. Cultural capital refers to the knowledge used and gained by the dominant class, which allows social and economic mobility. Cultural capital variables include magazines at home and parental education level. Other variables analyzed include socioeconomic status (SES), gender, and English language learner (ELL) status. This non-experimental study analyzed the results of the 2011 Eighth Grade Science National Assessment of Educational Progress (NAEP). The researcher analyzed the data using a multivariate stepwise regression analysis. The researcher concluded that the addition of cultural capital factors significantly increased the predictive power of the model in which magazines in the home, gender, ELL classification, parental education level, and SES were the independent variables and science achievement was the dependent variable. For alpha = 0.05, the overall test for the model produced an R² value of 0.232; the model therefore predicted 23.2% of the variance in science achievement. Other major findings include: higher measures of home resources predicted higher 2011 NAEP eighth grade science achievement; males were predicted to have higher achievement; students classified as ELL were predicted to score lower; higher parental education predicted higher achievement; and lower measures of SES predicted lower achievement. This study contributed to the research in this field by identifying cultural capital factors with statistical significance in predicting eighth grade science achievement, which can lead to strategies to improve science academic achievement among underserved populations.
Self-Esteem and Academic Achievement of High School Students
ERIC Educational Resources Information Center
Moradi Sheykhjan, Tohid; Jabari, Kamran; Rajeswari, K.
2014-01-01
The primary purpose of this study was to determine the influence of self-esteem on academic achievement among high school students in Miandoab City of Iran. The methodology of the research is descriptive and correlational; descriptive and inferential statistics were used to analyze the data. The statistical population includes male and female high…
Prosocial foundations of children's academic achievement.
Caprara, G V; Barbaranelli, C; Pastorelli, C; Bandura, A; Zimbardo, P G
2000-07-01
The present longitudinal research demonstrates robust contributions of early prosocial behavior to children's developmental trajectories in academic and social domains. Both prosocial and aggressive behaviors in early childhood were tested as predictors of academic achievement and peer relations in adolescence 5 years later. Prosocialness included cooperating, helping, sharing, and consoling, and the measure of antisocial aspects included proneness to verbal and physical aggression. Prosocialness had a strong positive impact on later academic achievement and social preferences, but early aggression had no significant effect on either outcome. The conceptual model accounted for 35% of variance in later academic achievement, and 37% of variance in social preferences. Additional analysis revealed that early academic achievement did not contribute to later academic achievement after controlling for effects of early prosocialness. Possible mediating processes by which prosocialness may affect academic achievement and other socially desirable developmental outcomes are proposed.
Explorations in Statistics: Power
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four…
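Power, the probability of rejecting the null hypothesis when it is false, can be computed in closed form for a simple z test. A Python sketch (illustrative only, not taken from the article; the function and data are invented for the example):

```python
import math
from statistics import NormalDist

def power_z_test(effect, sigma, n, alpha=0.05):
    """Power of a two-sided one-sample z test: probability that the
    observed z statistic falls beyond +/- z_crit when the true mean
    differs from the null value by `effect`."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha / 2)
    shift = effect * math.sqrt(n) / sigma  # noncentrality of the z statistic
    return (1 - nd.cdf(z_crit - shift)) + nd.cdf(-z_crit - shift)

# For a fixed effect size, power grows with sample size.
p_small = power_z_test(effect=0.5, sigma=1.0, n=10)
p_large = power_z_test(effect=0.5, sigma=1.0, n=50)
```

Exploring how `p_small` and `p_large` change as `effect`, `n`, and `alpha` vary is exactly the kind of active exploration the installment advocates.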
STATSIM: Exercises in Statistics.
ERIC Educational Resources Information Center
Thomas, David B.; And Others
A computer-based learning simulation was developed at Florida State University which allows for high interactive responding via a time-sharing terminal for the purpose of demonstrating descriptive and inferential statistics. The statistical simulation (STATSIM) is comprised of four modules--chi square, t, z, and F distribution--and elucidates the…
Understanding Undergraduate Statistical Anxiety
ERIC Educational Resources Information Center
McKim, Courtney
2014-01-01
The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
Towards Statistically Undetectable Steganography
2011-06-30
Title: Towards Statistically Undetectable Steganography. Contract number: FA9550-08-1-0084. Author(s): Prof. Jessica… Approved for public release; distribution is unlimited. Abstract: Fundamental asymptotic laws for imperfect steganography …formats. Subject terms: steganography, covert communication, statistical detectability, asymptotic performance, secure payload, minimum
Explorations in Statistics: Regression
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2011-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive…
ERIC Educational Resources Information Center
Singer, Arlene
This guide outlines a one semester Option Y course, which has seven learner objectives. The course is designed to provide students with an introduction to the concerns and methods of statistics, and to equip them to deal with the many statistical matters of importance to society. Topics covered include graphs and charts, collection and…
Croarkin, M. Carroll
2001-01-01
For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023
Reform in Statistical Education
ERIC Educational Resources Information Center
Huck, Schuyler W.
2007-01-01
Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…
Deconstructing Statistical Analysis
ERIC Educational Resources Information Center
Snell, Joel
2014-01-01
Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or to make a new product or service appear legitimate, needs to be monitored and questioned for accuracy. (1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…
ERIC Educational Resources Information Center
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A
2008-01-01
Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationships are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics of different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene, and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
Robust Spectral Clustering Using Statistical Sub-Graph Affinity Model
Eichel, Justin A.; Wong, Alexander; Fieguth, Paul; Clausi, David A.
2013-01-01
Spectral clustering methods have been shown to be effective for image segmentation. Unfortunately, the presence of image noise as well as textural characteristics can have a significant negative effect on segmentation performance. To accommodate image noise and textural characteristics, this study introduces the concept of sub-graph affinity, where each node in the primary graph is modeled as a sub-graph characterizing the neighborhood surrounding the node. The statistical sub-graph affinity matrix is then constructed based on the statistical relationships between sub-graphs of connected nodes in the primary graph, thus counteracting the uncertainty associated with the image noise and textural characteristics by utilizing more information than traditional spectral clustering methods. Experiments using both synthetic and natural images under various levels of noise contamination demonstrate that the proposed approach can achieve improved segmentation performance when compared to existing spectral clustering methods. PMID:24386111
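The spectral partitioning machinery that such methods build on can be sketched in a few lines. The sub-graph affinity model described above replaces the plain pairwise Gaussian affinity used below with a statistical comparison of node neighborhoods, but the Laplacian/eigenvector step is the same. The point coordinates, bandwidth, and two-cluster layout here are illustrative inventions, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
# two well-separated point clusters (stand-ins for two image regions)
a = rng.normal(0.0, 0.3, size=(20, 2))
b = rng.normal(5.0, 0.3, size=(20, 2))
pts = np.vstack([a, b])

# Gaussian affinity matrix between graph nodes
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / 2.0)

# unnormalized graph Laplacian; the eigenvector of the second-smallest
# eigenvalue (Fiedler vector) bipartitions the graph
D = np.diag(W.sum(1))
L = D - W
vals, vecs = np.linalg.eigh(L)
labels = (vecs[:, 1] > 0).astype(int)
```

With well-separated clusters the two blocks of the Fiedler vector take opposite signs, so thresholding at zero recovers the two groups.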
Ensemble 3D PTV for high resolution turbulent statistics
NASA Astrophysics Data System (ADS)
Agüera, Nereida; Cafiero, Gioacchino; Astarita, Tommaso; Discetti, Stefano
2016-12-01
A method to extract turbulent statistics from three-dimensional (3D) PIV measurements via ensemble averaging is presented. The proposed technique is a 3D extension of the ensemble particle tracking velocimetry methods, which consist of summing distributions of velocity vectors calculated on low-image-density samples and then extracting the statistical moments from the velocity vectors within sub-volumes, with the size of the sub-volume depending on the desired number of particles and on the available number of snapshots. The extension to 3D measurements poses the additional difficulty of sparse velocity vector distributions, thus requiring a large number of snapshots to achieve high-resolution measurements with a sufficient degree of accuracy. At present, this hinders the achievement of single-voxel measurements unless millions of samples are available. Consequently, one has to give up spatial resolution and live with still relatively large (compared to the voxel) sub-volumes. This leads to the further problem of the possible occurrence of a residual mean velocity gradient within the sub-volumes, which significantly contaminates the computation of second-order moments. In this work, we propose a method to reduce the residual gradient effect, making it possible to reach high resolution even with relatively large interrogation spots, while still retrieving a large number of particles on which turbulent statistics can be calculated. The method consists of applying a polynomial fit to the velocity distributions within each sub-volume that mimics the residual mean velocity gradient.
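The gradient-correction idea can be illustrated with a minimal one-component sketch: velocity samples in a sub-volume carry a residual mean gradient that inflates the naive variance, and subtracting a least-squares (here linear) polynomial fit in the particle positions recovers the turbulent fluctuation level. The gradient vector, noise level, and sub-volume geometry are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000                                    # vectors accumulated in one sub-volume
pos = rng.uniform(-1.0, 1.0, size=(n, 3))   # particle positions (sub-volume coords)
grad = np.array([0.8, -0.3, 0.5])           # assumed residual mean velocity gradient
# one velocity component: mean gradient plus turbulent fluctuations (std 0.1)
u = pos @ grad + rng.normal(0.0, 0.1, n)

var_naive = u.var()                         # second-order moment, gradient-contaminated

# linear polynomial fit in the positions, mimicking the residual gradient
A = np.column_stack([np.ones(n), pos])
coef, *_ = np.linalg.lstsq(A, u, rcond=None)
var_corrected = (u - A @ coef).var()        # moment of the fluctuations only
```

Here the true fluctuation variance is 0.01; the fit-and-subtract estimate lands close to it, while the naive moment is dominated by the gradient term.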
Achievement Goals and Achievement Emotions: A Meta-Analysis
ERIC Educational Resources Information Center
Huang, Chiungjung
2011-01-01
This meta-analysis synthesized 93 independent samples (N = 30,003) in 77 studies that reported in 78 articles examining correlations between achievement goals and achievement emotions. Achievement goals were meaningfully associated with different achievement emotions. The correlations of mastery and mastery approach goals with positive achievement…
ERIC Educational Resources Information Center
Schuta, Theresa; Mauricio, David
2012-01-01
Three years ago, the authors accepted positions as high school principals in Buffalo City (NY) Schools after serving as elementary school principals in the district for many years. In their new positions, they were to lead schools that were designated by the New York State Department of Education as "persistently lowest achieving,"…
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it marked the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long series of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in clearly understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This conviction has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar: a summary and transcription of the best pages I have encountered.
NASA Astrophysics Data System (ADS)
Koparan, Timur
2016-02-01
In this study, the effect of dynamic statistics software on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study was carried out in the fall semester of the 2014-2015 academic year at a university in Turkey. The study, which employed the pre-test/post-test control group design of the quasi-experimental research method, was carried out on a group of 80 prospective teachers, 40 in the control group and 40 in the experimental group. Both groups had four hours of classes on descriptive statistics. The classes with the control group were carried out through traditional methods, while dynamic statistics software was used in the experimental group. Five prospective teachers from the experimental group were interviewed clinically after the intervention for a deeper examination of their views about it. The qualitative data gained are presented under various themes. At the end of the study, it was found that there is a significant difference in favour of the experimental group in terms of achievement and attitudes, that the prospective teachers have an affirmative approach to the use of dynamic software, and that they see it as an effective tool to enrich maths classes. In accordance with the findings of the study, it is suggested that dynamic software, which offers unique opportunities, be used in classes by teachers and students.
Bilingualism and academic achievement.
Han, Wen-Jui
2012-01-01
Using the Early Childhood Longitudinal Study, Kindergarten Cohort, this study examines the role that bilingualism plays in children's academic developmental trajectories during their early school years, with particular attention on the school environment (N = 16,380). Growth-curve results showed that despite starting with lower math scores in kindergarten, Mixed Bilingual children fully closed the math gap with their White English Monolingual peers by fifth grade. However, because non-English-Dominant Bilinguals and non-English Monolinguals started kindergarten with significantly lower reading and math scores compared to their English Monolingual peers, by fifth grade the former groups still had significantly lower scores. School-level factors explained about one third of the reductions in the differences in children's academic performance.
NASA Technical Reports Server (NTRS)
Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.
2017-01-01
Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.
Commentary: statistics for biomarkers.
Lovell, David P
2012-05-01
This short commentary discusses Biomarkers' requirements for the reporting of statistical analyses in submitted papers. It is expected that submitters will follow the general instructions of the journal, the more detailed guidance given by the International Committee of Medical Journal Editors, the specific guidelines developed by the EQUATOR network, and those of various specialist groups. Biomarkers expects that the study design and subsequent statistical analyses are clearly reported and that the data reported can be made available for independent assessment. The journal recognizes that there is continuing debate about different approaches to statistical science. Biomarkers appreciates that the field continues to develop rapidly and encourages the use of new methodologies.
Parents' Attitudes Towards Science and their Children's Science Achievement
NASA Astrophysics Data System (ADS)
Perera, Liyanage Devangi H.
2014-12-01
Although countries worldwide are emphasizing the importance of science education for technological development and global economic competition, comparative findings from standardized international student assessments reveal a huge gap in science scores between developed and developing countries. Certain developed economies too have made little progress in raising science achievement over the past decade. Despite school improvement being placed high on the policy agenda, the results of such actions have been poor. Therefore, there is a need to explore additional ways in which science achievement can be enhanced. This study focuses on the family and examines whether parents' attitudes towards science (how much they value science and the importance they place on it) can influence their children's science achievement. Individual- and school-level data are obtained from the Program for International Student Assessment 2006 survey for 15 Organisation for Economic Co-operation and Development (OECD) and non-OECD countries. Hierarchical linear modelling is employed to estimate the equations. The findings indicate that parents' attitudes towards science have a positive and statistically significant effect on science achievement, after controlling for other important student- and school-level variables. Moreover, students from poor backgrounds appear to benefit from more positive parental science attitudes as much as students from high socioeconomic status, such that equality of student achievement is not affected. This study recommends that schools and teachers encourage parents to play a more pro-active role in their children's science education, as well as educate parents about the importance of science and strategies that can be adopted to support their children's science learning.
Defining the ecological hydrology of Taiwan Rivers using multivariate statistical methods
NASA Astrophysics Data System (ADS)
Chang, Fi-John; Wu, Tzu-Ching; Tsai, Wen-Ping; Herricks, Edwin E.
2009-09-01
The identification and verification of ecohydrologic flow indicators has found new support as the importance of ecological flow regimes is recognized in modern water resources management, particularly in river restoration and reservoir management. An ecohydrologic indicator system reflecting the unique characteristics of Taiwan's water resources and hydrology has been developed, the Taiwan ecohydrological indicator system (TEIS). A major challenge for the water resources community is using the TEIS to provide environmental flow rules that improve existing water resources management. This paper examines data from the extensive network of flow monitoring stations in Taiwan using TEIS statistics to define and refine environmental flow options in Taiwan. Multivariate statistical methods were used to examine TEIS statistics for 102 stations representing the geographic and land use diversity of Taiwan. The Pearson correlation coefficient showed high multicollinearity between the TEIS statistics. Watersheds were separated into upper- and lower-watershed locations. An analysis of variance indicated significant differences between upstream (more natural) and downstream (more developed) locations in the same basin, with hydrologic indicator redundancy in flow change and magnitude statistics. Issues of multicollinearity were examined using a Principal Component Analysis (PCA), with the first three components related to general flow and high/low flow statistics, frequency and time statistics, and quantity statistics. These principal components explain about 85% of the total variation. A major conclusion is that managers must be aware of differences among basins, as well as differences within basins, that will require careful selection of management procedures to achieve needed flow regimes.
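The PCA step described, reducing a correlated station-by-indicator matrix to a few components that capture most of the variation, can be sketched with a standardize-then-SVD computation. The synthetic matrix below (102 stations by 10 indicators driven by 3 latent hydrologic factors) is an invented stand-in for the TEIS data, chosen only so that three components dominate.

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical TEIS-like matrix: 102 stations x 10 flow indicators,
# driven by 3 latent factors plus small noise (illustrative, not real data)
latent = rng.normal(size=(102, 3))
loadings = rng.normal(size=(3, 10))
X = latent @ loadings + 0.1 * rng.normal(size=(102, 10))

# standardize each indicator, then PCA via singular value decomposition
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance fraction per component
cum3 = explained[:3].sum()        # cumulative share of the first three
```

With three latent drivers, the first three components account for the bulk of the total variation, mirroring the roughly 85% reported for the TEIS statistics.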
Superordinate Shape Classification Using Natural Shape Statistics
ERIC Educational Resources Information Center
Wilder, John; Feldman, Jacob; Singh, Manish
2011-01-01
This paper investigates the classification of shapes into broad natural categories such as "animal" or "leaf". We asked whether such coarse classifications can be achieved by a simple statistical classification of the shape skeleton. We surveyed databases of natural shapes, extracting shape skeletons and tabulating their…
Breast cancer statistics, 2011.
DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin
2011-01-01
In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population.
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
Hemophilia Data and Statistics
… at a very young age. Based on CDC data, the median age at diagnosis is 36 months …
Cooperative Learning in Statistics.
ERIC Educational Resources Information Center
Keeler, Carolyn M.; And Others
1994-01-01
Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)
NASA Astrophysics Data System (ADS)
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
NASA Astrophysics Data System (ADS)
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.
Plague in the United States: Plague was first introduced … per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, …
Understanding Solar Flare Statistics
NASA Astrophysics Data System (ADS)
Wheatland, M. S.
2005-12-01
A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well-known flare power-law size distribution. Although avalanche models are perhaps the favoured models for describing flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
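A power-law size distribution of the kind emphasized here is commonly characterized by a maximum-likelihood estimate of its index (the Hill estimator). The sketch below draws synthetic "flare sizes" from a power law by inverse-CDF sampling and recovers the index; the index value and lower cutoff are invented for illustration, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(5)
# synthetic flare sizes from p(x) ~ x^(-alpha) for x >= xmin,
# sampled via the inverse CDF: x = xmin * (1 - u)^(-1/(alpha - 1))
alpha_true, xmin, n = 1.8, 1.0, 50_000
x = xmin * (1.0 - rng.random(n)) ** (-1.0 / (alpha_true - 1.0))

# maximum-likelihood estimate of the power-law index
alpha_hat = 1.0 + n / np.log(x / xmin).sum()
```

The estimator's standard error scales as (alpha - 1) / sqrt(n), so with 50,000 events the recovered index sits very close to the true value.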
Titanic: A Statistical Exploration.
ERIC Educational Resources Information Center
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
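The classroom exercise described, testing association between two categorical variables with a chi-square statistic, can be sketched in plain Python. The 2x2 counts below (survival by sex) are illustrative stand-ins, not the exact historical passenger figures.

```python
# contingency table: rows = sex, cols = (survived, died); illustrative counts
obs = [[339, 161],   # female
       [161, 682]]   # male

row = [sum(r) for r in obs]
col = [sum(c) for c in zip(*obs)]
total = sum(row)

# chi-square statistic: sum of (observed - expected)^2 / expected
chi2 = sum((obs[i][j] - row[i] * col[j] / total) ** 2 / (row[i] * col[j] / total)
           for i in range(2) for j in range(2))
# with df = 1, the critical value at alpha = 0.05 is about 3.84
```

A statistic far above 3.84 leads students to reject independence between sex and survival, which is the kind of conclusion the activity is built around.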
Purposeful Statistical Investigations
ERIC Educational Resources Information Center
Day, Lorraine
2014-01-01
Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations links mathematics to students' lives and provides engaging and meaningful contexts for mathematical inquiry.
NASA Astrophysics Data System (ADS)
Testa, Massimo
2015-08-01
Starting with the basic principles of Relativistic Quantum Mechanics, we give a rigorous, but completely elementary proof of the relation between fundamental observables of a statistical system, when measured within two inertial reference frames, related by a Lorentz transformation.
How Statistics "Excel" Online.
ERIC Educational Resources Information Center
Chao, Faith; Davis, James
2000-01-01
Discusses the use of Microsoft Excel software and provides examples of its use in an online statistics course at Golden Gate University in the areas of randomness and probability, sampling distributions, confidence intervals, and regression analysis. (LRW)
Lessons from Inferentialism for Statistics Education
ERIC Educational Resources Information Center
Bakker, Arthur; Derry, Jan
2011-01-01
This theoretical paper relates recent interest in informal statistical inference (ISI) to the semantic theory termed inferentialism, a significant development in contemporary philosophy, which places inference at the heart of human knowing. This theory assists epistemological reflection on challenges in statistics education encountered when…
Achieving permanency for LGBTQ youth.
Jacobs, Jill; Freundlich, Madelyn
2006-01-01
This article brings together two significant efforts in the child welfare field: achieving permanence for youth in out-of-home care and meeting the needs of lesbian, gay, bisexual, transgender and questioning (LGBTQ) youth. During the past several years, a national movement has taken place to assure all children and youth have a permanent family connection before leaving the child welfare system; however, LGBTQ youth are not routinely included in the permanency discussions. At the same time, efforts in addressing the needs of LGBTQ youth have increased, but permanency is rarely mentioned as a need. This article offers models of permanence and practices to facilitate permanence with LGBTQ youth and their families. It also offers a youth-driven, individualized process, using youth development principles to achieve relational, physical, and legal permanence. Reunification efforts are discussed, including services, supports, and education required for youth to return to their family of origin. For those who cannot return home, other family resources are explored. The article also discusses cultural issues as they affect permanence for LGBTQ youth, and, finally, addresses the need for ongoing support services to sustain and support permanency.
Tools for Basic Statistical Analysis
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
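Two of the calculations described, descriptive statistics and normal-distribution estimates (a value at a given cumulative probability, and recovering a distribution from two data points), have direct Python standard-library analogues. The sample values and probability pairs below are made up for illustration; only the calculation types mirror the toolset's description.

```python
from statistics import NormalDist, mean, stdev

# descriptive statistics for user-entered data (illustrative sample)
data = [9.8, 10.2, 10.1, 9.9, 10.0, 10.3, 9.7]
mu, sd = mean(data), stdev(data)

# Normal Distribution Estimates: the value corresponding to a cumulative
# probability, given a sample mean and standard deviation
x95 = NormalDist(mu, sd).inv_cdf(0.95)

# Normal Distribution from two Data Points: recover (mu, sigma) from two
# (value, cumulative probability) pairs via x = mu + sigma * z(p)
z = NormalDist().inv_cdf
x1, p1 = 9.0, 0.10
x2, p2 = 11.0, 0.90
sigma = (x2 - x1) / (z(p2) - z(p1))
mu_est = x1 - sigma * z(p1)
```

Because the two probability pairs here are symmetric about 0.5, the recovered mean falls exactly midway between the two data points.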
NASA Astrophysics Data System (ADS)
Cook, Samuel A.; Fukawa-Connelly, Timothy
2016-02-01
Studies have shown that at the end of an introductory statistics course, students struggle with building-block concepts, such as mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understanding of introductory statistics held by entering freshmen of a department of mathematics and statistics (including mathematics education), students who are presumably better prepared in mathematics and statistics than the average university student. This case study found that these students enter college with common statistical misunderstandings, lack of knowledge, and idiosyncratic collections of correct statistical knowledge. Moreover, they also have a wide range of beliefs about their knowledge, with some of the students who believe that they have the strongest knowledge also having significant misconceptions. More attention to these statistical building blocks may be required in a university introductory statistics course.
Achieving Peace in Afghanistan: Obstacles and Recommendations
2012-03-05
…interests between insurgent factions could pose significant obstacles to negotiations. This challenge to the organizations' unity could become more… significant challenges during negotiations over the role of women in Afghan society. To overcome these potential obstacles, the United States… Achieving Peace in Afghanistan: Obstacles and Recommendations, by Colonel Lee K. Grubbs, United States Army.
Academic Achievement of NCAA Division III Athletes
ERIC Educational Resources Information Center
Barlow, Kathy A.; Hickey, Ann
2014-01-01
A study of 215 athletes at a small private liberal arts Division III college revealed that athletes (a) begin their college experience with SATs no different from non-athletes; (b) attain GPAs that do not significantly differ from those of nonathletes; (c) achieve GPAs that do not significantly differ between their "in-season" semester…
Psychological Barriers to Achievement in Women.
ERIC Educational Resources Information Center
Goldberg, Lois S.
1982-01-01
This study explored the relationships among birth order, number of course credits achieved, and personality integration for 56 women graduate students. No evidence of significant stress was found as these women approached career choice points, nor was there a significant effect from birth order. (Author/RD)
Predicting Success in Psychological Statistics Courses.
Lester, David
2016-06-01
Many students perform poorly in courses on psychological statistics, and it is useful to be able to predict which students will have difficulties. In a study of 93 undergraduates enrolled in Statistical Methods (18 men, 75 women; M age = 22.0 years, SD = 5.1), performance was significantly associated with sex (female students performed better) and proficiency in algebra in a linear regression analysis. Anxiety about statistics was not associated with course performance, indicating that basic mathematical skills are the best correlate of performance in statistics courses and can be used to stream students into classes by ability.
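A regression of this shape, course performance on a sex indicator plus an algebra score, can be sketched with simulated data. The coefficients, score scales, and noise level below are invented; only the model form (two predictors in an ordinary least-squares fit) mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 93
female = rng.integers(0, 2, n)        # 1 = female (illustrative coding)
algebra = rng.normal(70, 10, n)       # algebra proficiency score (invented scale)
# simulated course grades: both predictors matter, as in the study's finding
grade = 40 + 5 * female + 0.4 * algebra + rng.normal(0, 4, n)

# ordinary least squares: grade ~ intercept + female + algebra
X = np.column_stack([np.ones(n), female, algebra])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)
```

The fitted coefficients on the sex indicator and the algebra score come out positive and near their generating values, which is the pattern the study reports.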
Achieving TASAR Operational Readiness
NASA Technical Reports Server (NTRS)
Wing, David J.
2015-01-01
NASA has been developing and testing the Traffic Aware Strategic Aircrew Requests (TASAR) concept for aircraft operations featuring a NASA-developed cockpit automation tool, the Traffic Aware Planner (TAP), which computes traffic/hazard-compatible route changes to improve flight efficiency. The TAP technology is anticipated to save fuel and flight time and thereby provide immediate and pervasive benefits to the aircraft operator, as well as improving flight schedule compliance, passenger comfort, and pilot and controller workload. Previous work has indicated the potential for significant benefits for TASAR-equipped aircraft, and a flight trial of the TAP software application in the National Airspace System has demonstrated its technical viability. This paper reviews previous and ongoing activities to prepare TASAR for operational use.
Societal Statistics by virtue of the Statistical Drake Equation
NASA Astrophysics Data System (ADS)
Maccone, Claudio
2012-09-01
number of Habitable Planets follows the lognormal distribution as well. But the Dole equation is described by the first FOUR factors of the Drake equation. Thus, we may "divide" the 7-factor Drake equation by the 4-factor Dole equation getting the probability distribution of the last-3-factor Drake equation, i.e. the probability distribution of the SOCIETAL TERMS ONLY. These we study in detail in this paper, achieving new statistical results about the SOCIETAL ASPECTS OF SETI.
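The closure property used here, that a product of lognormal factors is itself lognormal with additive log-parameters, and the "division" that isolates the last three (societal) factors, can be checked by simulation. The per-factor parameters below are arbitrary illustrations, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(3)
# 7 Drake factors, each modeled as lognormal (illustrative parameters)
mus    = np.array([1.0, 0.5, 0.3, -0.2, -1.0, -0.5, 2.0])
sigmas = np.array([0.4, 0.3, 0.5,  0.6,  0.7,  0.5, 0.8])
samples = rng.lognormal(mus, sigmas, size=(100_000, 7))

N = samples.prod(axis=1)              # full 7-factor Drake product
hab = samples[:, :4].prod(axis=1)     # first 4 factors (Dole-like part)
societal = N / hab                    # "division" leaves the last 3 factors

# log of a product of lognormals is normal with additive mu and sigma^2
logN = np.log(N)
```

Empirically, the mean of log N matches the sum of the mus and its standard deviation matches sqrt(sum of sigma squared), confirming the lognormal product rule the paper relies on.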
Lifting Minority Achievement: Complex Answers. The Achievement Gap.
ERIC Educational Resources Information Center
Viadero, Debra; Johnston, Robert C.
2000-01-01
This fourth in a four-part series on why academic achievement gaps exist describes the Minority Achievement Committee scholars program at Shaker Heights High School in Cleveland, Ohio, a powerful antidote to the achievement gap between minority and white and Asian American students. It explains the need to break down stereotypes about academic…
Achievement Motivation of Women: Effects of Achievement and Affiliation Arousal.
ERIC Educational Resources Information Center
Gama, Elizabeth Maria Pinheiro
1985-01-01
Assigned 139 Brazilian women to neutral, affiliation arousal, and achievement arousal conditions based on their levels of achievement (Ach) and affiliative (Aff) needs. Results of story analyses revealed that achievement arousal increased scores of high Ach subjects and that high Aff subjects obtained higher scores than low Aff subjects. (BL)
Attitude Towards Physics and Additional Mathematics Achievement Towards Physics Achievement
ERIC Educational Resources Information Center
Veloo, Arsaythamby; Nor, Rahimah; Khalid, Rozalina
2015-01-01
The purpose of this research is to identify differences in students' attitude towards Physics and Additional Mathematics achievement based on gender, and the relationship between attitudinal variables towards Physics and Additional Mathematics achievement and achievement in Physics. This research focused on six variables, which are attitude towards…
The Impact of Reading Achievement on Overall Academic Achievement
ERIC Educational Resources Information Center
Churchwell, Dawn Earheart
2009-01-01
This study examined the relationship between reading achievement and achievement in other subject areas. The purpose of this study was to determine if there was a correlation between reading scores as measured by the Standardized Test for the Assessment of Reading (STAR) and academic achievement in language arts, math, science, and social studies…
Within-District Effects of Catholic Schooling on 12th Grade Math Achievement
Chen, Vivien W.; Pong, Suet-Ling
2015-01-01
Using a propensity score matching method and regression modeling based on the 2002 Education Longitudinal Study, this study found a significant Catholic school effect on mathematics achievement among those 12th graders who were least likely to attend Catholic school. This result is evident within districts after we used the School District Demographics System map data to locate Catholic schools within school district boundaries. Furthermore, the Catholic school effects were statistically significant for students in districts that allowed publicly funded private education. PMID:25606028
Statistical Physics of Fracture
Alava, Mikko; Nukala, Phani K; Zapperi, Stefano
2006-05-01
Disorder and long-range interactions are two of the key components that make material failure an interesting playing field for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subjected to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.
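The statistical-mechanics picture sketched in this abstract can be illustrated with the simplest member of the lattice-model family, the equal-load-sharing fiber bundle model (a minimal sketch for illustration, not a model from the review itself): N fibers with i.i.d. random failure thresholds share the applied load equally, and the bundle strength is the largest load per fiber that some partial configuration of surviving fibers can carry.

```python
import random

def bundle_strength(n, rng):
    """Equal-load-sharing fiber bundle with uniform(0,1) thresholds.

    With thresholds sorted ascending, after the k weakest fibers fail the
    n - k survivors share the load; the bundle strength per fiber is
    max_k t_(k+1) * (n - k) / n.
    """
    thresholds = sorted(rng.random() for _ in range(n))
    return max(t * (n - k) / n for k, t in enumerate(thresholds))

rng = random.Random(42)
strengths = [bundle_strength(2000, rng) for _ in range(50)]
mean_strength = sum(strengths) / len(strengths)
# For uniform thresholds the asymptotic strength is max_x x*(1-x) = 1/4,
# approached with sample-to-sample fluctuations that shrink as n grows.
print(round(mean_strength, 3))
```

The sample-to-sample scatter of `strengths` is exactly the kind of strength fluctuation statistics the abstract refers to.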
SANABRIA, FEDERICO; KILLEEN, PETER R.
2008-01-01
Despite being under challenge for the past 50 years, null hypothesis significance testing (NHST) remains dominant in the scientific field for want of viable alternatives. NHST, along with its significance level p, is inadequate for most of the uses to which it is put, a flaw that is of particular interest to educational practitioners who too often must use it to sanctify their research. In this article, we review the failure of NHST and propose prep, the probability of replicating an effect, as a more useful statistic for evaluating research and aiding practical decision making. PMID:19122766
Factors related to student performance in statistics courses in Lebanon
NASA Astrophysics Data System (ADS)
Naccache, Hiba Salim
The purpose of the present study was to identify factors that may contribute to business students in Lebanese universities having difficulty in introductory and advanced statistics courses. Two statistics courses are required for business majors at Lebanese universities, and students are not required to enroll in any math courses before taking them. Drawing on recent educational research, this dissertation attempted to identify the relationships among (1) students’ scores on Lebanese university math admissions tests; (2) students’ scores on a test of very basic mathematical concepts; (3) students’ scores on the Survey of Attitudes Toward Statistics (SATS); (4) course performance as measured by students’ final scores in the course; and (5) their scores on the final exam. Data were collected from 561 students enrolled in multiple sections of two courses: 307 students in the introductory statistics course and 260 in the advanced statistics course, across seven campuses in Lebanon over one semester. The multiple regression results revealed four significant relationships at the introductory level: between students’ scores on the math quiz and (1) their final exam scores and (2) their final averages; and between the Cognitive subscale of the SATS and (3) their final exam scores and (4) their final averages. These four significant relationships were also found at the advanced level. In addition, two more significant relationships were found between students’ final averages and the two subscales of Effort (5) and Affect (6). No relationship was found between students’ scores on the Lebanese admissions math tests and either their final exam scores or their final averages, in either the introductory or the advanced course. Although these results were consistent across course formats and instructors, they may encourage Lebanese universities
NASA Astrophysics Data System (ADS)
Clifford, Betsey A.
The Massachusetts Department of Elementary and Secondary Education (DESE) released proposed Science and Technology/Engineering standards in 2013 outlining the concepts that should be taught at each grade level. Previously, standards were written in grade spans and each district determined the method of implementation. There are two different methods used to teach middle school science: integrated and discipline-based. In the proposed standards, the Massachusetts DESE uses grade-by-grade standards with an integrated approach. It was not known whether there was a statistically significant difference in student achievement on the 8th grade science MCAS assessment between students taught with an integrated approach and those taught with a discipline-based approach. The results on the 8th grade science MCAS test from six public school districts from 2010 to 2013 were collected and analyzed. The methodology used was quantitative. Results of an ANOVA showed that there was no statistically significant difference in overall student achievement between the two curriculum models. Furthermore, there was no statistically significant difference for the various domains: Earth and Space Science, Life Science, Physical Science, and Technology/Engineering. This information is useful for districts hesitant to make the change from a discipline-based approach to an integrated approach. More research should be conducted on this topic with a larger sample size to better support the results.
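The kind of group comparison this abstract describes can be sketched with a one-way ANOVA F statistic and a permutation p-value (standard library only; the scores below are hypothetical stand-ins, not the MCAS data):

```python
import random

def f_statistic(groups):
    """One-way ANOVA F: between-group over within-group mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(x for g in groups for x in g) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def permutation_pvalue(a, b, n_perm=500, seed=7):
    """P(F_perm >= F_obs) under random relabeling of the two groups."""
    rng = random.Random(seed)
    f_obs = f_statistic([a, b])
    pooled = a + b
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if f_statistic([pooled[:len(a)], pooled[len(a):]]) >= f_obs:
            hits += 1
    return hits / n_perm

rng = random.Random(0)
integrated = [rng.gauss(50, 10) for _ in range(40)]   # hypothetical scores
discipline = [rng.gauss(50, 10) for _ in range(40)]   # same distribution
shifted    = [x + 15 for x in discipline]             # clearly different group
p_same = permutation_pvalue(integrated, discipline)
p_diff = permutation_pvalue(integrated, shifted)
print(p_same, p_diff)
```

With identically distributed groups the permutation p-value is typically large (no significant difference, as in the study); with a 1.5-standard-deviation shift it collapses toward zero.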
SHARE: Statistical hadronization with resonances
NASA Astrophysics Data System (ADS)
Torrieri, G.; Steinke, S.; Broniowski, W.; Florkowski, W.; Letessier, J.; Rafelski, J.
2005-05-01
errors are independent, since the systematic error is not a random variable). Aside from χ, the program also calculates the statistical significance [2], defined as the probability that, given a "true" theory and a statistical (Gaussian) experimental error, the fitted χ assumes values at or above the considered value. If the best fit has a statistical significance well below unity, the model under consideration is very likely inappropriate. In the limit of many degrees of freedom (N), the statistical significance function depends only on χ/N, with 90% statistical significance at χ/N ∼ 1, and falling steeply at χ/N > 1. However, the degrees of freedom in fits involving ratios are generally not sufficient to reach the asymptotic limit. Hence, statistical significance depends strongly on χ and N separately. In particular, if N < 20, a χ/N significantly less than 1 is often required for a fit to have an acceptable statistical significance. The fit routine does not always find the true lowest χ minimum. Specifically, multi-parameter fits with too few degrees of freedom generally exhibit a non-trivial structure in parameter space, with several secondary minima, saddle points, valleys, etc. To help the user perform the minimization effectively, we have added tools to compute the χ contours and profiles. In addition, our program's flexibility allows for many strategies in performing the fit. It is therefore possible, by following the techniques described in Section 3.7, to scan the parameter space and ensure that the minimum found is the true one. Further systematic deviations between the model and experiment can be recognized via the program's output, which includes a particle-by-particle comparison between experiment and theory. Additional comments: In consideration of the wide stream of new data coming out from RHIC, there is an on-going activity, with several groups performing analysis of particle yields. It is our hope that SHARE will allow to
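The significance measure described here, the probability that a chi-squared variable with the given degrees of freedom meets or exceeds the observed value, can be estimated with a stdlib-only Monte Carlo sketch (purely illustrative; SHARE computes it analytically):

```python
import random

def chi2_significance_mc(chi2_obs, dof, trials=200_000, seed=1):
    """Estimate P(chi-squared_dof >= chi2_obs) by simulating sums of
    dof squared standard-normal deviates."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(dof))
        if s >= chi2_obs:
            hits += 1
    return hits / trials

# Classic 1-dof benchmark: P(chi^2 >= 3.841) is about 0.05.
p = chi2_significance_mc(3.841, 1)
# For dof = 10, chi^2/N = 1 gives a tail probability well below 90%,
# illustrating why significance depends on chi^2 and N separately.
p2 = chi2_significance_mc(10.0, 10, trials=50_000)
print(round(p, 3), round(p2, 3))
```

The second call shows concretely that the large-N intuition breaks down for few degrees of freedom, matching the abstract's warning about small-N fits.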
Achievements in Stratospheric Ozone Protection
This report describes achievements in protecting the ozone layer, the benefits of these achievements, and strategies involved (e.g., using alternatives to ozone-depleting substances, phasing out harmful substances, and creating partnerships).
ERIC Educational Resources Information Center
De Lisle, Jerome; Smith, Peter; Jules, Vena
2010-01-01
This study analyzed the spatial distribution of gender differentials in Mathematics and Language Arts on national assessments of educational achievement in the primary school system of the Republic of Trinidad and Tobago. The findings indicate statistically significant medium-sized differences favouring females on Language Arts primarily in the…
ERIC Educational Resources Information Center
Soyibo, Kola; Pinnock, Jacqueline
2005-01-01
This study aimed at establishing if the level of performance of 500 Jamaican Grade 11 students on an achievement test on the concept of respiration was satisfactory (mean = 28 or 70% and above) or not (less than 70%); if there were statistically significant differences in their performance on the concept linked to their gender, cognitive abilities…
NASA Astrophysics Data System (ADS)
Murdock, John
This study is a secondary analysis of data from the 1995 administration of the Third International Mathematics and Science Study (TIMSS). The purpose is to compare breadth, depth, and recurrence of the typical physics curriculum in the United States with the typical curricula in different countries and to determine if there are associations between these three curricular constructs and physics achievement. The first data analysis consisted of descriptive statistics (means, standard deviations, and standardized scores) for each of the three curricular variables. This analysis was used to compare the curricular profile in physics of the United States with the profiles of the other countries in the sample. The second data analysis consisted of six sets of correlations relating the three curricular variables with achievement. Five of the correlations were for the five physics content areas and the sixth was for all of physics. This analysis was used to determine if any associations exist between the three curricular constructs and achievement. The results show that the U.S. curriculum has low breadth, low depth, and high recurrence. The U.S. curricular profile was also found to be unique when compared with the profiles of the other countries in the sample. The only statistically significant correlation is between achievement and depth in a positive direction. The correlations between breadth and achievement and between recurrence and achievement were both not statistically significant. Based on the results of this study, depth of curriculum is the only curricular variable that is closely related to physics achievement for the TIMSS sample. Recurrence of curriculum is not related to physics achievement in TIMSS Population 3 countries. The results show no relationship between breadth and achievement, but the physics topics in the TIMSS content framework do not give a complete picture of breadth of physics curriculum in the participating countries. The unique curricular
Statistical learning and selective inference
Taylor, Jonathan; Tibshirani, Robert J.
2015-01-01
We describe the problem of “selective inference.” This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have “cherry-picked”—searched for the strongest associations—means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis. PMID:26100887
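The "cherry-picking" effect the authors describe is easy to demonstrate by simulation: screen many pure-noise features, keep the most significant, and the naive p-value of the winner is badly overstated (a stdlib sketch, not the authors' selective-inference procedure):

```python
import math
import random

def naive_p(z):
    """Two-sided normal p-value for a z-score, ignoring selection."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

rng = random.Random(3)
n_features, n_trials = 50, 2000
false_hits = 0
for _ in range(n_trials):
    # 50 pure-noise z-scores; select the strongest ("cherry-pick").
    best = max((rng.gauss(0, 1) for _ in range(n_features)), key=abs)
    if naive_p(best) < 0.05:
        false_hits += 1
rate = false_hits / n_trials
# Under the null the selected feature "looks significant" in roughly
# 1 - 0.95**50, about 92%, of trials rather than the nominal 5%.
print(round(rate, 2))
```

This is exactly why a "higher bar" is needed after searching for the strongest association.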
Perception in statistical graphics
NASA Astrophysics Data System (ADS)
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Significance of periodogram peaks
NASA Astrophysics Data System (ADS)
Süveges, Maria; Guy, Leanne; Zucker, Shay
2016-10-01
Three versions of significance measures or False Alarm Probabilities (FAPs) for periodogram peaks are presented and compared for sinusoidal and box-like signals, with specific application to large-scale surveys in mind.
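A FAP of the kind compared here can be sketched for the simplest case, white Gaussian noise at evenly spaced times, where the classical periodogram powers at the Fourier frequencies are independent Exp(1) variables and FAP(z) = 1 - (1 - e^(-z))^M for M inspected frequencies (a stdlib illustration, not the authors' estimators):

```python
import cmath
import math
import random

def max_periodogram_power(x):
    """Largest normalized periodogram power over the nonzero, sub-Nyquist
    Fourier frequencies; for white Gaussian noise each power is ~ Exp(1)."""
    n = len(x)
    powers = []
    for k in range(1, n // 2):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        powers.append(abs(s) ** 2 / n)
    return max(powers)

n, n_freq, z = 64, 31, 5.0
analytic_fap = 1.0 - (1.0 - math.exp(-z)) ** n_freq  # independent-peaks formula
rng = random.Random(11)
trials = 1000
exceed = sum(
    max_periodogram_power([rng.gauss(0, 1) for _ in range(n)]) > z
    for _ in range(trials)
)
emp_fap = exceed / trials
print(round(analytic_fap, 3), emp_fap)
```

The Monte Carlo rate tracks the analytic FAP closely in this idealized case; the paper's point is how such measures behave for realistic signals and sampling.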
Students’ Achievement Goals, Learning-Related Emotions and Academic Achievement
Lüftenegger, Marko; Klug, Julia; Harrer, Katharina; Langer, Marie; Spiel, Christiane; Schober, Barbara
2016-01-01
In the present research, the recently proposed 3 × 2 model of achievement goals is tested and associations with achievement emotions and their joint influence on academic achievement are investigated. The study was conducted with 388 students using the 3 × 2 Achievement Goal Questionnaire including the six proposed goal constructs (task-approach, task-avoidance, self-approach, self-avoidance, other-approach, other-avoidance) and the enjoyment and boredom scales from the Achievement Emotion Questionnaire. Exam grades were used as an indicator of academic achievement. Findings from CFAs provided strong support for the proposed structure of the 3 × 2 achievement goal model. Self-based goals, other-based goals and task-approach goals predicted enjoyment. Task-approach goals negatively predicted boredom. Task-approach and other-approach predicted achievement. The indirect effects of achievement goals through emotion variables on achievement were assessed using bias-corrected bootstrapping. No mediation effects were found. Implications for educational practice are discussed. PMID:27199836
Banerjee, Rabin; Majhi, Bibhas Ranjan
2010-06-15
Starting from the definition of entropy used in statistical mechanics we show that it is proportional to the gravity action. For a stationary black hole this entropy is expressed as S=E/2T, where T is the Hawking temperature and E is shown to be the Komar energy. This relation is also compatible with the generalized Smarr formula for mass.
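As a consistency check (not part of the abstract), the claimed relation S = E/2T reproduces the Bekenstein-Hawking area law for a Schwarzschild black hole in units G = c = ħ = k_B = 1, where the Komar energy equals M:

```latex
T = \frac{1}{8\pi M}, \qquad E_{\text{Komar}} = M
\;\Longrightarrow\;
S = \frac{E}{2T} = \frac{M}{2}\,(8\pi M) = 4\pi M^{2} = \frac{A}{4},
\qquad A = 16\pi M^{2}.
```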
Statistical Reasoning over Lunch
ERIC Educational Resources Information Center
Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.
2011-01-01
Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…
ERIC Educational Resources Information Center
Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah
2004-01-01
In order to learn the concept of statistical techniques one needs to run real experiments that generate reliable data. In practice, the data from some well-defined process or system is very costly and time consuming. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…
Analogies for Understanding Statistics
ERIC Educational Resources Information Center
Hocquette, Jean-Francois
2004-01-01
This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…
Structurally Sound Statistics Instruction
ERIC Educational Resources Information Center
Casey, Stephanie A.; Bostic, Jonathan D.
2016-01-01
The Common Core's Standards for Mathematical Practice (SMP) call for all K-grade 12 students to develop expertise in the processes and proficiencies of doing mathematics. However, the Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) as a whole addresses students' learning of not only mathematics but also statistics. This situation…
General Aviation Avionics Statistics.
1980-12-01
Report No. FAA-MS-80-7, December 1980. The report documentation page lists avionics equipment categories including: altimeter, compass, tachometer, oil temperature, emergency locator, fuel gage, landing gear, belts, and special equipment for over-water operation.
NACME Statistical Report 1986.
ERIC Educational Resources Information Center
Miranda, Luis A.; Ruiz, Esther
This statistical report summarizes data on enrollment and graduation of minority students in engineering degree programs from 1974 to 1985. First, an introduction identifies major trends and briefly describes the Incentive Grants Program (IGP), the nation's largest privately supported source of scholarship funds available to minority engineering…
ERIC Educational Resources Information Center
Barnes, Bernis, Ed.; And Others
This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…
ERIC Educational Resources Information Center
Office of the Assistant Secretary of Defense -- Comptroller (DOD), Washington, DC.
This document contains summaries of basic manpower statistical data for the Department of Defense, with the Army, Navy, Marine Corps, and Air Force totals shown separately and collectively. Included are figures for active duty military personnel, civilian personnel, reserve components, and retired military personnel. Some of the data show…
NASA Astrophysics Data System (ADS)
Williams, R. L.; Gateley, Wilson Y.
1993-05-01
This paper summarizes the statistical quality control methods and procedures that can be employed in mass producing electronic parts (integrated circuits, buffers, capacitors, connectors) to reduce variability and ensure performance to specified radiation, current, voltage, temperature, shock, and vibration levels. Producing such quality parts reduces uncertainties in performance and will aid materially in validating the survivability of components, subsystems, and systems to specified threats.
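A minimal Shewhart-style sketch of the kind of statistical quality control described (hypothetical measurements, not the paper's data): control limits at the baseline mean ± 3 standard deviations flag subgroup means that drift off target.

```python
import statistics

# Hypothetical in-spec subgroup means from a part-production line,
# plus one drifted subgroup at the end.
subgroup_means = [4.98, 5.01, 5.00, 4.99, 5.02, 5.00, 4.97, 5.03, 5.01, 5.20]

baseline = subgroup_means[:-1]            # history used to set the limits
center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# Any subgroup mean outside [lcl, ucl] signals excess variability.
flagged = [x for x in subgroup_means if not (lcl <= x <= ucl)]
print(flagged)
```

Only the drifted subgroup falls outside the limits; reducing process variability tightens the limits and the performance uncertainty, which is the paper's point.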
Statistics for Learning Genetics
ERIC Educational Resources Information Center
Charles, Abigail Sheena
2012-01-01
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…
Education Statistics Quarterly, 2003.
ERIC Educational Resources Information Center
Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie
2003-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
Quartiles in Elementary Statistics
ERIC Educational Resources Information Center
Langford, Eric
2006-01-01
The calculation of the upper and lower quartile values of a data set in an elementary statistics course is done in at least a dozen different ways, depending on the text or computer/calculator package being used (such as SAS, JMP, MINITAB, "Excel," and the TI-83 Plus). In this paper, we examine the various methods and offer a suggestion for a new…
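Python's standard library alone already exposes two of these quartile conventions, and they disagree on the same data (a quick illustration; the paper's catalogue of methods is broader):

```python
import statistics

data = [1, 2, 3, 4, 5, 6, 7]

# "Exclusive" quartiles (Minitab-style): cut points at i*(n+1)/4.
q_excl = statistics.quantiles(data, n=4, method="exclusive")

# "Inclusive" quartiles (interpolation including the extremes,
# matching numpy's default): cut points at 1 + (n-1)*i/4.
q_incl = statistics.quantiles(data, n=4, method="inclusive")

print(q_excl)  # [2.0, 4.0, 6.0]
print(q_incl)  # [2.5, 4.0, 5.5]
```

Same data, same "quartiles", different answers for Q1 and Q3, which is exactly the inconsistency across texts and packages that the article examines.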
Statistical Energy Analysis Program
NASA Technical Reports Server (NTRS)
Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.
1985-01-01
Statistical Energy Analysis (SEA) is a powerful tool for estimating high-frequency vibration spectra of complex structural systems and has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.
Library Research and Statistics.
ERIC Educational Resources Information Center
Lynch, Mary Jo; St. Lifer, Evan; Halstead, Kent; Fox, Bette-Lee; Miller, Marilyn L.; Shontz, Marilyn L.
2001-01-01
These nine articles discuss research and statistics on libraries and librarianship, including libraries in the United States, Canada, and Mexico; acquisition expenditures in public, academic, special, and government libraries; price indexes; state rankings of public library data; library buildings; expenditures in school library media centers; and…
Early predictors of high school mathematics achievement.
Siegler, Robert S; Duncan, Greg J; Davis-Kean, Pamela E; Duckworth, Kathryn; Claessens, Amy; Engel, Mimi; Susperreguy, Maria Ines; Chen, Meichu
2012-07-01
Identifying the types of mathematics content knowledge that are most predictive of students' long-term learning is essential for improving both theories of mathematical development and mathematics education. To identify these types of knowledge, we examined long-term predictors of high school students' knowledge of algebra and overall mathematics achievement. Analyses of large, nationally representative, longitudinal data sets from the United States and the United Kingdom revealed that elementary school students' knowledge of fractions and of division uniquely predicts those students' knowledge of algebra and overall mathematics achievement in high school, 5 or 6 years later, even after statistically controlling for other types of mathematical knowledge, general intellectual ability, working memory, and family income and education. Implications of these findings for understanding and improving mathematics learning are discussed.
Statistics for Learning Genetics
NASA Astrophysics Data System (ADS)
Charles, Abigail Sheena
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed, one located in the northeastern US and the other in the West Indies. The sample size was 42 students, and a supplementary interview was administered to 9 selected students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as in genetics syllabi used by instructors, do not help the issue. It was found that the textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistical/mathematical topics also held for the genetics syllabi reviewed for this study. Nonetheless
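The kind of probability question students struggled with can be made concrete with a standard genetics computation, exact binomial versus its normal approximation for a 3:1 Mendelian phenotype ratio (an illustrative example, not an item from the study's survey):

```python
import math

def binom_pmf(k, n, p):
    """Exact binomial probability of k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def normal_cdf(x, mu, sigma):
    """Gaussian CDF via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Aa x Aa cross: P(dominant phenotype) = 3/4.  Probability of seeing
# at least 60 dominant offspring out of 80.
n, p, k = 80, 0.75, 60
exact = sum(binom_pmf(i, n, p) for i in range(k, n + 1))
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
approx = 1 - normal_cdf(k - 0.5, mu, sigma)  # continuity correction
print(round(exact, 3), round(approx, 3))
```

The two answers agree closely here, which is the connection between the binomial genetics setting and the normal distribution that the surveyed students found difficult.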
T1 VSAT Fade Compensation Statistical Results
NASA Technical Reports Server (NTRS)
Johnson, Sandra K.; Acosta, Roberto; Ugweje, Oke
2000-01-01
New satellite communication systems are steadily seeking to use higher frequency bands to accommodate the requirements for additional capacity. At these higher frequencies, propagation impairments that did not significantly affect the signal at lower frequencies begin to have considerable impact. In Ka-band, the next logical commercial frequency band to be used for satellite communication, attenuation of the signal due to rain is a primary concern. An experimental satellite built by NASA, the Advanced Communication Technology Satellite (ACTS), launched in September 1993, is the first U.S. communication satellite operating in the Ka-band. In addition to higher carrier frequencies, a number of other new technologies, including on-board baseband processing, multiple beam antennas, and rain fade detection and compensation techniques, were designed into the ACTS. Verification experiments have been conducted since the launch to characterize the new technologies. The focus of this paper is to characterize the method used by the ACTS T1 Very Small Aperture Terminal (T1 VSAT) ground stations in detecting the presence of fade in the communication signal and to adaptively compensate for it by the addition of burst rate reduction and forward error correction. Measured data obtained from the ACTS program were used to validate the compensation technique. A software process was developed and demonstrated to statistically characterize the increased availability achieved by the compensation techniques in terms of the bit error rate time enhancement factor. Several improvements to the ACTS technique are discussed and possible implementations for future Ka-band systems are offered.
Statistical pressure snakes based on color images.
Schaub, Hanspeter
2004-05-01
The traditional mono-color statistical pressure snake was modified to function on a color image with target errors defined in HSV color space. Large variations in target lighting and shading are permitted if the target color is only specified in terms of hue. This method works well with custom targets where the target is surrounded by a color of a very different hue. A significant robustness increase is achieved in the computer vision capability to track a specific target in an unstructured, outdoor environment. By specifying the target color to contain hue, saturation and intensity values, it is possible to establish a reasonably robust method to track general image features of a single color. This method is convenient to allow the operator to select arbitrary targets, or sections of a target, which have a common color. Further, a modification to the standard pixel averaging routine is introduced which allows the target to be specified not only in terms of a single color, but also using a list of colors. These algorithms were tested and verified by using a web camera attached to a personal computer.
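The hue-only error measure described above can be sketched with the standard library's colorsys: two shades of the same red differ widely in intensity but share the same hue, so a hue-based target error treats them as the same target color (illustrative only, not the paper's snake implementation):

```python
import colorsys

def hue_error(rgb, target_hue):
    """Angular distance between a pixel's hue and the target hue (both in
    [0, 1)), wrapping around the hue circle."""
    h, _s, _v = colorsys.rgb_to_hsv(*rgb)
    d = abs(h - target_hue)
    return min(d, 1.0 - d)

target_hue = 0.0                 # red
bright_red = (1.0, 0.1, 0.1)     # well-lit target pixel
shaded_red = (0.4, 0.04, 0.04)   # same target in shadow
green = (0.1, 0.9, 0.1)          # background of a very different hue

print(hue_error(bright_red, target_hue),
      hue_error(shaded_red, target_hue),
      hue_error(green, target_hue))
```

Both reds score near-zero hue error despite the lighting change, while the green background is far away on the hue circle, which is the robustness property the abstract exploits.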
Statistical aspects of solar flares
NASA Technical Reports Server (NTRS)
Wilson, Robert M.
1987-01-01
A survey of the statistical properties of 850 H alpha solar flares during 1975 is presented. Comparison of the results found here with those reported elsewhere for different epochs is accomplished. Distributions of rise time, decay time, and duration are given, as are the mean, mode, median, and 90th percentile values. Proportions by selected groupings are also determined. For flares in general, mean values for rise time, decay time, and duration are 5.2 ± 0.4 min, and 18.1 ± 1.1 min, respectively. Subflares, accounting for nearly 90 percent of the flares, had mean values lower than those found for flares of H alpha importance greater than 1, and the differences are statistically significant. Likewise, flares of bright and normal relative brightness have mean values of decay time and duration that are significantly longer than those computed for faint flares, and mass-motion related flares are significantly longer than non-mass-motion related flares. Seventy-three percent of the mass-motion related flares are categorized as being a two-ribbon flare and/or being accompanied by a high-speed dark filament. Slow rise time flares (rise time greater than 5 min) have a mean value for duration that is significantly longer than that computed for fast rise time flares, and long-lived duration flares (duration greater than 18 min) have a mean value for rise time that is significantly longer than that computed for short-lived duration flares, suggesting a positive linear relationship between rise time and duration for flares. Monthly occurrence rates for flares in general and by group are found to be linearly related in a positive sense to monthly sunspot number. Statistical testing reveals the association between sunspot number and numbers of flares to be significant at the 95 percent level of confidence, and the t statistic for slope is significant at greater than the 99 percent level of confidence. Dependent upon the specific fit, between 58 percent and 94 percent of
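The slope test mentioned in the abstract, regressing monthly flare counts on sunspot number and testing the slope's t statistic, has a compact closed form (a sketch on synthetic data; the 1975 counts are not reproduced here):

```python
import math
import random

def slope_t(x, y):
    """OLS slope and its t statistic for a simple linear regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    resid_ss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    se_b = math.sqrt(resid_ss / (n - 2) / sxx)   # standard error of the slope
    return b, b / se_b

rng = random.Random(5)
sunspots = [rng.uniform(10, 60) for _ in range(12)]    # synthetic monthly R
flares = [2.0 * r + rng.gauss(0, 8) for r in sunspots]  # built-in positive link
b, t = slope_t(sunspots, flares)
print(round(b, 2), round(t, 1))
```

A t statistic this large for 10 residual degrees of freedom is significant well beyond the 99 percent confidence level, the same style of conclusion drawn in the survey.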
NASA Technical Reports Server (NTRS)
Black, D. C.
1986-01-01
The significance of brown dwarfs for resolving some major problems in astronomy is discussed. The importance of brown dwarfs for models of star formation by fragmentation of molecular clouds and for obtaining independent measurements of the ages of stars in binary systems is addressed. The relationship of brown dwarfs to planets is considered.
The Statistical Drake Equation
NASA Astrophysics Data System (ADS)
Maccone, Claudio
2010-12-01
We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
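The CLT argument in the abstract can be checked numerically: the log of a product of independent positive factors is a sum of logs, which is approximately normal, so the product is approximately lognormal whatever the factor distributions. A minimal simulation with seven illustrative, entirely made-up factor distributions:

```python
import math
import random
import statistics

random.seed(0)

def drake_sample():
    """Product of seven positive random variables with arbitrary
    (here: uniform and triangular) distributions -- illustrative only,
    not the factor values from the paper."""
    factors = [
        random.uniform(1.0, 3.0),
        random.uniform(0.2, 1.0),
        random.triangular(0.5, 2.0),
        random.uniform(0.1, 0.5),
        random.triangular(0.1, 1.0),
        random.uniform(0.01, 0.2),
        random.uniform(100, 10000),
    ]
    return math.prod(factors)

# If the product is approximately lognormal, its log is approximately
# normal: the skewness of the log should be close to 0
logs = [math.log(drake_sample()) for _ in range(20000)]
m, s = statistics.fmean(logs), statistics.stdev(logs)
skew = statistics.fmean([((v - m) / s) ** 3 for v in logs])
print(f"log-mean={m:.2f}, log-sd={s:.2f}, skewness={skew:.3f}")
```

The skewness of the summed logs is much smaller than that of any individual heavy-tailed factor, which is the CLT dilution effect the statistical Drake equation relies on.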
NASA Astrophysics Data System (ADS)
Maccone, C.
This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable Planets for Man (1964). The statistical generalization of the original, by now too simplistic, Dole equation is obtained by replacing a product of ten positive numbers with the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of statistics, which states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (neither of which assumes the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH, it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
Neuroanatomical Correlates of the Income Achievement Gap
Mackey, Allyson P.; Finn, Amy S.; Leonard, Julia A.; Jacoby Senghor, Drew S.; West, Martin R.; Gabrieli, Christopher F.O.; Gabrieli, John D. E.
2015-01-01
In the United States, the difference in academic achievement between higher- and lower-income students (i.e., the income achievement gap) is substantial and growing. Here, we investigated neuroanatomical correlates of this gap in adolescents (n = 58) in whom academic achievement was measured by statewide standardized testing. Cortical gray matter volume was significantly greater in students from higher-income backgrounds (n = 35) compared to students from lower-income backgrounds (n = 23), but cortical white matter volume and total cortical surface area did not differ between groups. Cortical thickness in all lobes of the brain was greater in students from higher-income than lower-income backgrounds. Thicker cortex, particularly in temporal and occipital lobes, was associated with better test performance. These results represent the first evidence that cortical thickness differs across broad swaths of the brain between higher- and lower-income students, and that cortical thickness is related to academic achievement test scores. PMID:25896418
Statistical considerations for preclinical studies.
Aban, Inmaculada B; George, Brandon
2015-08-01
Research studies must always have proper planning, conduct, analysis and reporting in order to preserve scientific integrity. Preclinical studies, the first stage of the drug development process, are no exception to this rule. The decision to advance to clinical trials in humans relies on the results of these studies. Recent observations show that a significant number of preclinical studies lack rigor in their conduct and reporting. This paper discusses statistical aspects, such as design, sample size determination, and methods of analyses, that will help add rigor and improve the quality of preclinical studies.
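As an illustration of the sample-size determination the authors discuss, here is the standard normal-approximation formula for a two-sided, two-sample comparison of means; the effect size, alpha, and power below are illustrative defaults, not values from the paper:

```python
import math
from statistics import NormalDist

def two_sample_n(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided two-sample z-test
    detecting a mean difference `delta` with common SD `sigma`:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    n = 2 * ((z_a + z_b) * sigma / delta) ** 2
    return math.ceil(n)

# detecting a half-SD effect at 80% power: the classic answer is ~63/group
print(two_sample_n(delta=0.5, sigma=1.0))
```

A t-based calculation gives a slightly larger n; the z approximation shown here is the usual starting point when planning a preclinical study.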
Statistics, Uncertainty, and Transmitted Variation
Wendelberger, Joanne Roth
2014-11-05
The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
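The idea of variation transmitted through a response function can be sketched with the first-order delta method, Var(f(X)) ≈ f'(μ)² Var(X), checked against Monte Carlo. The response function below is a toy example, not one from the presentation:

```python
import random
import statistics

def response(x):
    return x ** 2 + 3 * x        # toy response function f(x)

def d_response(x):
    return 2 * x + 3             # its derivative f'(x)

mu, sd = 2.0, 0.1
# First-order (delta-method) approximation of the transmitted SD:
# sd_out ~= |f'(mu)| * sd_in
sd_approx = abs(d_response(mu)) * sd

# Monte Carlo check: push input variation through the response function
random.seed(0)
ys = [response(random.gauss(mu, sd)) for _ in range(50000)]
sd_mc = statistics.stdev(ys)
print(f"delta method: {sd_approx:.3f}, Monte Carlo: {sd_mc:.3f}")
```

For small input variation the two agree closely; the gap grows as the response function's curvature matters more, which is where higher-order or simulation-based treatments of transmitted variation come in.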
Assessment of the effect of online homework on achievement in chemistry
NASA Astrophysics Data System (ADS)
El-Labban, Wassim Nabil
The purpose of the present study was to analyze the relationship between the OWL homework and achievement, to analyze the relationship between attitudes towards OWL and achievement, and to find out the extent of OWL's popularity among students. Archival data on 115 students enrolled in the fall 1998 semester, 82 students enrolled in the fall 2000 semester, and 73 students enrolled in the fall 2001 semester were used in this study. In addition, 217 students enrolled in the fall 2002 semester participated in the study by completing an attitudinal survey that was developed by the researcher. The first finding was that there was no statistically significant difference between the ACS final exam scores of students who used OWL for homework and those of students who did written homework. This finding supported earlier reports suggesting that online homework did no harm to student learning. The second finding was that a significant correlation existed between scores on the OWL homework and the ACS final exam scores, suggesting that online homework, among other factors, played a positive role in student learning. The third finding was that there was no statistically significant correlation between the attitude towards OWL and the scores on the ACS final exam. This finding did not support previous reports that related attitudes to achievement in chemistry. The attitude towards OWL, however, did positively correlate with the achievement scores on OWL, making this finding, unlike the one before it, supportive of previous reports relating attitudes to achievement in chemistry. Finally, concerning the popularity of OWL among students, the results of the attitudinal survey showed that OWL was a popular online homework system in spite of some problems that students reported they faced while using it.
Statistical Inference: The Big Picture.
Kass, Robert E
2011-02-01
Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.
Composite Defect Significance.
1982-07-13
Composite Defect Significance (U). Materials Sciences Corp., Spring House, PA; S. N. Chatterjee et al.; 13 Jul 82; Report MSC/TFR/1288/il87; NADC-80848...
NASA Astrophysics Data System (ADS)
Dunbar, P. K.; Furtney, M.; McLean, S. J.; Sweeney, A. D.
2014-12-01
Tsunamis have inflicted death and destruction on the coastlines of the world throughout history. The occurrence of tsunamis and the resulting effects have been collected and studied as far back as the second millennium B.C. The knowledge gained from cataloging and examining these events has led to significant changes in our understanding of tsunamis, tsunami sources, and methods to mitigate the effects of tsunamis. The most significant, not surprisingly, are often the most devastating, such as the 2011 Tohoku, Japan earthquake and tsunami. The goal of this poster is to give a brief overview of the occurrence of tsunamis and then focus specifically on several significant tsunamis. There are various criteria for determining the most significant tsunamis: the number of deaths, amount of damage, maximum runup height, major impact on tsunami science or policy, etc. As a result, descriptions will include some of the most costly (2011 Tohoku, Japan), the most deadly (2004 Sumatra, 1883 Krakatau), and the highest runup ever observed (1958 Lituya Bay, Alaska). The discovery of the Cascadia subduction zone as the source of the 1700 Japanese "Orphan" tsunami, and a future tsunami threat to the U.S. northwest coast, contributed to the decision to form the U.S. National Tsunami Hazard Mitigation Program. The great Lisbon earthquake of 1755 marked the beginning of the modern era of seismology. Knowledge gained from the 1964 Alaska earthquake and tsunami helped confirm the theory of plate tectonics. The 1946 Alaska, 1952 Kuril Islands, 1960 Chile, 1964 Alaska, and the 2004 Banda Aceh tsunamis all resulted in warning centers or systems being established. The data descriptions on this poster were extracted from NOAA's National Geophysical Data Center (NGDC) global historical tsunami database. Additional information about these tsunamis, as well as water level data, can be found by accessing the NGDC website www.ngdc.noaa.gov/hazard/
ERIC Educational Resources Information Center
Koparan, Timur
2016-01-01
In this study, the effect on the achievement and attitudes of prospective teachers is examined. To this end, an achievement test, an attitude scale for statistics, and interviews were used as data collection tools. The achievement test comprises 8 problems based on statistical data, and the attitude scale comprises 13 Likert-type items. The study…
The Mechanics of Human Achievement.
Duckworth, Angela L; Eichstaedt, Johannes C; Ungar, Lyle H
2015-07-01
Countless studies have addressed why some individuals achieve more than others. Nevertheless, the psychology of achievement lacks a unifying conceptual framework for synthesizing these empirical insights. We propose organizing achievement-related traits by two possible mechanisms of action: Traits that determine the rate at which an individual learns a skill are talent variables and can be distinguished conceptually from traits that determine the effort an individual puts forth. This approach takes inspiration from Newtonian mechanics: achievement is akin to distance traveled, effort to time, skill to speed, and talent to acceleration. A novel prediction from this model is that individual differences in effort (but not talent) influence achievement (but not skill) more substantially over longer (rather than shorter) time intervals. Conceptualizing skill as the multiplicative product of talent and effort, and achievement as the multiplicative product of skill and effort, advances similar, but less formal, propositions by several important earlier thinkers.
The Mechanics of Human Achievement
Duckworth, Angela L.; Eichstaedt, Johannes C.; Ungar, Lyle H.
2015-01-01
Countless studies have addressed why some individuals achieve more than others. Nevertheless, the psychology of achievement lacks a unifying conceptual framework for synthesizing these empirical insights. We propose organizing achievement-related traits by two possible mechanisms of action: Traits that determine the rate at which an individual learns a skill are talent variables and can be distinguished conceptually from traits that determine the effort an individual puts forth. This approach takes inspiration from Newtonian mechanics: achievement is akin to distance traveled, effort to time, skill to speed, and talent to acceleration. A novel prediction from this model is that individual differences in effort (but not talent) influence achievement (but not skill) more substantially over longer (rather than shorter) time intervals. Conceptualizing skill as the multiplicative product of talent and effort, and achievement as the multiplicative product of skill and effort, advances similar, but less formal, propositions by several important earlier thinkers. PMID:26236393
Statistical evaluation of forecasts.
Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J; Timmer, Jens; Schelter, Björn
2014-08-01
Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
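The comparison against a random predictor can be sketched in a simplified discrete-time setting. The event rate, alarm probabilities, and restriction to sensitivity below are illustrative assumptions for the sketch, not the paper's exact framework (which also handles specificity and alarm-duration windows):

```python
import random

random.seed(42)
T = 10000                                   # discrete time steps
events = [random.random() < 0.02 for _ in range(T)]

# toy "method": alarm probability depends on whether an event occurs;
# the random predictor raises alarms at the same overall rate,
# independently of the events
method_alarms = [(random.random() < 0.7) if e else (random.random() < 0.05)
                 for e in events]
rate = sum(method_alarms) / T
random_alarms = [random.random() < rate for _ in range(T)]

def sensitivity(alarms, events):
    """Fraction of events for which an alarm was raised."""
    hits = sum(1 for a, e in zip(alarms, events) if a and e)
    return hits / sum(events)

s_method = sensitivity(method_alarms, events)
s_random = sensitivity(random_alarms, events)
print(f"method: {s_method:.2f}, random predictor: {s_random:.2f}")
```

Only if the method's sensitivity clearly exceeds what a rate-matched random predictor achieves can the forecasts be considered validated, which is the comparison the abstract formalizes analytically.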
Thacker, Michael A.; Moseley, G. Lorimer
2017-01-01
Perception is seen as a process that utilises partial and noisy information to construct a coherent understanding of the world. Here we argue that the experience of pain is no different; it is based on incomplete, multimodal information, which is used to estimate potential bodily threat. We outline a Bayesian inference model, incorporating the key components of cue combination, causal inference, and temporal integration, which highlights the statistical problems in everyday perception. It is from this platform that we are able to review the pain literature, providing evidence from experimental, acute, and persistent phenomena to demonstrate the advantages of adopting a statistical account in pain. Our probabilistic conceptualisation suggests a principles-based view of pain, explaining a broad range of experimental and clinical findings and making testable predictions. PMID:28081134
Statistical evaluation of forecasts
NASA Astrophysics Data System (ADS)
Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn
2014-08-01
Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
1979 DOE statistical symposium
Gardiner, D.A.; Truett, T.
1980-09-01
The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.
Relativistic statistical arbitrage
NASA Astrophysics Data System (ADS)
Wissner-Gross, A. D.; Freer, C. E.
2010-11-01
Recent advances in high-frequency financial trading have made light propagation delays between geographically separated exchanges relevant. Here we show that there exist optimal locations from which to coordinate the statistical arbitrage of pairs of spacelike separated securities, and calculate a representative map of such locations on Earth. Furthermore, trading local securities along chains of such intermediate locations results in a novel econophysical effect, in which the relativistic propagation of tradable information is effectively slowed or stopped by arbitrage.
Statistical Methods in Cosmology
NASA Astrophysics Data System (ADS)
Verde, L.
2010-03-01
The advent of large data sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe has changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data in which a host of useful information is enclosed, but encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged to a standard cosmological model (the LCDM model), we should keep in mind that this model is described by 10 or more physical parameters, and if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both to test for possible disagreements (which could indicate new physics) and to improve parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing the experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics, as there are several good books, which are reported in the references. The reader should refer to those.
Statistical Challenges of Astronomy
NASA Astrophysics Data System (ADS)
Feigelson, Eric D.; Babu, G. Jogesh
Digital sky surveys, data from orbiting telescopes, and advances in computation have increased the quantity and quality of astronomical data by several orders of magnitude in recent years. Making sense of this wealth of data requires sophisticated statistical and data analytic techniques. Fortunately, statistical methodologies have similarly made great strides in recent years. Powerful synergies thus emerge when astronomers and statisticians join in examining astrostatistical problems and approaches. The volume focuses on several themes: · The increasing power of Bayesian approaches to modeling astronomical data · The growth of enormous databases, leading an emerging federated Virtual Observatory, and their impact on modern astronomical research · Statistical modeling of critical datasets, such as galaxy clustering and fluctuations in the microwave background radiation, leading to a new era of precision cosmology · Methodologies for uncovering clusters and patterns in multivariate data · The characterization of multiscale patterns in imaging and time series data As in earlier volumes in this series, research contributions discussing topics in one field are joined with commentary from scholars in the other. Short contributed papers covering dozens of astrostatistical topics are also included.
Statistics in fusion experiments
NASA Astrophysics Data System (ADS)
McNeill, D. H.
1997-11-01
Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak (M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.). The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. "Correlated" spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR (D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, statements to U.S. Congress (1989)). The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would have confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).
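The closing recommendation, that reported values be accompanied by confidence intervals and other descriptive statistics, can be illustrated with toy numbers (not TFTR data): an outlier read in isolation tells a very different story than the mean with a 95% interval.

```python
import math
import statistics

# toy "shot" data: many routine values plus a few high outliers
yields = [38, 41, 40, 39, 42, 37, 40, 43, 41, 39, 98, 102]

mean = statistics.fmean(yields)
sd = statistics.stdev(yields)
n = len(yields)
# normal-approximation 95% CI for the mean (a t interval would be wider)
half = 1.96 * sd / math.sqrt(n)
print(f"max (outlier) = {max(yields)}")
print(f"mean = {mean:.1f}, 95% CI = ({mean - half:.1f}, {mean + half:.1f})")
```

Reporting only the maximum would overstate typical performance by roughly a factor of two in this made-up example, which is precisely the selection effect the abstract criticizes.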
A Validity Study: Attitudes towards Statistics among Japanese College Students
ERIC Educational Resources Information Center
Satake, Eike
2015-01-01
This cross-cultural study investigated the relationship between attitudes toward statistics (ATS) and course achievement (CA) among Japanese college students. The sample consisted of 135 male and 134 female students from the first two-year liberal arts program of a four-year college in Tokyo, Japan. Attitudes about statistics were measured using…
Fast correspondences for statistical shape models of brain structures
NASA Astrophysics Data System (ADS)
Bernard, Florian; Vlassis, Nikos; Gemmar, Peter; Husch, Andreas; Thunberg, Johan; Goncalves, Jorge; Hertel, Frank
2016-03-01
Statistical shape models based on point distribution models are powerful tools for image segmentation or shape analysis. The most challenging part in the generation of point distribution models is the identification of corresponding landmarks among all training shapes. Since in general the true correspondences are unknown, correspondences are frequently established under the hypothesis that correct correspondences lead to a compact model, which is mostly tackled by continuous optimisation methods. In favour of the prospect of an efficient optimisation, we present a simplified view of the correspondence problem for statistical shape models that is based on point-set registration, the linear assignment problem and mesh fairing. At first, regularised deformable point-set registration is performed and combined with solving the linear assignment problem to obtain correspondences between shapes on a global scale. With that, rough correspondences are established that may not yet be accurate on a local scale. Then, by using a mesh fairing procedure, consensus of the correspondences on a global and local scale among the entire set of shapes is achieved. We demonstrate that for the generation of statistical shape models of deep brain structures, the proposed approach is preferable over existing population-based methods both in terms of a significantly shorter runtime and in terms of an improved quality of the resulting shape model.
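The linear-assignment step used to match landmarks between shapes can be sketched on a toy pair of point sets. Production pipelines use the Hungarian algorithm (e.g. SciPy's linear_sum_assignment), but the brute-force O(n!) search below is enough to show the idea; the point sets are made up:

```python
import math
from itertools import permutations

def assign(points_a, points_b):
    """Brute-force linear assignment minimising total squared distance.
    Only for tiny illustrative point sets; real code uses the
    Hungarian algorithm for polynomial-time assignment."""
    n = len(points_a)
    best_cost, best_perm = math.inf, None
    for perm in permutations(range(n)):
        cost = sum((points_a[i][0] - points_b[j][0]) ** 2 +
                   (points_a[i][1] - points_b[j][1]) ** 2
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_perm, best_cost

# two toy "shapes": B is A shifted, with its points listed out of order
A = [(0, 0), (1, 0), (1, 1), (0, 1)]
B = [(1.1, 1.1), (0.1, 0.1), (0.1, 1.1), (1.1, 0.1)]
perm, cost = assign(A, B)
print(perm)
```

Each point of A is matched to its shifted counterpart in B despite the scrambled ordering; in the paper's pipeline this global matching is then refined locally by mesh fairing.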
Statistical Inference at Work: Statistical Process Control as an Example
ERIC Educational Resources Information Center
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
U-statistic with side information.
Yuan, Ao; He, Wenqing; Wang, Binhuan; Qin, Gengsheng
2012-10-01
In this paper we study U-statistics with side information incorporated using the method of empirical likelihood. Some basic properties of the proposed statistics are investigated. We find that by implementing the side information properly, the proposed U-statistics can have smaller asymptotic variance than the existing U-statistics in the literature. The proposed U-statistics can achieve asymptotic efficiency in a formal sense and their weak limits admit a convolution result. We also find that the corresponding U-likelihood ratio procedure, as well as the U-empirical likelihood based confidence interval construction, do not benefit from incorporating side information, a result that is consistent with the result under the standard empirical likelihood ratio procedure. The impact of incorrect side information implementation in the proposed U-statistics is also explored. Simulation studies are conducted to assess the finite sample performance of the proposed method. The numerical results show that with side information implemented, the reduction in asymptotic variance can be substantial in some cases, and the coverage probability of the confidence interval using the U-empirical likelihood ratio based method outperforms that of the normal approximation based method, in particular in cases when the underlying distribution is skewed.
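As a concrete example of a degree-2 U-statistic (the object this paper extends with side information), the unbiased sample variance is the average of the kernel h(x, y) = (x - y)^2 / 2 over all pairs. This is standard textbook material, not code from the paper:

```python
import statistics
from itertools import combinations

def u_statistic(data, kernel):
    """Degree-2 U-statistic: average of a symmetric kernel over all
    unordered pairs of observations."""
    pairs = list(combinations(data, 2))
    return sum(kernel(x, y) for x, y in pairs) / len(pairs)

# The unbiased sample variance is the U-statistic with kernel (x-y)^2/2
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
u = u_statistic(data, lambda x, y: (x - y) ** 2 / 2)
print(u, statistics.variance(data))   # identical by construction
```

The empirical-likelihood approach in the paper reweights the observations using the side information rather than averaging pairs uniformly, which is where the variance reduction comes from.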
U-statistic with side information
Yuan, Ao; He, Wenqing; Wang, Binhuan; Qin, Gengsheng
2013-01-01
In this paper we study U-statistics with side information incorporated using the method of empirical likelihood. Some basic properties of the proposed statistics are investigated. We find that by implementing the side information properly, the proposed U-statistics can have smaller asymptotic variance than the existing U-statistics in the literature. The proposed U-statistics can achieve asymptotic efficiency in a formal sense and their weak limits admit a convolution result. We also find that the corresponding U-likelihood ratio procedure, as well as the U-empirical likelihood based confidence interval construction, do not benefit from incorporating side information, a result that is consistent with the result under the standard empirical likelihood ratio procedure. The impact of incorrect side information implementation in the proposed U-statistics is also explored. Simulation studies are conducted to assess the finite sample performance of the proposed method. The numerical results show that with side information implemented, the reduction in asymptotic variance can be substantial in some cases, and the coverage probability of the confidence interval using the U-empirical likelihood ratio based method outperforms that of the normal approximation based method, in particular in cases when the underlying distribution is skewed. PMID:23704796
NASA Astrophysics Data System (ADS)
Demirci, Neset
The main goal of this study was to investigate the effects of a web-based physics software program on students' achievement and misconceptions in force and motion concepts. During the fall of 1999, a total of 125 students (54.4% female and 45.6% male) from two public high schools in Brevard County, Florida, were selected through convenience sampling to participate in this quasi-experimental study. The MANCOVA analysis yielded a significant interaction between pretest (covariate = prior physics knowledge) and gender for each dependent variable (Y1 = Achievement, Y2 = Misconception). Thus, the test for homogeneity of regression failed, rendering the MANCOVA model invalid. As a result, separate ATIs were performed for each dependent variable. The ATI interaction between pretest and gender relative to achievement and misconception was significant. Of the six initial hypotheses, only hypothesis 2, which examined differences in group misconception scores, was rejected. Specifically, group membership contributed 12.6% additional knowledge of posttest misconception score variability, which was statistically significant (F1,9 = 20.03, p < .05). Based on this result, it can be concluded that incorporating the web-based physics program with traditional lecturing did have a significant effect on dispelling students' physics misconceptions about force and motion concepts. Thus, only the test for this hypothesis and the two interactions, which were not initially considered as research hypotheses, were significant. All other tests of hypotheses were not statistically significant and hence were not rejected.
Multilingualism, Mathematics Achievement and Instructional Language Policy
ERIC Educational Resources Information Center
Garrett, Rachel Singal
2010-01-01
A significant and growing proportion of students in the United States speak primarily a non-English language at home. This dissertation contributes to the understanding of academic achievement patterns among language minority students in the United States. The first essay uses data from the Early Childhood Longitudinal Survey Kindergarten Class…
Building Fires: Raising Achievement through Class Discussion
ERIC Educational Resources Information Center
Kahn, Elizabeth
2007-01-01
According to a growing body of research, discussion-based instruction, in the context of high academic demands, significantly enhances student achievement in reading. The effects apply to below- as well as above-average-ability students. These findings confirm what secondary English teachers have believed all along about the value of discussion.…
Sickle Cell Trait and Scholastic Achievement
ERIC Educational Resources Information Center
Jackson, Yvonne; Ayrer, James
1974-01-01
In a preliminary study, no significant interaction effects were found between scholastic achievement and sickle cell trait in black children currently in eighth and ninth grades, as measured by the Iowa Tests of Basic Skills over a consecutive period of four years, 1968 through 1971, grades four through seven. (EH)
NRL SSD Research Achievements: 19902000. Volume 4
2015-10-30
stratCAT. Projected future increases in computing power offer no prospect of solving this fundamental resolution constraint. Thus novel new...extraordinary ranges of research and results have been achieved. To document significant SSD historical accomplishments, Drs. George Doschek and...Howard
Creativity: The Hub of Real Achievement
ERIC Educational Resources Information Center
Forster, Jill
2012-01-01
The aim of this article is to encourage a greater emphasis on creativity across and between varied fields of endeavour. It has been written to underline the interdisciplinary significance of creativity and the role of creativity in truly enhancing achievement. There is a reinvigorated awareness of the need for "big thinking", a global…
NASA Astrophysics Data System (ADS)
Rachmatullah, Arif; Diana, Sariwulan; Rustaman, Nuryani Y.
2016-02-01
Along with the development of science and technology, the basic abilities to read, write, and count are no longer enough to survive in a modern era surrounded by the products of science and technology. Scientific literacy is an ability that might be added as a basic ability for people in the modern era. Recently, Fives et al. developed a new scientific literacy assessment for students, named the SLA (Scientific Literacy Assessment). A pilot study using the SLA was conducted to investigate the profile of scientific literacy achievement of 223 middle school students in Sumedang, and to compare outcomes between genders (159 girls and 64 boys) and school accreditations (A and B), using a quantitative method with a descriptive research school survey. Based on the results, the average scientific literacy achievement of Sumedang middle school students is 45.21, classified as the low category. Of the five components of scientific literacy, only one, science motivation and beliefs, is in the medium category; the other four components are in the low and very low categories. Boys have higher scientific literacy, but the difference is not statistically significant. Students' scientific literacy in A-accredited schools is higher than in B-accredited schools, and the difference is statistically significant. Recommendations for further research are: involve more research subjects, add more questions for each indicator, and conduct independent research for each component.
Machtay; Glatstein
1998-01-01
have shown overall survivals superior to age-matched controls). It is fallacious and illogical to compare nonrandomized series of observation to those of aggressive therapy. In addition to the above problem, the use of DSS introduces another potential issue which we will call the bias of cause-of-death interpretation. All statistical endpoints (e.g., response rates, local-regional control, freedom from brain metastases), except OS, are known to depend heavily on the methods used to define the endpoint and are often subject to significant interobserver variability. There is no reason to believe that this problem does not occasionally occur with respect to defining a death as due to the index cancer or to intercurrent disease, even though this issue has been poorly studied. In many oncologic situations, for example, metastatic lung cancer, this form of bias does not exist. In some situations, such as head and neck cancer, this could be an intermediate problem (Was that lethal chest tumor a second primary or a metastasis? Would the fatal aspiration pneumonia have occurred if he still had a tongue? And what about Mr. B. described above?). In some situations, particularly relatively "good prognosis" neoplasms, this could be a substantial problem, particularly if the adjudication of whether or not a death is cancer-related is performed solely by researchers who have an "interest" in demonstrating a good DSS. What we are most concerned about with this form of bias relates to recent series on observation, such as in early prostate cancer. It is interesting to note that although only 10% of the "observed" patients die from prostate cancer, many develop distant metastases by 10 years (approximately 40% among patients with intermediate grade tumors). Thus, it is implied that prostate cancer metastases are usually not of themselves lethal, which is a misconception to anyone experienced in taking care of prostate cancer patients. This is inconsistent with U.S. studies of
Student Health and Academic Achievement
Correcting a Significance Test for Clustering
ERIC Educational Resources Information Center
Hedges, Larry V.
2007-01-01
A common mistake in analysis of cluster randomized trials is to ignore the effect of clustering and analyze the data as if each treatment group were a simple random sample. This typically leads to an overstatement of the precision of results and anticonservative conclusions about precision and statistical significance of treatment effects. This…
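A first-order version of such a correction (a simplified sketch, not Hedges' full adjustment, which also corrects the degrees of freedom) deflates the naive t statistic by the square root of the design effect, 1 + (m - 1)ρ, where m is the cluster size and ρ is the intraclass correlation:

```python
import math

def design_effect(m, icc):
    # variance inflation from clusters of size m with intraclass correlation icc
    return 1.0 + (m - 1) * icc

def corrected_t(t_naive, m, icc):
    # deflate a t statistic that was computed as if observations were independent;
    # the naive analysis understates the standard error by sqrt(design effect)
    return t_naive / math.sqrt(design_effect(m, icc))
```

For example, with clusters of 25 and ρ = 0.1, the design effect is 3.4, so a seemingly significant naive t of 3.0 drops to about 1.63, illustrating the anticonservative conclusions the abstract warns about.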
NASA Astrophysics Data System (ADS)
Muller, Patricia Ann
The purpose of this study was to gain a more complete understanding of the differences in science, mathematics and engineering education among racial-ethnic and gender subgroups by exploring factors related to precollege science achievement growth rates. Using Hierarchical Linear Modeling (HLM) and multi-wave, longitudinal data from the first three waves of the National Education Longitudinal Study of 1988--1994 (NELS:88/94), this study examined precollege science achievement growth rates during the 8th to 10th grade period and the 10th to 12th grade period for African American males, African American females, Latino males, Latina females, Asian American males, Asian American females, White males and White females. For the 8th--10th grade period, previous grades were significantly and positively related to science achievement growth for all subgroups; and socio-economic status and high school program were significantly and positively related to science achievement growth for all subgroups except one (Latino males and Asian American males, respectively). For the 10th--12th grade period, the quantity of science courses completed (science units) was the only variable that was statistically significant for more than one racial-ethnic by gender subgroup. Science units taken were significantly and positively related to 10th--12th grade growth rates for all racial-ethnic by gender subgroups except Latino males. Locus-of-control was the only cognitive or psychosocial factor included from Eccles, Adler, Futterman, Goff, Kaczala, Meece and Midgley's (1983) theoretical framework for achievement behaviors that appeared to exhibit any pattern across race-ethnicities. Locus-of-control was positively related to 8th--10th grade science achievement growth for females across all racial-ethnic subgroups, as well as for African American males. However, for both the 8th--10th grade and 10th--12th grade periods, there was no consistency across racial-ethnic or gender subgroups in
Statistical considerations in design of spacelab experiments
NASA Technical Reports Server (NTRS)
Robinson, J.
1978-01-01
After making an analysis of experimental error sources, statistical models were developed for the design and analysis of potential Space Shuttle experiments. Guidelines for statistical significance and/or confidence limits of expected results were also included. The models were then tested out on the following proposed Space Shuttle biomedical experiments: (1) bone density by computer tomography; (2) basal metabolism; and (3) total body water. Analysis of those results and therefore of the models proved inconclusive due to the lack of previous research data and statistical values. However, the models were seen as possible guides to making some predictions and decisions.
EDUCATIONAL ACHIEVEMENT AND THE NAVAJO.
ERIC Educational Resources Information Center
HAAS, JOHN; MELVILLE, ROBERT
A STUDY WAS DEVISED TO APPRAISE THE ACADEMIC ACHIEVEMENT OF NAVAJO STUDENTS LIVING IN DORMITORIES AWAY FROM THE INDIAN RESERVATION. THE FOLLOWING SEVEN FACTORS WERE CHOSEN TO BE INVESTIGATED AS BEING DIRECTLY RELATED TO ACHIEVEMENT--(1) INTELLIGENCE, (2) READING ABILITY, (3) ANXIETY, (4) SELF-CONCEPT, (5) MOTIVATION, (6) VERBAL DEVELOPMENT, (7)…
Sociocultural Origins of Achievement Motivation
ERIC Educational Resources Information Center
Maehr, Martin L.
1977-01-01
Presents a theoretical review of work on sociocultural influences on achievement, focusing on a critical evaluation of the work of David McClelland. Offers an alternative conception of achievement motivation which stresses the role of contextual and situational factors in addition to personality factors. Available from: Transaction Periodicals…
Raising Boys' Achievement in Schools.
ERIC Educational Resources Information Center
Bleach, Kevan, Ed.
This book offers insights into the range of strategies and good practice being used to raise the achievement of boys. Case studies by school-based practitioners suggest ideas and measures to address the issue of achievement by boys. The contributions are: (1) "Why the Likely Lads Lag Behind" (Kevan Bleach); (2) "Helping Boys Do…
Teaching the Low Level Achiever.
ERIC Educational Resources Information Center
Salomone, Ronald E., Ed.
1986-01-01
Intended for teachers of the English language arts, the articles in this issue offer suggestions and techniques for teaching the low level achiever. Titles and authors of the articles are as follows: (1) "A Point to Ponder" (Rachel Martin); (2) "Tracking: A Self-Fulfilling Prophecy of Failure for the Low Level Achiever" (James Christopher Davis);…
Early Intervention and Student Achievement
ERIC Educational Resources Information Center
Hormes, Mridula T.
2009-01-01
The United States Department of Education has been rigorous in holding all states accountable with regard to student achievement. The No Child Left Behind Act of 2001 clearly laid out federal mandates for all schools to follow. K-12 leaders of public schools are very aware of the fact that results in terms of student achievement need to improve…
Parental Involvement and Academic Achievement
ERIC Educational Resources Information Center
Goodwin, Sarah Christine
2015-01-01
This research study examined the correlation between student achievement and parents' perceptions of their involvement in their child's schooling. Parent participants completed the Parent Involvement Project Parent Questionnaire. Results weakly indicated that parents of students with higher levels of achievement perceived fewer demands or invitations…
Perils of Standardized Achievement Testing
ERIC Educational Resources Information Center
Haladyna, Thomas M.
2006-01-01
This article argues that the validity of standardized achievement test-score interpretation and use is problematic; consequently, confidence and trust in such test scores may often be unwarranted. The problem is particularly severe in high-stakes situations. This essay provides a context for understanding standardized achievement testing, then…
Stress Correlates and Academic Achievement.
ERIC Educational Resources Information Center
Bentley, Donna Anderson; And Others
An ongoing concern for educators is the identification of factors that contribute to or are associated with academic achievement; one such group of variables that has received little attention are those involving stress. The relationship between perceived sources of stress and academic achievement was examined to determine if reactions to stress…
School Size and Student Achievement
ERIC Educational Resources Information Center
Riggen, Vicki
2013-01-01
This study examined whether a relationship between high school size and student achievement exists in Illinois public high schools in reading and math, as measured by the Prairie State Achievement Exam (PSAE), which is administered to all Illinois 11th-grade students. This study also examined whether the factors of socioeconomic status, English…
Receptor arrays optimized for natural odor statistics
Zwicker, David; Murugan, Arvind; Brenner, Michael P.
2016-01-01
Natural odors typically consist of many molecules at different concentrations. It is unclear how the numerous odorant molecules and their possible mixtures are discriminated by relatively few olfactory receptors. Using an information theoretic model, we show that a receptor array is optimal for this task if it achieves two possibly conflicting goals: (i) Each receptor should respond to half of all odors and (ii) the response of different receptors should be uncorrelated when averaged over odors presented with natural statistics. We use these design principles to predict statistics of the affinities between receptors and odorant molecules for a broad class of odor statistics. We also show that optimal receptor arrays can be tuned to either resolve concentrations well or distinguish mixtures reliably. Finally, we use our results to predict properties of experimentally measured receptor arrays. Our work can thus be used to better understand natural olfaction, and it also suggests ways to improve artificial sensor arrays. PMID:27102871
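The two design principles stated above can be checked numerically for a toy response matrix. The sketch below is illustrative only, not the paper's information-theoretic optimization: it draws a random binary receptor-odor response matrix and verifies that (i) each receptor responds to roughly half of all odors and (ii) pairwise receptor responses are nearly uncorrelated across odors.

```python
import numpy as np

rng = np.random.default_rng(3)
n_receptors, n_odors = 30, 5000

# hypothetical binary response matrix: entry (i, j) is True if
# receptor i responds to odor j; each entry fires with probability 1/2
R = rng.random((n_receptors, n_odors)) < 0.5

# criterion (i): each receptor's mean activity over odors should be ~0.5
activity = R.mean(axis=1)

# criterion (ii): off-diagonal receptor-receptor correlations should be ~0
C = np.corrcoef(R.astype(float))
off_diag = C[~np.eye(n_receptors, dtype=bool)]
```

With 5000 odors, the per-receptor activities land within a few percent of one half and the off-diagonal correlations within a few hundredths of zero, so this random array approximately satisfies both criteria.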
Truth, Damn Truth, and Statistics
ERIC Educational Resources Information Center
Velleman, Paul F.
2008-01-01
Statisticians and Statistics teachers often have to push back against the popular impression that Statistics teaches how to lie with data. Those who believe incorrectly that Statistics is solely a branch of Mathematics (and thus algorithmic), often see the use of judgment in Statistics as evidence that we do indeed manipulate our results. In the…
Experimental Mathematics and Computational Statistics
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
The relationship between twelve-month home stimulation and school achievement.
van Doorninck, W J; Caldwell, B M; Wright, C; Frankenburg, W K
1981-09-01
Home Observation for Measurement of the Environment (HOME) was designed to reflect parental support of early cognitive and socioemotional development. 12-month HOME scores were correlated with elementary school achievement, 5--9 years later. 50 low-income children were rank ordered by a weighted average of centile estimates of achievement test scores, letter grades, and curriculum levels in reading and math. 24 children were classified as having significant school achievement problems. The HOME total score correlated significantly, r = .37, with school centile scores among the low-income families. The statistically more appropriate contingency table analysis revealed a 68% correct classification rate and a significantly reduced error rate over random or blanket prediction. The results supported the predictive value of the 12-month HOME for school achievement among low-income families. In an additional sample of 21 middle-income families, there was insufficient variability among HOME scores to allow prediction. The HOME total scores were highly correlated, r = .86, among siblings tested at least 10 months apart.
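The contingency-table analysis mentioned above amounts to cross-classifying early HOME scores against later achievement and comparing the hit rate with blanket (majority-class) prediction. The 2x2 counts below are hypothetical, chosen only to reproduce the reported 68% rate for 50 children; they are not the study's actual table.

```python
import numpy as np

# hypothetical 2x2 cross-classification (counts invented for illustration):
# rows = 12-month HOME score (low, adequate)
# cols = later school achievement (problem, no problem)
table = np.array([[17, 9],
                  [7, 17]])

# fraction correctly classified when low HOME predicts later problems
accuracy = np.trace(table) / table.sum()

# baseline: always predict the more common achievement outcome
blanket = table.sum(axis=0).max() / table.sum()
```

Here accuracy is 34/50 = 0.68, against a majority-class baseline of 26/50 = 0.52, which is the sense in which the table analysis "significantly reduced the error rate over random or blanket prediction."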
Using scientifically and statistically sufficient statistics in comparing image segmentations.
Chi, Yueh-Yun; Muller, Keith E
2010-01-01
Automatic computer segmentation in three dimensions creates opportunity to reduce the cost of three-dimensional treatment planning of radiotherapy for cancer treatment. Comparisons between human and computer accuracy in segmenting kidneys in CT scans generate distance values far larger in number than the number of CT scans. Such high dimension, low sample size (HDLSS) data present a grand challenge to statisticians: how do we find good estimates and make credible inference? We recommend discovering and using scientifically and statistically sufficient statistics as an additional strategy for overcoming the curse of dimensionality. First, we reduced the three-dimensional array of distances for each image comparison to a histogram to be modeled individually. Second, we used non-parametric kernel density estimation to explore distributional patterns and assess multi-modality. Third, a systematic exploratory search for parametric distributions and truncated variations led to choosing a Gaussian form as approximating the distribution of a cube root transformation of distance. Fourth, representing each histogram by an individually estimated distribution eliminated the HDLSS problem by reducing an average of 26,000 distances per histogram to just 2 parameter estimates. In the fifth and final step we used classical statistical methods to demonstrate that the two human observers disagreed significantly less with each other than with the computer segmentation. Nevertheless, the size of all disagreements was clinically unimportant relative to the size of a kidney. The hierarchical modeling approach to object-oriented data created response variables deemed sufficient by both the scientists and statisticians. We believe the same strategy provides a useful addition to the imaging toolkit and will succeed with many other high throughput technologies in genetics, metabolomics and chemical analysis.
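The histogram-to-two-parameters reduction described in steps three and four can be sketched as follows. The gamma-distributed distances here are synthetic stand-ins for one image comparison's roughly 26,000 surface distances; the point is only that a cube-root transform pulls a right-skewed distance distribution toward a Gaussian, so two estimates summarize the whole histogram.

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic stand-in for ~26,000 boundary distances from one comparison (mm)
d = rng.gamma(shape=2.0, scale=1.5, size=26000)

def skew(x):
    # standardized third moment, as a quick symmetry diagnostic
    z = (x - x.mean()) / x.std(ddof=0)
    return float(np.mean(z ** 3))

# cube-root transform, then summarize the histogram by two parameters
z = np.cbrt(d)
mu, sigma = float(z.mean()), float(z.std(ddof=1))
```

The raw distances are strongly right-skewed, while the cube-root values are nearly symmetric, which is why a Gaussian form for the transformed distances is a plausible two-parameter summary.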
Who Needs Statistics? | Poster
You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.
International petroleum statistics report
1995-10-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.
NASA Technical Reports Server (NTRS)
1994-01-01
Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.
NASA Technical Reports Server (NTRS)
1995-01-01
NASA Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.
NASA Technical Reports Server (NTRS)
1996-01-01
This booklet of pocket statistics includes the 1996 NASA Major Launch Record, NASA Procurement, Financial, and Workforce data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.
Influence of Mothers' Education on Children's Maths Achievement in Kenya
ERIC Educational Resources Information Center
Abuya, Benta A.; Oketch, Moses; Mutisya, Maurice; Ngware, Moses; Ciera, James
2013-01-01
Research shows that fathers' level of education predicts achievement of both boys and girls, with significantly greater effect for boys. Similarly, mothers' level of education predicts the achievement of girls but not boys. This study tests the mother-child education achievement hypothesis, by examining the effect of mothers' education on the…
2008-02-01
This DOE Save Energy Now case study describes how Chrysler LLC saves more than 70,000 MMBtu and $627,000 annually after increasing the steam system energy efficiency of a truck and minivan assembly plant in St. Louis, Missouri.
2012-03-01
...pay, is an aging Common Business Oriented Language (COBOL) system
ERIC Educational Resources Information Center
Davis, Andrew
2015-01-01
PISA claims that it can extend its reach from its current core subjects of Reading, Science, Maths and problem-solving. Yet given the requirement for high levels of reliability for PISA, especially in the light of its current high stakes character, proposed widening of its subject coverage cannot embrace some important aspects of the social and…
ERIC Educational Resources Information Center
Lanfranchi, Andrea
2014-01-01
This article examines procedures and processes that result in the over-referral of migrant students to separate special education programmes and, as a consequence, their exclusion from general education. The particular focus is on the role of the school psychologist in this process. The empirical study is a comparison of Swiss teachers' and school…
Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore
2011-12-01
Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process follows specific steps: generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature, treatment allocation is frequently characterized as random even though the procedures followed are often not appropriate. Proper randomization assigns treatment to the trial arms without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias; it entails generation of the random allocation sequence, allocation concealment, and the actual implementation of treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
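A permuted-block scheme, one of the restricted randomization methods the abstract mentions, can be sketched as follows. The arm labels, block size, and seed are arbitrary illustrations; real trials would also conceal this list from the people enrolling patients.

```python
import random

def block_randomization(n_subjects, arms=("A", "B"), block_size=4, seed=7):
    # permuted-block list: each block contains equal numbers of every arm,
    # shuffled, so running group sizes never drift far apart while the
    # next assignment stays unpredictable within a block
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    per_arm = block_size // len(arms)
    schedule = []
    while len(schedule) < n_subjects:
        block = list(arms) * per_arm
        rng.shuffle(block)
        schedule.extend(block)
    return schedule[:n_subjects]
```

For 20 subjects and blocks of four, the list contains exactly ten of each arm and every complete block is balanced two-and-two, which is the "restricted" property that simple coin-flip randomization lacks.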
Black Hegemony, a Significant Influence in the School Success of High-Achieving African Americans.
ERIC Educational Resources Information Center
Murphy, Jean C.
This is an interpretive study of the influence of Black Hegemony on the academic success of three successful African Americans: Clifton L. Taulbert, Henry Louis Gates, Jr., and Margaret Morgan Lawrence. All three spent their youth in southern communities strongly influenced by Jim Crow laws and customs, and their academic accomplishments were…
ERIC Educational Resources Information Center
Suarez-Orozco, Carola; Pimentel, Allyson; Martin, Margary
2009-01-01
Background/Context: Newcomer immigrant students are entering schools in the United States in unprecedented numbers. As they enter new school contexts, they face a number of challenges in their adjustment. Previous literature suggested that relationships in school play a particularly crucial role in promoting socially competent behavior in the…
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1978-01-01
Developments reported at a meeting of principal investigators for NASA's planetology geology program are summarized. Topics covered include: constraints on solar system formation; asteriods, comets, and satellites; constraints on planetary interiors; volatiles and regoliths; instrument development techniques; planetary cartography; geological and geochemical constraints on planetary evolution; fluvial processes and channel formation; volcanic processes; Eolian processes; radar studies of planetary surfaces; cratering as a process, landform, and dating method; and the Tharsis region of Mars. Activities at a planetary geology field conference on Eolian processes are reported and techniques recommended for the presentation and analysis of crater size-frequency data are included.
The Significance of Career Narrative in Examining a High-Achieving Woman's Career
ERIC Educational Resources Information Center
Elley-Brown, Margaret J.
2011-01-01
In this qualitative study, the career journey of one New Zealand woman was analysed. Three key findings emerged: the power of narrative as a vehicle for this woman's story, her movement towards greater authenticity and spiritual fulfilment as a mature woman, and the ongoing struggle for concurrent fulfilment from communal and agentic perspectives.…
Fungi producing significant mycotoxins.
2012-01-01
Mycotoxins are secondary metabolites of microfungi that are known to cause sickness or death in humans or animals. Although many such toxic metabolites are known, it is generally agreed that only a few are significant in causing disease: aflatoxins, fumonisins, ochratoxin A, deoxynivalenol, zearalenone, and ergot alkaloids. These toxins are produced by just a few species from the common genera Aspergillus, Penicillium, Fusarium, and Claviceps. All Aspergillus and Penicillium species either are commensals, growing in crops without obvious signs of pathogenicity, or invade crops after harvest and produce toxins during drying and storage. In contrast, the important Fusarium and Claviceps species infect crops before harvest. The most important Aspergillus species, occurring in warmer climates, are A. flavus and A. parasiticus, which produce aflatoxins in maize, groundnuts, tree nuts, and, less frequently, other commodities. The main ochratoxin A producers, A. ochraceus and A. carbonarius, commonly occur in grapes, dried vine fruits, wine, and coffee. Penicillium verrucosum also produces ochratoxin A but occurs only in cool temperate climates, where it infects small grains. F. verticillioides is ubiquitous in maize, with an endophytic nature, and produces fumonisins, which are generally more prevalent when crops are under drought stress or suffer excessive insect damage. It has recently been shown that Aspergillus niger also produces fumonisins, and several commodities may be affected. F. graminearum, which is the major producer of deoxynivalenol and zearalenone, is pathogenic on maize, wheat, and barley and produces these toxins whenever it infects these grains before harvest. Also included is a short section on Claviceps purpurea, which produces sclerotia among the seeds in grasses, including wheat, barley, and triticale. The main thrust of the chapter contains information on the identification of these fungi and their morphological characteristics, as well as factors
Public health significance of neuroticism.
Lahey, Benjamin B
2009-01-01
The personality trait of neuroticism refers to relatively stable tendencies to respond with negative emotions to threat, frustration, or loss. Individuals in the population vary markedly on this trait, ranging from frequent and intense emotional reactions to minor challenges to little emotional reaction even in the face of significant difficulties. Although not widely appreciated, there is growing evidence that neuroticism is a psychological trait of profound public health significance. Neuroticism is a robust correlate and predictor of many different mental and physical disorders, comorbidity among them, and the frequency of mental and general health service use. Indeed, neuroticism apparently is a predictor of the quality and longevity of our lives. Achieving a full understanding of the nature and origins of neuroticism, and the mechanisms through which neuroticism is linked to mental and physical disorders, should be a top priority for research. Knowing why neuroticism predicts such a wide variety of seemingly diverse outcomes should lead to improved understanding of commonalities among those outcomes and improved strategies for preventing them.
Personal and family factors as predictors of pupils' mathematics achievement.
Alomar, Bader O
2007-08-01
This study examined personal and family factors in the prediction of mathematics achievement by Kuwaiti fourth graders (395 boys, 501 girls; M age = 10.0 yr., SD = 8.0 mo.). Personal variables included sex, total achievement, perception of parental involvement, pupil's attitude towards school, and mathematics achievement. Family variables included parental education and parental involvement, views of school, and income. The data had good fit with the suggested model. Analysis showed that the variables with significant direct associations with mathematics achievement were total achievement and sex. Parental education, pupil's sex, and attitude towards school had significant indirect associations with mathematics achievement. Associations were direct for boys and indirect for girls, so sex had minimal total effects on mathematics achievement.
Statistics of superior records
NASA Astrophysics Data System (ADS)
Ben-Naim, E.; Krapivsky, P. L.
2013-08-01
We study statistics of records in a sequence of random variables. These identically and independently distributed variables are drawn from the parent distribution ρ. The running record equals the maximum of all elements in the sequence up to a given point. We define a superior sequence as one where all running records are above the average record expected for the parent distribution ρ. We find that the fraction of superior sequences S_N decays algebraically with sequence length N, S_N ~ N^(-β) in the limit N → ∞. Interestingly, the decay exponent β is nontrivial, being the root of an integral equation. For example, when ρ is a uniform distribution with compact support, we find β = 0.450265. In general, the tail of the parent distribution governs the exponent β. We also consider the dual problem of inferior sequences, where all records are below average, and find that the fraction of inferior sequences I_N also decays algebraically, albeit with a different decay exponent, I_N ~ N^(-α). We use the above statistical measures to analyze earthquake data.
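The algebraic decay of the superior-sequence fraction is easy to reproduce by Monte Carlo for a uniform parent on [0, 1], where the expected record after k draws is k/(k+1). This simulation is an illustration, not the paper's analysis; the decay it exhibits can be compared against the quoted exponent β ≈ 0.450265.

```python
import numpy as np

rng = np.random.default_rng(0)

def superior_fraction(n, trials=20000):
    # fraction of length-n uniform sequences whose every running record
    # exceeds the expected record k/(k+1) of the parent distribution
    x = rng.random((trials, n))
    records = np.maximum.accumulate(x, axis=1)  # running maxima
    expected = np.arange(1, n + 1) / np.arange(2, n + 2)
    return float(np.mean(np.all(records > expected, axis=1)))
```

Estimating the fraction at two lengths, say N = 10 and N = 100, shows it shrinking with N roughly as a power law rather than exponentially, consistent with the algebraic decay reported in the abstract.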
Fragile entanglement statistics
NASA Astrophysics Data System (ADS)
Brody, Dorje C.; Hughston, Lane P.; Meier, David M.
2015-10-01
If X and Y are independent, Y and Z are independent, and so are X and Z, one might be tempted to conclude that X, Y, and Z are independent. But it has long been known in classical probability theory that, intuitive as it may seem, this is not true in general. In quantum mechanics one can ask whether analogous statistics can emerge for configurations of particles in certain types of entangled states. The explicit construction of such states, along with the specification of suitable sets of observables that have the purported statistical properties, is not entirely straightforward. We show that an example of such a configuration arises in the case of an N-particle GHZ state, and we are able to identify a family of observables with the property that the associated measurement outcomes are independent for any choice of 2, 3, …, N − 1 of the particles, even though the measurement outcomes for all N particles are not independent. Although such states are highly entangled, the entanglement turns out to be 'fragile', i.e. the associated density matrix has the property that if one traces out the freedom associated with even a single particle, the resulting reduced density matrix is separable.
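The classical counterexample alluded to above can be verified by brute-force enumeration. A minimal sketch (my construction, not the paper's quantum example): let X and Y be fair bits and Z = X XOR Y; the three variables are then pairwise independent but not mutually independent.

```python
from itertools import product

# Joint distribution of (X, Y, Z) with X, Y fair bits and Z = X XOR Y;
# each of the 4 equally likely outcomes has probability 1/4.
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

def marginal_prob(idx, val):
    return sum(1 for o in outcomes if o[idx] == val) / len(outcomes)

def joint_prob(pairs):
    """pairs: list of (variable index, value) constraints."""
    return sum(1 for o in outcomes
               if all(o[i] == v for i, v in pairs)) / len(outcomes)

# Pairwise independence: P(A=a, B=b) = P(A=a) P(B=b) for every pair.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        assert joint_prob([(i, a), (j, b)]) == marginal_prob(i, a) * marginal_prob(j, b)

# But not mutual independence: P(X=0, Y=0, Z=1) = 0, not 1/8.
assert joint_prob([(0, 0), (1, 0), (2, 1)]) == 0.0
```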
International petroleum statistics report
1997-05-01
The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.
Elements of Statistical Mechanics
NASA Astrophysics Data System (ADS)
Sachs, Ivo; Sen, Siddhartha; Sexton, James
2006-05-01
This textbook provides a concise introduction to the key concepts and tools of modern statistical mechanics. It also covers advanced topics such as non-relativistic quantum field theory and numerical methods. After introducing classical analytical techniques, such as cluster expansion and Landau theory, the authors present important numerical methods with applications to magnetic systems, Lennard-Jones fluids and biophysics. Quantum statistical mechanics is discussed in detail and applied to Bose-Einstein condensation and topics in astrophysics and cosmology. In order to describe emergent phenomena in interacting quantum systems, canonical non-relativistic quantum field theory is introduced and then reformulated in terms of Feynman integrals. Combining the authors' many years' experience of teaching courses in this area, this textbook is ideal for advanced undergraduate and graduate students in physics, chemistry and mathematics. It presents analytical and numerical techniques in one text, including sample codes and solved problems on the web at www.cambridge.org/0521841984; covers a wide range of applications including magnetic systems, turbulence, astrophysics, and biology; and contains a concise introduction to Markov processes and molecular dynamics.
Statistical clumped isotope signatures
Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.
2016-01-01
High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
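The mechanism can be illustrated with a toy calculation (a sketch of the statistical effect, not the paper's full formalism, and ignoring fractionation and delta notation): a symmetric diatomic molecule assembled from one atom each of two pools with heavy-isotope fractions a and b. Since 4ab/(a + b)^2 ≤ 1 by the AM-GM inequality, the conventional bulk-based stochastic reference over-predicts the doubly substituted abundance whenever a ≠ b, producing an apparent negative (anti-clumping) anomaly.

```python
def statistical_clumping_anomaly(a, b):
    """Apparent clumped-isotope anomaly (as a fraction, relative to the
    stochastic reference) for a symmetric diatomic molecule assembled from
    one atom of pool A (heavy-isotope fraction a) and one atom of pool B
    (fraction b). The stochastic reference uses the bulk mean (a + b)/2."""
    actual_double = a * b            # true doubly substituted fraction
    bulk = (a + b) / 2               # bulk (average) heavy-isotope fraction
    stochastic_double = bulk ** 2    # conventional stochastic reference
    return actual_double / stochastic_double - 1

# Identical pools: no anomaly.
assert statistical_clumping_anomaly(0.01, 0.01) == 0.0
# Isotopically different pools: apparent anti-clumping (negative anomaly).
assert statistical_clumping_anomaly(0.01, 0.02) < 0
```

As the abstract notes, the size of the (negative) signal grows with the difference between the two pools' isotope ratios.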
ERIC Educational Resources Information Center
Perepiczka, Michelle; Chandler, Nichelle; Becerra, Michael
2011-01-01
Statistics plays an integral role in graduate programs. However, numerous intra- and interpersonal factors may hinder successful completion of needed coursework in this area. The authors examined the extent of the relationship between self-efficacy to learn statistics and statistics anxiety, attitude towards statistics, and social support of 166…
Liu, Kun-Shia; Cheng, Ying-Yao; Chen, Yi-Ling; Wu, Yuh-Yih
2009-01-01
This study used nationwide data from the Taiwan Education Panel Survey (TEPS) to examine the longitudinal effects of educational expectations and achievement attributions on the academic achievements of adolescents. The sample included 2,000 Taiwanese secondary school students, each of whom completed three waves of questionnaires and cognitive tests: the first in grade 7 (in 2001), the second in grade 9 (in 2003), and the third in grade 11 (in 2005). Through multilevel longitudinal analysis, the results showed: (1) educational expectations accounted for a moderate amount of the variance in academic achievements; (2) students with high educational expectations and effort attribution exhibited higher growth rates in their academic achievements; and (3) students with lower educational expectations and those attributing success to others showed significantly lower academic achievement and significantly lower growth rates in such achievement. The results demonstrated that adolescents' educational expectations and achievement attributions play crucial roles in the long-term course of academic accomplishments. Implications for educational practice and further studies are also discussed.
Stenderup, Karin; Rosada, Cecilia; Alifrangis, Lene; Andersen, Søren; Dam, Tomas Norman
2011-05-01
Psoriasis xenograft transplantation models where human skin is transplanted onto immune-deficient mice are generally accepted in psoriasis research. Over the last decade, they have been widely employed to screen for new therapeutics with a potential anti-psoriatic effect. However, experimental designs differ in several parameters. In particular, the number of donors and grafts per experimental design varies greatly; these numbers are directly related to the probability of detecting statistically significant drug effects. In this study, we performed a statistical evaluation of the effect of cyclosporine A, a recognized anti-psoriatic drug, to generate a statistical model employable to simulate different scenarios of experimental designs and to calculate the associated statistical study power, defined as the probability of detecting a statistically significant anti-psoriatic drug treatment effect. Results showed that to achieve a study power of 0.8, at least 20 grafts per treatment group and a minimum of five donors should be included in the chosen experimental setting. To our knowledge, this is the first time that study power calculations have been performed to evaluate treatment effects in a psoriasis xenograft transplantation model. This study was based on a defined experimental protocol, thus other parameters such as drug potency, treatment protocol, mouse strain and graft size should also be taken into account when designing an experiment. We propose that the results obtained in this study may lend a more quantitative support to the validity of results obtained when exploring new potential anti-psoriatic drug effects.
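The study's power model was fitted to the actual cyclosporine A data; as a generic illustration only, here is a minimal normal-approximation power sketch for a two-arm comparison. The effect size and SD are hypothetical inputs, and the donor-level clustering that the real xenograft design must account for is ignored.

```python
import math
from statistics import NormalDist

def power_two_sample(effect, sigma, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a mean
    difference `effect`, common standard deviation `sigma`, and
    n_per_group grafts per arm (normal approximation, no clustering)."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)
    z_effect = effect / (sigma * math.sqrt(2 / n_per_group))
    return nd.cdf(z_effect - z_alpha)

def min_grafts_per_group(effect, sigma, target_power=0.8, alpha=0.05):
    """Smallest per-group graft count reaching the target power."""
    n = 2
    while power_two_sample(effect, sigma, n, alpha) < target_power:
        n += 1
    return n
```

With a standardized effect of one SD, this sketch gives roughly 16 grafts per arm for 0.8 power; the paper's larger requirement (20 grafts, five donors) comes from its model fitted to the real, clustered data.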
Fundamental Limitations of High Contrast Imaging Set by Small Sample Statistics
NASA Astrophysics Data System (ADS)
Mawet, D.; Milli, J.; Wahhaj, Z.; Pelat, D.; Absil, O.; Delacroix, C.; Boccaletti, A.; Kasper, M.; Kenworthy, M.; Marois, C.; Mennesson, B.; Pueyo, L.
2014-09-01
In this paper, we review the impact of small sample statistics on detection thresholds and corresponding confidence levels (CLs) in high-contrast imaging at small angles. When looking close to the star, the number of resolution elements decreases rapidly toward small angles. This reduction of the number of degrees of freedom dramatically affects CLs and false alarm probabilities. Naively using the same ideal hypothesis and methods as for larger separations, which are well understood and commonly assume Gaussian noise, can yield up to one order of magnitude error in contrast estimations at fixed CL. The statistical penalty exponentially increases toward very small inner working angles. Even at 5-10 resolution elements from the star, false alarm probabilities can be significantly higher than expected. Here we present a rigorous statistical analysis that ensures robustness of the CL, but also imposes a substantial limitation on corresponding achievable detection limits (thus contrast) at small angles. This unavoidable fundamental statistical effect has a significant impact on current coronagraphic and future high-contrast imagers. Finally, the paper concludes with practical recommendations to account for small number statistics when computing the sensitivity to companions at small angles and when exploiting the results of direct imaging planet surveys.
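The inflation of false-alarm probability at small angles can be reproduced with a short simulation (a sketch of the effect only; the paper's rigorous treatment is based on Student's t statistics). Here the detection threshold is built from the mean and standard deviation estimated on only a handful of noise resolution elements, as happens close to the star.

```python
import random
import statistics

def empirical_fap(n_elements, k_sigma, trials, seed=7):
    """Monte Carlo false-alarm probability when a k-sigma threshold is
    built from the mean/std estimated on only n_elements noise resolution
    elements (pure Gaussian noise, no companion present)."""
    rng = random.Random(seed)
    false_alarms = 0
    for _ in range(trials):
        noise = [rng.gauss(0, 1) for _ in range(n_elements)]
        mu = statistics.fmean(noise)
        sd = statistics.stdev(noise)
        test = rng.gauss(0, 1)  # an independent noise-only element
        if test > mu + k_sigma * sd:
            false_alarms += 1
    return false_alarms / trials
```

At a nominal 3-sigma threshold the Gaussian false-alarm probability is about 1.3e-3, but with only four resolution elements the empirical rate is more than an order of magnitude larger, shrinking back toward the Gaussian value as the number of elements grows.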