A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.
ERIC Educational Resources Information Center
Liu, Tung; Stone, Courtenay C.
1999-01-01
Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…
A shift from significance test to hypothesis test through power analysis in medical research.
Singh, G
2006-01-01
Until recently, the medical research literature was dominated by Fisher's significance test approach to statistical inference, which concentrates on the probability of a type I error, rather than by the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant on the basis of a P value; the Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Built on the same underlying theory, the two approaches address the same objective but reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have led to the increasing use of power calculations in medical research, so that the results of significance tests are now often reported in light of the power of the test as well. A significance test that incorporates power analysis contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research has initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
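As a concrete illustration of the pre-study power calculations this abstract describes, the sketch below solves for the sample size of a two-sample t-test. All numbers (effect size d = 0.5, alpha = 0.05, target power 0.8) are hypothetical, not taken from the surveyed literature.

```python
# A minimal power-analysis sketch, assuming a two-sided two-sample t-test
# and a hypothesised standardised effect size of d = 0.5 (illustrative only).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed for 80% power at alpha = 0.05.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"n per group: {n_per_group:.1f}")  # ~63.8, so recruit 64 per group

# Conversely, the power actually achieved with only 30 subjects per group.
achieved = analysis.power(effect_size=0.5, nobs1=30, alpha=0.05)
print(f"power with n=30: {achieved:.2f}")  # underpowered, roughly 0.48
```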
Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.
Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter
2015-12-01
Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions that are lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments.
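The paper's unscaled-Bayes-factor machinery is not reproduced here; as context, the sketch below shows the kind of p-value-only multiple testing baseline it is compared against (Benjamini-Hochberg FDR control). The p-values are simulated, purely for illustration.

```python
# Baseline p-value-only multiple testing (Benjamini-Hochberg FDR), the
# family of procedures the unscaled-Bayes-factor approach is compared
# against. All p-values below are simulated for illustration.
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
# 950 true nulls (uniform p-values) plus 50 genuine effects (small p-values).
pvals = np.concatenate([rng.uniform(size=950), rng.beta(0.5, 20.0, size=50)])

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"discoveries: {reject.sum()} of {len(pvals)} tests")
```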
Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G
2012-10-10
Massively parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.
Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.
Chalmers, R Philip
2018-06-01
This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.
Kruschke, John K; Liddell, Torrin M
2018-02-01
In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
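To make the estimation contrast the authors discuss concrete, the sketch below compares a frequentist confidence interval with a Bayesian credible interval for a binomial proportion. The data (17 successes in 20 trials) and the uniform prior are invented for illustration, not taken from the article.

```python
# Hypothetical example: 17 successes in 20 trials. Compare a frequentist
# 95% Wald confidence interval with a 95% Bayesian credible interval
# under a uniform Beta(1, 1) prior. Numbers are illustrative only.
import math
from scipy import stats

k, n = 17, 20
p_hat = k / n

# Frequentist: Wald interval from the normal approximation.
se = math.sqrt(p_hat * (1 - p_hat) / n)
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: posterior is Beta(1 + k, 1 + n - k); central 95% interval.
posterior = stats.beta(1 + k, 1 + n - k)
cri = posterior.ppf([0.025, 0.975])

print(f"Wald CI:           ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"Credible interval: ({cri[0]:.3f}, {cri[1]:.3f})")
```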
ERIC Educational Resources Information Center
Fan, Weihua; Hancock, Gregory R.
2012-01-01
This study proposes robust means modeling (RMM) approaches for hypothesis testing of mean differences for between-subjects designs in order to control the biasing effects of nonnormality and variance inequality. Drawing from structural equation modeling (SEM), the RMM approaches make no assumption of variance homogeneity and employ robust…
ERIC Educational Resources Information Center
Marmolejo-Ramos, Fernando; Cousineau, Denis
2017-01-01
The number of articles showing dissatisfaction with the null hypothesis statistical testing (NHST) framework has been progressively increasing over the years. Alternatives to NHST have been proposed and the Bayesian approach seems to have achieved the highest amount of visibility. In this last part of the special issue, a few alternative…
NASA Astrophysics Data System (ADS)
Lehmann, Rüdiger; Lösler, Michael
2017-12-01
Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the usage of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution to the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: a synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria such as the AIC.
Bayesian Approaches to Imputation, Hypothesis Testing, and Parameter Estimation
ERIC Educational Resources Information Center
Ross, Steven J.; Mackey, Beth
2015-01-01
This chapter introduces three applications of Bayesian inference to common and novel issues in second language research. After a review of the critiques of conventional hypothesis testing, our focus centers on ways Bayesian inference can be used for dealing with missing data, for testing theory-driven substantive hypotheses without a default null…
Testing the null hypothesis: the forgotten legacy of Karl Popper?
Wilkinson, Mick
2013-01-01
Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate new facts by testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well-documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write-up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.
Hammer, Jennifer L; Marsh, Abigail A
2015-04-01
Despite communicating a "negative" emotion, fearful facial expressions predominantly elicit behavioral approach from perceivers. It has been hypothesized that this seemingly paradoxical effect may occur due to fearful expressions' resemblance to vulnerable, infantile faces. However, this hypothesis has not yet been tested. We used a combined approach-avoidance/implicit association test (IAT) to test this hypothesis. Participants completed an approach-avoidance lever task during which they responded to fearful and angry facial expressions as well as neutral infant and adult faces presented in an IAT format. Results demonstrated an implicit association between fearful facial expressions and infant faces and showed that both fearful expressions and infant faces primarily elicit behavioral approach. The dominance of approach responses to both fearful expressions and infant faces decreased as a function of psychopathic personality traits. Results suggest that the prosocial responses to fearful expressions observed in most individuals may stem from their associations with infantile faces.
Fisher, Neyman-Pearson or NHST? A tutorial for teaching data testing.
Perezgonzalez, Jose D
2015-01-01
Despite frequent calls for the overhaul of null hypothesis significance testing (NHST), this controversial procedure remains ubiquitous in behavioral, social and biomedical teaching and research. Little change seems possible once the procedure becomes well ingrained in the minds and current practice of researchers; thus, the optimal opportunity for such change is at the time the procedure is taught, be this at undergraduate or at postgraduate levels. This paper presents a tutorial for the teaching of data testing procedures, often referred to as hypothesis testing theories. The first procedure introduced is Fisher's approach to data testing (tests of significance); the second is the Neyman-Pearson approach (tests of acceptance); the final procedure is the incongruent combination of the previous two theories into the current approach, NHST. For those researchers sticking with the latter, two compromise solutions on how to improve NHST conclude the tutorial.
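To make the tutorial's central distinction concrete, the sketch below reads one and the same two-sample t-test in Fisher's way (a graded p-value as evidence) and in the Neyman-Pearson way (a pre-set alpha and a binary decision with known long-run error rates). The data are simulated and purely illustrative.

```python
# Schematic contrast between Fisher's and Neyman-Pearson's readings of
# the same two-sample t-test. Data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, size=40)
b = rng.normal(0.4, 1.0, size=40)

t, p = stats.ttest_ind(a, b)

# Fisher: report the exact p-value as graded evidence against H0.
print(f"Fisher: p = {p:.4f} (strength of evidence, no fixed cutoff)")

# Neyman-Pearson: alpha is fixed before seeing the data; the output is a
# decision whose long-run type I error rate equals alpha by construction.
alpha = 0.05
decision = "reject H0" if p < alpha else "accept H0 (act as if no effect)"
print(f"Neyman-Pearson: alpha = {alpha}, decision: {decision}")
```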
The [Geo]Scientific Method; Hypothesis Testing and Geoscience Proposal Writing for Students
ERIC Educational Resources Information Center
Markley, Michelle J.
2010-01-01
Most undergraduate-level geoscience texts offer a paltry introduction to the nuanced approach to hypothesis testing that geoscientists use when conducting research and writing proposals. Fortunately, there are a handful of excellent papers that are accessible to geoscience undergraduates. Two historical papers by the eminent American geologists G.…
Mental Abilities and School Achievement: A Test of a Mediation Hypothesis
ERIC Educational Resources Information Center
Vock, Miriam; Preckel, Franzis; Holling, Heinz
2011-01-01
This study analyzes the interplay of four cognitive abilities--reasoning, divergent thinking, mental speed, and short-term memory--and their impact on academic achievement in school in a sample of adolescents in grades seven to 10 (N = 1135). Based on information processing approaches to intelligence, we tested a mediation hypothesis, which states…
Thou Shalt Not Bear False Witness against Null Hypothesis Significance Testing
ERIC Educational Resources Information Center
García-Pérez, Miguel A.
2017-01-01
Null hypothesis significance testing (NHST) has been the subject of debate for decades and alternative approaches to data analysis have been proposed. This article addresses this debate from the perspective of scientific inquiry and inference. Inference is an inverse problem and application of statistical methods cannot reveal whether effects…
A large scale test of the gaming-enhancement hypothesis.
Przybylski, Andrew K; Wang, John C
2016-01-01
A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
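The study weighs evidence for the null with Bayes factors. One simple way to approximate a Bayes factor for a two-group comparison is the BIC approximation (under a unit-information prior); this is a generic stand-in, not the authors' exact analysis, and the data below are simulated.

```python
# BIC approximation to the Bayes factor BF01 (null over effect) for a
# two-group comparison, via the well-known relation
# BF01 ~= exp((BIC_effect - BIC_null) / 2). Data are simulated; this is
# not the analysis pipeline used in the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
group = np.repeat([0, 1], n // 2)        # e.g. non-gamers vs gamers
y = rng.normal(0.0, 1.0, size=n)         # reasoning score, no true effect

X_null = np.ones((n, 1))                 # intercept-only model
X_alt = sm.add_constant(group.astype(float))

bic_null = sm.OLS(y, X_null).fit().bic
bic_alt = sm.OLS(y, X_alt).fit().bic

bf01 = np.exp((bic_alt - bic_null) / 2.0)
print(f"BF01 ~= {bf01:.1f} (values > 3 are often read as support for H0)")
```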
Using Backward Design in Education Research: A Research Methods Essay †
Jensen, Jamie L.; Bailey, Elizabeth G.; Kummer, Tyler A.; Weber, K. Scott
2017-01-01
Education research within the STEM disciplines applies a scholarly approach to teaching and learning, with the intent of better understanding how people learn and of improving pedagogy at the undergraduate level. Most of the professionals practicing in this field have ‘crossed over’ from other disciplinary fields and thus have faced challenges in becoming experts in a new discipline. In this article, we offer a novel framework for approaching education research design called Backward Design in Education Research. It is patterned on backward curricular design and provides a three-step, systematic approach to designing education projects: 1) Define a research question that leads to a testable causal hypothesis based on a theoretical rationale; 2) Choose or design the assessment instruments to test the research hypothesis; and 3) Develop an experimental protocol that will be effective in testing the research hypothesis. This approach provides a systematic method to develop and carry out evidence-based research design. PMID:29854045
Differentiating between rights-based and relational ethical approaches.
Trobec, Irena; Herbst, Majda; Zvanut, Bostjan
2009-05-01
When forced treatment in mental health care is under consideration, two approaches guide clinicians in their actions: the dominant rights-based approach and the relational ethical approach. We hypothesized that nurses with bachelor's degrees differentiate better between the two approaches than nurses without a degree. To test this hypothesis, a survey was performed in major Slovenian health institutions. We found that nurses emphasize the importance of ethics and personal values, but 55.4% of all the nurse participants confused the two approaches. The results confirmed our hypothesis and indicate the importance of nurses' formal education, especially when caring for patients with mental illness.
A Description of a Blind Student's Science Process Skills through Health Physics
ERIC Educational Resources Information Center
Bülbül, M. Sahin
2013-01-01
This study describes an approach for teaching blind students health physics, showing how they can set up a hypothesis and test it. The participant of the study used health materials designed for high school blind students and tested her hypothesis with the data she gathered using those materials. She was asked to hypothesize which could…
Testing for purchasing power parity in 21 African countries using several unit root tests
NASA Astrophysics Data System (ADS)
Choji, Niri Martha; Sek, Siok Kun
2017-04-01
Purchasing power parity is used as a basis for international income and expenditure comparison through the exchange rate theory. However, empirical studies disagree on the validity of PPP. In this paper, we test the validity of the purchasing power parity (PPP) hypothesis using a panel data approach. We apply seven different panel unit root tests based on quarterly data on the real effective exchange rate for 21 African countries for the period 1971:Q1-2012:Q4. All seven tests rejected the hypothesis of stationarity, meaning that absolute PPP does not hold in those African countries. This result confirms the claim from previous studies that standard panel unit root tests fail to support the PPP hypothesis.
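The seven panel unit root tests are not named in the abstract, and common panel variants (e.g. Levin-Lin-Chu, Im-Pesaran-Shin) are not all available in standard Python libraries. The single-series building block they generalize is the augmented Dickey-Fuller test, sketched here on simulated exchange rate series with placeholder country labels.

```python
# Per-country augmented Dickey-Fuller tests on simulated log real
# effective exchange rates. Panel tests pool this kind of evidence
# across countries; this sketch shows only the single-series ingredient.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
n_quarters = 168  # 1971:Q1 through 2012:Q4

for country in ["A", "B", "C"]:  # placeholders, not the 21 countries
    reer = np.cumsum(rng.normal(size=n_quarters))  # random walk: PPP fails
    stat, pvalue, *_ = adfuller(reer)
    verdict = "stationary (PPP-consistent)" if pvalue < 0.05 else "unit root"
    print(f"country {country}: ADF = {stat:.2f}, p = {pvalue:.3f} -> {verdict}")
```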
When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment.
Szucs, Denes; Ioannidis, John P A
2017-01-01
Null hypothesis significance testing (NHST) has several shortcomings that are likely contributing factors behind the widely debated replication crisis of (cognitive) neuroscience, psychology, and biomedical science in general. We review these shortcomings and suggest that, after sustained negative experience, NHST should no longer be the default, dominant statistical practice of all biomedical and psychological research. If theoretical predictions are weak we should not rely on all or nothing hypothesis tests. Different inferential methods may be most suitable for different types of research questions. Whenever researchers use NHST they should justify its use, and publish pre-study power calculations and effect sizes, including negative findings. Hypothesis-testing studies should be pre-registered and optimally raw data published. The current statistics lite educational approach for students that has sustained the widespread, spurious use of NHST should be phased out.
NASA Astrophysics Data System (ADS)
Sirenko, M. A.; Tarasenko, P. F.; Pushkarev, M. I.
2017-01-01
One of the most noticeable features of sign-based statistical procedures is the opportunity to build an exact test for simple hypothesis testing of parameters in a regression model. In this article, we extend the sign-based approach to the nonlinear case with dependent noise. The examined model is a multi-quantile regression, which makes it possible to test hypotheses not only about regression parameters, but about noise parameters as well.
Concerns regarding a call for pluralism of information theory and hypothesis testing
Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.
2007-01-01
1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
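For readers unfamiliar with the I-T mechanics the authors advocate, the core computation is simple: each candidate model gets an AIC, and Akaike weights turn AIC differences into relative evidence across all models at once. A minimal sketch with simulated data and invented candidate models:

```python
# Multimodel inference via Akaike weights: fit candidate models, compute
# AIC differences, and normalise exp(-delta/2) into weights. Data and
# candidate models are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8 * x1 + rng.normal(size=n)  # truth: only x1 matters

candidates = {
    "intercept": np.ones((n, 1)),
    "x1": sm.add_constant(x1),
    "x1+x2": sm.add_constant(np.column_stack([x1, x2])),
}

aic = {name: sm.OLS(y, X).fit().aic for name, X in candidates.items()}
best = min(aic.values())
raw = {name: np.exp(-(a - best) / 2.0) for name, a in aic.items()}
total = sum(raw.values())

for name in candidates:
    print(f"{name:10s} AIC={aic[name]:8.2f} weight={raw[name] / total:.3f}")
```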
Comparing Web, Group and Telehealth Formats of a Military Parenting Program
2017-06-01
Comparative effectiveness will be tested by specifying a non-equivalence hypothesis for group-based, web-facilitated, and individually facilitated formats relative to self-directed approaches. 1a. Finalize human subjects protocol and consent documents for review and approval for the pilot group (N = 5 families) and the randomized controlled trial.
Studying biodiversity: is a new paradigm really needed?
Nichols, James D.; Cooch, Evan G.; Nichols, Jonathan M.; Sauer, John R.
2012-01-01
Authors in this journal have recommended a new approach to the conduct of biodiversity science. This data-driven approach requires the organization of large amounts of ecological data, analysis of these data to discover complex patterns, and subsequent development of hypotheses corresponding to detected patterns. This proposed new approach has been contrasted with more-traditional knowledge-based approaches in which investigators deduce consequences of competing hypotheses to be confronted with actual data, providing a basis for discriminating among the hypotheses. We note that one approach is directed at hypothesis generation, whereas the other is also focused on discriminating among competing hypotheses. Here, we argue for the importance of applying existing knowledge to the separate issues of (a) hypothesis selection and generation and (b) hypothesis discrimination and testing. In times of limited conservation funding, the relative efficiency of different approaches to learning should be an important consideration in decisions about how to study biodiversity.
Debates—Hypothesis testing in hydrology: Pursuing certainty versus pursuing uberty
NASA Astrophysics Data System (ADS)
Baker, Victor R.
2017-03-01
Modern hydrology places nearly all its emphasis on science-as-knowledge, the hypotheses of which are increasingly expressed as physical models, whose predictions are tested by correspondence to quantitative data sets. Though arguably appropriate for applications of theory to engineering and applied science, the associated emphases on truth and degrees of certainty are not optimal for the productive and creative processes that facilitate the fundamental advancement of science as a process of discovery. The latter requires an investigative approach, where the goal is uberty, a kind of fruitfulness of inquiry, in which the abductive mode of inference adds to the much more commonly acknowledged modes of deduction and induction. The resulting world-directed approach to hydrology provides a valuable complement to the prevailing hypothesis- (theory-) directed paradigm.
Marsh, Herbert W
2008-10-01
Following William James (1890/1963), many leading self-esteem researchers continue to support the Individual-importance hypothesis: that the relation between specific facets of self-concept and global self-esteem depends on the importance an individual places on each specific facet. However, empirical support for the hypothesis is surprisingly elusive, whether evaluated in terms of an importance-weighted average model, a generalized multiple regression approach for testing self-concept-by-importance interactions, or idiographic approaches. How can actual empirical support for such an intuitively appealing and widely cited psychological principle be so elusive? Hardy and Moriarty (2006), acknowledging this previous failure of the Individual-importance hypothesis, claim to have solved the conundrum, demonstrating an innovative idiographic approach that provides clear support for it. However, a critical evaluation of their new approach, coupled with a reanalysis of their data, undermines their claims. Indeed, their data provide compelling support against the Individual-importance hypothesis, which remains as elusive as ever.
Phi Index: A New Metric to Test the Flush Early and Avoid the Rush Hypothesis
Samia, Diogo S. M.; Blumstein, Daniel T.
2014-01-01
Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the "Flush Early and Avoid the Rush" (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD), and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are both inadequate to analyze relationships constrained by an envelope (such as that in the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and it controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight as stated by the FEAR hypothesis. PMID:25405872
Chiba, Yasutaka
2017-09-01
Fisher's exact test is commonly used to compare two groups when the outcome is binary in randomized trials. In the context of causal inference, this test explores the sharp causal null hypothesis (i.e. the causal effect of treatment is the same for all subjects), but not the weak causal null hypothesis (i.e. the causal risks are the same in the two groups). Therefore, in general, rejection of the null hypothesis by Fisher's exact test does not mean that the causal risk difference is not zero. Recently, Chiba (Journal of Biometrics and Biostatistics 2015; 6: 244) developed a new exact test for the weak causal null hypothesis when the outcome is binary in randomized trials; the new test is not based on any large sample theory and does not require any assumption. In this paper, we extend the new test; we create a version of the test applicable to a stratified analysis. The stratified exact test that we propose is general in nature and can be used in several approaches toward the estimation of treatment effects after adjusting for stratification factors. The stratified Fisher's exact test of Jung (Biometrical Journal 2014; 56: 129-140) tests the sharp causal null hypothesis. This test applies a crude estimator of the treatment effect and can be regarded as a special case of our proposed exact test. Our proposed stratified exact test can be straightforwardly extended to analysis of noninferiority trials and to construct the associated confidence interval.
NASA Astrophysics Data System (ADS)
Calderon, Christopher P.; Weiss, Lucien E.; Moerner, W. E.
2014-05-01
Experimental advances have improved the two- (2D) and three-dimensional (3D) spatial resolution that can be extracted from in vivo single-molecule measurements. This enables researchers to quantitatively infer the magnitude and directionality of forces experienced by biomolecules in their native environment. Situations where such force information is relevant range from mitosis to directed transport of protein cargo along cytoskeletal structures. Models commonly applied to quantify single-molecule dynamics assume that effective forces and velocity in the x, y (or x, y, z) directions are statistically independent, but this assumption is physically unrealistic in many situations. We present a hypothesis testing approach capable of determining if there is evidence of statistical dependence between positional coordinates in experimentally measured trajectories; if the hypothesis of independence between spatial coordinates is rejected, then a new model accounting for 2D (3D) interactions can and should be considered. Our hypothesis testing technique is robust, meaning it can detect interactions, even if the noise statistics are not well captured by the model. The approach is demonstrated on control simulations and on experimental data (directed transport of intraflagellar transport protein 88 homolog in the primary cilium).
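The paper's statistic is built for noisy single-particle models and is more elaborate than anything shown here; a stripped-down version of the underlying idea is a likelihood ratio test of zero correlation between the x and y displacement increments of a trajectory, assuming Gaussian increments. The trajectory below is simulated.

```python
# Minimal stand-in for testing statistical dependence between coordinates:
# a likelihood ratio test of zero correlation between x and y increments,
# assuming bivariate Gaussian increments. The paper's actual test is more
# sophisticated; this only sketches the idea. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 500
shared = rng.normal(size=n)              # shared force term couples x and y
dx = shared + rng.normal(scale=0.8, size=n)
dy = 0.6 * shared + rng.normal(scale=0.8, size=n)

r = np.corrcoef(dx, dy)[0, 1]
lrt = -n * np.log(1.0 - r**2)            # ~ chi^2(1) under independence
p = stats.chi2.sf(lrt, df=1)

print(f"r = {r:.3f}, LRT = {lrt:.1f}, p = {p:.2e}")
if p < 0.05:
    print("Reject independence: consider a model with coupled x-y dynamics.")
```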
A Bayesian Approach to the Paleomagnetic Conglomerate Test
NASA Astrophysics Data System (ADS)
Heslop, David; Roberts, Andrew P.
2018-02-01
The conglomerate test has served the paleomagnetic community for over 60 years as a means to detect remagnetizations. The test states that if a suite of clasts within a bed have uniformly random paleomagnetic directions, then the conglomerate cannot have experienced a pervasive event that remagnetized the clasts in the same direction. The current form of the conglomerate test is based on null hypothesis testing, which results in a binary "pass" (uniformly random directions) or "fail" (nonrandom directions) outcome. We have recast the conglomerate test in a Bayesian framework with the aim of providing more information concerning the level of support a given data set provides for a hypothesis of uniformly random paleomagnetic directions. Using this approach, we place the conglomerate test in a fully probabilistic framework that allows for inconclusive results when insufficient information is available to draw firm conclusions concerning the randomness or nonrandomness of directions. With our method, sample sets larger than those typically employed in paleomagnetism may be required to achieve strong support for a hypothesis of random directions. Given the potentially detrimental effect of unrecognized remagnetizations on paleomagnetic reconstructions, it is important to provide a means to draw statistically robust data-driven inferences. Our Bayesian analysis provides a means to do this for the conglomerate test.
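The Bayesian machinery of the paper is not reproduced here, but the quantity at its heart, how tightly clast directions cluster relative to uniform randomness, can be illustrated with a Monte Carlo version of the classical randomness test based on the resultant length of unit vectors. The directions below are simulated stand-ins for field data.

```python
# Monte Carlo test of uniformly random paleomagnetic directions: compare
# the resultant length R of observed unit vectors with R under simulated
# uniform directions. This is the classical frequentist ingredient of the
# conglomerate test, not the Bayesian version the paper develops.
import numpy as np

rng = np.random.default_rng(6)

def random_unit_vectors(n, rng):
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

n_clasts = 12
observed = random_unit_vectors(n_clasts, rng)  # stand-in for measured clasts
r_obs = np.linalg.norm(observed.sum(axis=0))

# Null distribution of R for n uniformly random directions.
r_null = np.array([
    np.linalg.norm(random_unit_vectors(n_clasts, rng).sum(axis=0))
    for _ in range(10_000)
])
p = np.mean(r_null >= r_obs)  # large R = clustered = possible remagnetization
print(f"R = {r_obs:.2f}, Monte Carlo p = {p:.3f}")
```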
Issues in Language Testing Research.
ERIC Educational Resources Information Center
Oller, John W., Jr., Ed.
Practical and technical aspects of language testing research are considered in 23 articles. Topical areas include: testing of general proficiency; the hypothesis of a single unitary factor accounting for reliable variance in tests; the structure of language proficiency; pros and cons of cloze testing; a new functional testing approach; and…
Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis
NASA Astrophysics Data System (ADS)
Střelec, Luboš
2011-09-01
The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, the weak form of the efficient market hypothesis can be verified with distribution tests, among others, i.e. tests of normality and/or graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test; the Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. Because these measures are based on moments of the data, this test has a zero breakdown value [2]; in other words, a single outlier can make the test worthless. The reason so many classical procedures are non-robust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests on financial data, which typically feature remote data points and other deviations from normality. The paper also discusses results of simulation studies of the power of these tests against selected alternatives. Based on the outcomes of the power simulation study, selected normality tests were then used to verify the weak form of efficiency in Central European stock markets.
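The zero breakdown value that motivates the paper is easy to demonstrate: a single outlier flips the Jarque-Bera verdict on an otherwise normal sample. The data are simulated, and the paper's own robust test statistics are not reproduced here.

```python
# Demonstrating the zero breakdown value of the Jarque-Bera normality
# test: one outlier is enough to reject normality for an otherwise
# Gaussian sample. Data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
returns = rng.normal(0.0, 0.01, size=500)    # well-behaved "returns"

jb, p = stats.jarque_bera(returns)
print(f"clean sample: JB = {jb:7.1f}, p = {p:.3f}")

contaminated = returns.copy()
contaminated[0] = 0.15                       # a single 15-sigma observation

jb_c, p_c = stats.jarque_bera(contaminated)
print(f"one outlier:  JB = {jb_c:7.1f}, p = {p_c:.3g}")
# Robust alternatives replace moment-based skewness and kurtosis with
# outlier-resistant estimates, as proposed in the paper.
```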
Krefeld-Schwalb, Antonia; Witte, Erich H.; Zenker, Frank
2018-01-01
In psychology as elsewhere, the main statistical inference strategy to establish empirical effects is null-hypothesis significance testing (NHST). The recent failure to replicate allegedly well-established NHST-results, however, implies that such results lack sufficient statistical power, and thus feature unacceptably high error-rates. Using data-simulation to estimate the error-rates of NHST-results, we advocate the research program strategy (RPS) as a superior methodology. RPS integrates Frequentist with Bayesian inference elements, and leads from a preliminary discovery against a (random) H0-hypothesis to a statistical H1-verification. Not only do RPS-results feature significantly lower error-rates than NHST-results, RPS also addresses key-deficits of a “pure” Frequentist and a standard Bayesian approach. In particular, RPS aggregates underpowered results safely. RPS therefore provides a tool to regain the trust the discipline had lost during the ongoing replicability-crisis. PMID:29740363
The potential for increased power from combining P-values testing the same hypothesis.
Ganju, Jitendra; Julie Ma, Guoguang
2017-02-01
The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect, but we do not know which one is the most powerful. Rather than relying on a single p-value, one can base inference on combined p-values from multiple prespecified test statistics. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
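Both combining functions named in the abstract are standard. scipy implements Fisher's combination directly, and the minimum-p rule can be Bonferroni-adjusted; the randomization-based reference distributions used in the paper are not reproduced here, and the p-values below are invented.

```python
# The two combining functions mentioned in the abstract, applied to
# hypothetical p-values from three prespecified test statistics. The
# paper calibrates these with randomization tests; here the standard
# chi-squared reference for Fisher's method is used instead.
from scipy.stats import combine_pvalues

pvals = [0.04, 0.11, 0.07]  # illustrative p-values for the same null

stat, p_fisher = combine_pvalues(pvals, method="fisher")
print(f"Fisher's combination: X2 = {stat:.2f}, p = {p_fisher:.4f}")

p_minp = min(min(pvals) * len(pvals), 1.0)  # Bonferroni-adjusted min-p
print(f"min-p (Bonferroni):   p = {p_minp:.4f}")
```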
Caricati, Luca
2017-01-01
The status-legitimacy hypothesis was tested by analyzing cross-national data about social inequality. Several indicators were used as indexes of social advantage: social class, personal income, and self-position in the social hierarchy. Moreover, inequality and freedom in nations, as indexed by the Gini coefficient and the human freedom index, were considered. Results from 36 nations worldwide showed no support for the status-legitimacy hypothesis. The perception that income distribution is fair tended to increase as social advantage increased. Moreover, national context increased the difference between advantaged and disadvantaged people in the perception of social fairness: contrary to the status-legitimacy hypothesis, disadvantaged people were more likely than advantaged people to perceive income differences as too large, and this difference increased in nations with greater freedom and equality. The implications for the status-legitimacy hypothesis are discussed.
Bug Distribution and Statistical Pattern Classification.
ERIC Educational Resources Information Center
Tatsuoka, Kikumi K.; Tatsuoka, Maurice M.
1987-01-01
The rule space model permits measurement of cognitive skill acquisition and error diagnosis. Further discussion introduces Bayesian hypothesis testing and bug distribution. An illustration involves an artificial intelligence approach to testing fractions and arithmetic. (Author/GDC)
A Novel Approach for Evaluating Carbamate Mixtures for Dose Additivity
Two mathematical approaches were used to test the hypothesis of dose-addition for a binary and a seven-chemical mixture of N-methyl carbamates, toxicologically similar chemicals that inhibit cholinesterase (ChE). In the more novel approach, mixture data were not included in the ana...
A default Bayesian hypothesis test for mediation.
Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan
2015-03-01
In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.
2017-01-01
Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
Constrained inversion as a hypothesis testing tool, what can we learn about the lithosphere?
NASA Astrophysics Data System (ADS)
Moorkamp, Max; Fishwick, Stewart; Jones, Alan G.
2017-04-01
Inversion of geophysical data constrained by a reference model is typically used to guide the inversion of low resolution data towards a geologically plausible solution. For example, a migrated seismic section can provide the location of lithological boundaries for potential field inversions. Here we consider the inversion of long-period magnetotelluric data constrained by models generated through surface wave inversion. In this case, we do not consider the surface wave model inherently better in any sense and want to guide the magnetotelluric inversion towards this model, but we want to test the hypothesis that both datasets can be explained by models with similar structure. If the hypothesis test is successful, i.e. we can fit the observations with a conductivity model that is structurally similar to the seismic model, we have found an alternative explanation compared to the individual inversion and can use the differences to learn about the resolution of the magnetotelluric data and improve our interpretation. Conversely, if the test refutes our hypothesis of coincident structure, we have found features in the models that are sensed fundamentally differently by the two methods, which is potentially instructive on the nature of the anomalies. We use an MT dataset acquired in central Botswana over the Okwa terrane and the adjacent Kaapvaal and Zimbabwe Cratons, together with a tomographic model for the region, to illustrate and test this approach. Here, various conductive structures have been identified that bridge the Moho. Furthermore, the thickness of the lithosphere inferred from the different methods differs. In both cases the question is to what extent this is a result of the ill-posed nature of inversion and to what extent these differences can be reconciled. This dataset is therefore an ideal test case for our hypothesis testing approach. Finally, we will demonstrate how we can use the results of the constrained inversion to extract conductivity-velocity relationships in the region and gain further insight into the composition and thermal structure of the lithosphere.
True or False: Do 5-Year-Olds Understand Belief?
ERIC Educational Resources Information Center
Fabricius, William V.; Boyer, Ty W.; Weimer, Amy A.; Carroll, Kathleen
2010-01-01
In 3 studies (N = 188) we tested the hypothesis that children use a perceptual access approach to reason about mental states before they understand beliefs. The perceptual access hypothesis predicts a U-shaped developmental pattern of performance in true belief tasks, in which 3-year-olds who reason about reality should succeed, 4- to 5-year-olds…
Evidence for the Fundamental Difference Hypothesis or Not?: Island Constraints Revisited
ERIC Educational Resources Information Center
Belikova, Alyona; White, Lydia
2009-01-01
This article examines how changes in linguistic theory affect the debate between the fundamental difference hypothesis and the access-to-Universal Grammar (UG) approach to SLA. With a focus on subjacency (Chomsky, 1973), a principle of UG that places constraints on "wh"-movement and that has frequently been taken as a test case for verifying…
Yang, Yang; DeGruttola, Victor
2016-01-01
Traditional resampling-based tests for homogeneity in covariance matrices across multiple groups resample residuals, that is, data centered by group means. These residuals do not share the same second moments when the null hypothesis is false, which makes them difficult to use in the setting of multiple testing. An alternative approach is to resample standardized residuals, data centered by group sample means and standardized by group sample covariance matrices. This approach, however, has been observed to inflate type I error when sample size is small or data are generated from heavy-tailed distributions. We propose to improve this approach by using robust estimation for the first and second moments. We discuss two statistics: the Bartlett statistic and a statistic based on eigen-decomposition of sample covariance matrices. Both statistics can be expressed in terms of standardized errors under the null hypothesis. These methods are extended to test homogeneity in correlation matrices. Using simulation studies, we demonstrate that the robust resampling approach provides comparable or superior performance, relative to traditional approaches, for single testing and reasonable performance for multiple testing. The proposed methods are applied to data collected in an HIV vaccine trial to investigate possible determinants, including vaccine status, vaccine-induced immune response level and viral genotype, of unusual correlation pattern between HIV viral load and CD4 count in newly infected patients. PMID:22740584
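As a reference point for the resampling tests discussed here, a naive permutation version of a covariance-homogeneity test (the Box M statistic) can be written in a few lines. This is the non-robust baseline; the paper's contribution, robust moment estimates applied to standardized residuals, is precisely what this sketch omits. Data are simulated.

```python
# Naive permutation test of covariance homogeneity across groups using
# the Box M statistic. The paper's method instead resamples standardized
# residuals with robust first- and second-moment estimates.
import numpy as np

rng = np.random.default_rng(8)

def box_m(groups):
    k = len(groups)
    ns = np.array([len(g) for g in groups])
    covs = [np.cov(g, rowvar=False) for g in groups]
    pooled = sum((n - 1) * s for n, s in zip(ns, covs)) / (ns.sum() - k)
    logdet = lambda s: np.linalg.slogdet(s)[1]
    return ((ns.sum() - k) * logdet(pooled)
            - sum((n - 1) * logdet(s) for n, s in zip(ns, covs)))

g1 = rng.normal(size=(40, 2))
g2 = rng.normal(size=(40, 2)) * 1.8   # inflated covariance in group 2
m_obs = box_m([g1, g2])

pooled_rows = np.vstack([g1, g2])
m_perm = []
for _ in range(2000):
    rng.shuffle(pooled_rows)          # shuffle rows, i.e. group labels
    m_perm.append(box_m([pooled_rows[:40], pooled_rows[40:]]))

p = np.mean(np.array(m_perm) >= m_obs)
print(f"Box M = {m_obs:.2f}, permutation p = {p:.4f}")
```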
Bayes factors for testing inequality constrained hypotheses: Issues with prior specification.
Mulder, Joris
2014-02-01
Several issues are discussed when testing inequality constrained hypotheses using a Bayesian approach. First, the complexity (or size) of the inequality constrained parameter spaces can be ignored. This is the case when using the posterior probability that the inequality constraints of a hypothesis hold, Bayes factors based on non-informative improper priors, and partial Bayes factors based on posterior priors. Second, the Bayes factor may not be invariant for linear one-to-one transformations of the data. This can be observed when using balanced priors which are centred on the boundary of the constrained parameter space with a diagonal covariance structure. Third, the information paradox can be observed. When testing inequality constrained hypotheses, the information paradox occurs when the Bayes factor of an inequality constrained hypothesis against its complement converges to a constant as the evidence for the first hypothesis accumulates while keeping the sample size fixed. This paradox occurs when using Zellner's g prior as a result of too much prior shrinkage. Therefore, two new methods are proposed that avoid these issues. First, partial Bayes factors are proposed based on transformed minimal training samples. These training samples result in posterior priors that are centred on the boundary of the constrained parameter space with the same covariance structure as in the sample. Second, a g prior approach is proposed by letting g go to infinity. This is possible because the Jeffreys-Lindley paradox is not an issue when testing inequality constrained hypotheses. A simulation study indicated that the Bayes factor based on this g prior approach converges fastest to the true inequality constrained hypothesis.
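One standard device in this literature is the encompassing-prior Bayes factor, estimated as the ratio of the posterior to the prior probability that the constraint holds; the toy conjugate-normal setup below is invented for illustration and sidesteps the prior-specification issues the paper analyses.

```python
# Encompassing-prior Bayes factor for an inequality constrained
# hypothesis H1: mu1 > mu2 against the unconstrained model:
# BF = Pr(constraint | data) / Pr(constraint | prior), both estimated
# by Monte Carlo. The setup (known sigma, vague normal priors, invented
# data summaries) is purely illustrative.
import numpy as np

rng = np.random.default_rng(9)

n1 = n2 = 30
ybar1, ybar2 = 0.35, 0.10       # hypothetical group means
tau2, sigma2 = 100.0, 1.0       # prior variance and known error variance

def posterior(ybar, n):
    var = 1.0 / (1.0 / tau2 + n / sigma2)   # conjugate normal update
    return var * (n * ybar / sigma2), np.sqrt(var)

m1, s1 = posterior(ybar1, n1)
m2, s2 = posterior(ybar2, n2)

draws = 200_000
post = np.mean(rng.normal(m1, s1, draws) > rng.normal(m2, s2, draws))
prior = np.mean(rng.normal(0, 10, draws) > rng.normal(0, 10, draws))  # ~0.5

print(f"BF(H1 vs unconstrained) ~= {post / prior:.2f}")
```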
Comparisons of Means Using Exploratory and Confirmatory Approaches
ERIC Educational Resources Information Center
Kuiper, Rebecca M.; Hoijtink, Herbert
2010-01-01
This article discusses comparisons of means using exploratory and confirmatory approaches. Three methods are discussed: hypothesis testing, model selection based on information criteria, and Bayesian model selection. Throughout the article, an example is used to illustrate and evaluate the two approaches and the three methods. We demonstrate that…
Hevey, David; Dolan, Michelle
2014-08-01
The congruency hypothesis posits that approach-orientated individuals are persuaded to engage in prevention behaviours by positively framed messages; conversely, negatively framed messages are more persuasive in encouraging those who are avoidance-orientated. A 2 (frame: loss vs gain) × 2 (motivation: avoidance vs approach) design examined the effects of skin cancer information on sun-protective intentions and free sunscreen sample requests among 533 young adults. Gain-framed messages had the strongest effect on sun-protective intentions for approach-oriented individuals, whereas loss-framed messages had the strongest effect on avoidance-oriented individuals. Message framing effects on precautionary sun behaviour intentions were moderated by motivational differences. © The Author(s) 2013.
Behavioral Approach in ADHD: Testing a Motivational Dysfunction Hypothesis
ERIC Educational Resources Information Center
Mitchell, John T.
2010-01-01
Objective: Etiological models of attention-deficit hyperactivity disorder (ADHD) increasingly support the role of a motivational dysfunction pathway, particularly for hyperactive-impulsive symptoms. Overactive behavioral approach tendencies are implicated among these motivational accounts. However, other externalizing disorder symptoms, such as…
ERIC Educational Resources Information Center
Puorro, Michelle
A study examined two first-grade classrooms implementing the whole language approach and two utilizing the basal reading approach to determine the differences, if any, between the treatments. The hypothesis was that the whole language reading approach, when combined with a phonics program, would not result in higher test scores on a standardized…
The Effect of DBAE Approach on Teaching Painting of Undergraduate Art Students
ERIC Educational Resources Information Center
Hedayat, Mina; Kahn, Sabzali Musa; Honarvar, Habibeh; Bakar, Syed Alwi Syed Abu; Samsuddin, Mohd Effindi
2013-01-01
The aim of this study is to implement a new method of teaching painting which uses the Discipline-Based Art Education (DBAE) approach for the undergraduate art students at Tehran University. In the current study, the quasi-experimental method was used to test the hypothesis three times (pre, mid and post-tests). Thirty students from two classes…
An Empirical Test of the Modified C Index and SII, O*NET, and DHOC Occupational Code Classifications
ERIC Educational Resources Information Center
Dik, Bryan J.; Hu, Ryan S. C.; Hansen, Jo-Ida C.
2007-01-01
The present study investigated new approaches for assessing Holland's congruence hypothesis by (a) developing and applying four sets of decision rules for assigning Holland codes of varying lengths for purposes of computing Eggerth and Andrew's modified C index; (b) testing the modified C index computed using these four approaches against Brown…
NASA Astrophysics Data System (ADS)
Menne, Matthew J.; Williams, Claude N., Jr.
2005-10-01
An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite. In a comparison of single test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from the serially complete and homogeneous component series. However, the evaluated composite series are not all equally susceptible to the presence of changepoints in their components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on the averaging of the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated. A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.
Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie
2013-01-01
Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, given the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
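A sketch of the min-p idea with three common two-sample statistics standing in for the pre-specified candidates; the permutation loop calibrates the minimum p-value so that the overall type I error rate stays at its designated value.

```python
import numpy as np
from scipy import stats

def min_p_permutation_test(x, y, n_perm=2000, seed=None):
    """Combine several candidate tests by taking the smallest p-value,
    then calibrate that minimum against its permutation distribution."""
    rng = np.random.default_rng(seed)
    tests = [
        lambda a, b: stats.ttest_ind(a, b, equal_var=False).pvalue,
        lambda a, b: stats.mannwhitneyu(a, b).pvalue,
        lambda a, b: stats.ks_2samp(a, b).pvalue,
    ]
    min_p = lambda a, b: min(t(a, b) for t in tests)
    obs = min_p(x, y)
    pooled, n = np.concatenate([x, y]), len(x)
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)  # relabel under "groups indistinguishable"
        exceed += min_p(perm[:n], perm[n:]) <= obs
    return (exceed + 1) / (n_perm + 1)  # permutation p-value of the min-p statistic
```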
Atilgan, Emre; Kilic, Dilek; Ertugrul, Hasan Murat
2017-06-01
The well-known health-led growth hypothesis claims a positive correlation between health expenditure and economic growth. The aim of this paper is to empirically investigate the health-led growth hypothesis for the Turkish economy. The bounds test approach, the autoregressive distributed lag (ARDL) approach and Kalman filter modeling are employed for the 1975-2013 period to examine the co-integration relationship between economic growth and health expenditure. The ARDL model is employed in order to investigate the long-term and short-term relationship between health expenditure and economic growth. The results show that a 1% increase in per-capita health expenditure will lead to a 0.434% increase in per-capita gross domestic product. These findings are also supported by the Kalman filter model's results. Our findings show that the health-led growth hypothesis is supported for Turkey.
Testing for Marshall-Lerner hypothesis: A panel approach
NASA Astrophysics Data System (ADS)
Azizan, Nur Najwa; Sek, Siok Kun
2014-12-01
The relationship between the real exchange rate and trade balances is documented in many theories. One of these theories is the so-called Marshall-Lerner condition. In this study, we seek to test the validity of the Marshall-Lerner hypothesis, i.e. to reveal whether the depreciation of the real exchange rate leads to an improvement in trade balances. We focus our study on the ASEAN-5 countries and their main trade partners, the U.S., Japan and China. The dynamic panel data pooled mean group (PMG) approach is used to test the Marshall-Lerner hypothesis among the ASEAN-5, between ASEAN-5 and the U.S., between ASEAN-5 and Japan, and between ASEAN-5 and China, respectively. The estimation is based on the autoregressive distributed lag (ARDL) model for the period 1970-2012. The paper concludes that the Marshall-Lerner condition does not hold in bilateral trades for the four groups of countries. The trade balances of the ASEAN-5 are mainly determined by the domestic income level and foreign production cost.
Statistical modeling, detection, and segmentation of stains in digitized fabric images
NASA Astrophysics Data System (ADS)
Gururajan, Arunkumar; Sari-Sarraf, Hamed; Hequet, Eric F.
2007-02-01
This paper describes a novel and automated system, based on a computer vision approach, for the objective evaluation of stain release on cotton fabrics. Digitized color images of the stained fabrics are obtained, and the pixel values in the color and intensity planes of these images are probabilistically modeled as a Gaussian Mixture Model (GMM). Stain detection is posed as a decision theoretic problem, where the null hypothesis corresponds to the absence of a stain. The null hypothesis and the alternate hypothesis mathematically translate into a first order GMM and a second order GMM, respectively. The parameters of the GMM are estimated using a modified Expectation-Maximization (EM) algorithm. Minimum Description Length (MDL) is then used as the test statistic to decide the verity of the null hypothesis. The stain is then segmented by a decision rule based on the probability map generated by the EM algorithm. The proposed approach was tested on a dataset of 48 fabric images soiled with stains of ketchup, corn oil, mustard, Ragu sauce, Revlon makeup and grape juice. The decision theoretic part of the algorithm produced a correct detection rate (true positive) of 93% and a false alarm rate of 5% on this set of images.
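The model-order decision can be sketched with scikit-learn's GaussianMixture, using BIC as a stand-in for the paper's MDL statistic (both are penalized-likelihood criteria); EM fitting and the per-pixel posterior probability map come with the estimator. Shapes and the decision threshold are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def detect_and_segment(pixels):
    """pixels: array of shape (n_pixels, n_channels), e.g. color + intensity.
    H0 (no stain) -> 1-component GMM; H1 (stain) -> 2-component GMM."""
    gmm1 = GaussianMixture(n_components=1).fit(pixels)
    gmm2 = GaussianMixture(n_components=2).fit(pixels)
    # Lower BIC wins; a positive margin favours the stain model.
    if gmm1.bic(pixels) - gmm2.bic(pixels) <= 0:
        return False, None
    # Segmentation: label each pixel by its most probable mixture component,
    # i.e. threshold the posterior probability map from the E-step.
    return True, gmm2.predict(pixels)
```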
A review and meta-analysis of the enemy release hypothesis in plant–herbivorous insect systems
Meijer, Kim; Schilthuizen, Menno; Beukeboom, Leo
2016-01-01
A suggested mechanism for the success of introduced non-native species is the enemy release hypothesis (ERH). Many studies have tested the predictions of the ERH using the community approach (native and non-native species studied in the same habitat) or the biogeographical approach (species studied in their native and non-native range), but results are highly variable, possibly due to the large variety of study systems incorporated. We therefore focused on one specific system: plants and their herbivorous insects. We performed a systematic review and compiled a large number (68) of datasets from studies comparing herbivorous insects on native and non-native plants using the community or biogeographical approach. We performed a meta-analysis to test the predictions of the ERH for insect diversity (number of species), insect load (number of individuals) and level of herbivory under both the community and the biogeographical approach. Under both approaches, insect diversity was significantly higher on native than on non-native plants. Insect load tended to be higher on native than non-native plants under the community approach only. Herbivory did not differ between native and non-native plants under the community approach, while too few data were available for testing the biogeographical approach. Our meta-analysis generally supports the predictions of the ERH under both the community and the biogeographical approach, but also shows that the outcome is strongly determined by the response measured and the approach applied. So far, very few studies apply both approaches simultaneously in a reciprocal manner, although this is arguably the best way to test the ERH. PMID:28028463
Bayes Factor Approaches for Testing Interval Null Hypotheses
ERIC Educational Resources Information Center
Morey, Richard D.; Rouder, Jeffrey N.
2011-01-01
Psychological theories are statements of constraint. The role of hypothesis testing in psychology is to test whether specific theoretical constraints hold in data. Bayesian statistics is well suited to the task of finding supporting evidence for constraint, because it allows for comparing evidence for 2 hypotheses against each other. One issue…
Exercising self-control increases approach motivation.
Schmeichel, Brandon J; Harmon-Jones, Cindy; Harmon-Jones, Eddie
2010-07-01
The present research tested the hypothesis that exercising self-control causes an increase in approach motivation. Study 1 found that exercising (vs. not exercising) self-control increases self-reported approach motivation. Study 2a identified a behavior--betting on low-stakes gambles--that is correlated with approach motivation but is relatively uncorrelated with self-control, and Study 2b observed that exercising self-control temporarily increases this behavior. Last, Study 3 found that exercising self-control facilitates the perception of a reward-relevant symbol (i.e., a dollar sign) but not a reward-irrelevant symbol (i.e., a percent sign). Altogether, these results support the hypothesis that exercising self-control temporarily increases approach motivation. Failures of self-control that follow from prior efforts at self-control (i.e., ego depletion) may be explained in part by increased approach motivation.
Information extraction from dynamic PS-InSAR time series using machine learning
NASA Astrophysics Data System (ADS)
van de Kerkhof, B.; Pankratius, V.; Chang, L.; van Swol, R.; Hanssen, R. F.
2017-12-01
Due to the increasing number of SAR satellites, with shorter repeat intervals and higher resolutions, SAR data volumes are exploding. Time series analyses of SAR data, i.e. Persistent Scatterer (PS) InSAR, enable the deformation monitoring of the built environment at an unprecedented scale, with hundreds of scatterers per km², updated weekly. Potential hazards, e.g. due to failure of aging infrastructure, can be detected at an early stage. Yet, this requires the operational data processing of billions of measurement points, over hundreds of epochs, updating this data set dynamically as new data come in, and testing whether points (start to) behave in an anomalous way. Moreover, the quality of PS-InSAR measurements is ambiguous and heterogeneous, which will yield false positives and false negatives. Such analyses are numerically challenging. Here we extract relevant information from PS-InSAR time series using machine learning algorithms. We cluster (group together) time series with similar behaviour, even though they may not be spatially close, such that the results can be used for further analysis. First we reduce the dimensionality of the dataset in order to be able to cluster the data, since applying clustering techniques to high dimensional datasets often yields unsatisfactory results. Our approach is to apply t-distributed Stochastic Neighbor Embedding (t-SNE), a machine learning algorithm for dimensionality reduction of high-dimensional data to a 2D or 3D map, and cluster this result using Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The results show that we are able to detect and cluster time series with similar behaviour, which is the starting point for more extensive analysis into the underlying driving mechanisms. The results of the methods are compared to conventional hypothesis testing as well as a Self-Organising Map (SOM) approach. Hypothesis testing is robust and takes the stochastic nature of the observations into account, but is time consuming. Therefore, we successively apply our machine learning approach with the hypothesis testing approach in order to benefit both from the reduced computation time of the machine learning approach and from the robust quality metrics of hypothesis testing. We acknowledge support from NASA AIST NNX15AG84G (PI V. Pankratius)
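A compact sketch of the two-stage pipeline (dimensionality reduction, then density-based clustering) with placeholder data; the t-SNE and DBSCAN settings below are arbitrary and would need tuning for real PS-InSAR series.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import DBSCAN

# Placeholder: one displacement time series per row (n_points x n_epochs).
rng = np.random.default_rng(1)
series = rng.normal(size=(500, 100))

# 1. Non-linear dimensionality reduction to a 2-D map.
embedding = TSNE(n_components=2, perplexity=30).fit_transform(series)

# 2. Density-based clustering of the embedded points; label -1 marks noise.
labels = DBSCAN(eps=3.0, min_samples=10).fit_predict(embedding)
print(f"{labels.max() + 1} clusters, {(labels == -1).sum()} noise points")
```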
NASA Technical Reports Server (NTRS)
Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.
2011-01-01
Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.
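The sequential test underneath is Wald's SPRT; the sketch below shows the generic threshold logic on a stream of log-likelihood ratios, which in the paper's setting would come from the two constrained filters rather than being supplied directly.

```python
import numpy as np

def sprt(log_lr_stream, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test: accumulate log-likelihood
    ratios (alternative over null) until a threshold set by the false-alarm
    rate alpha or the missed-detection rate beta is crossed."""
    upper = np.log((1 - beta) / alpha)  # decide for the alternative
    lower = np.log(beta / (1 - alpha))  # decide for the null
    llr, k = 0.0, 0
    for k, step in enumerate(log_lr_stream, start=1):
        llr += step
        if llr >= upper:
            return "alternative", k
        if llr <= lower:
            return "null", k
    return "undecided", k  # sampling budget exhausted before a decision
```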
In silico model-based inference: a contemporary approach for hypothesis testing in network biology.
Klinke, David J
2014-01-01
Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. © 2014 American Institute of Chemical Engineers.
Testing the single-state dominance hypothesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Álvarez-Rodríguez, R.; Moreno, O.; Moya de Guerra, E.
2013-12-30
We present a theoretical analysis of the single-state dominance hypothesis for the two-neutrino double-beta decay process. The theoretical framework is a proton-neutron QRPA based on a deformed Hartree-Fock mean field with BCS pairing correlations. We focus on the decays of ¹⁰⁰Mo, ¹¹⁶Cd and ¹²⁸Te. We do not find clear evidence for single-state dominance within the present approach.
Schiffer, Anne-Marike; Nevado-Holgado, Alejo J; Johnen, Andreas; Schönberger, Anna R; Fink, Gereon R; Schubotz, Ricarda I
2015-11-01
Action observation is known to trigger predictions of the ongoing course of action and is thus considered a hallmark example of predictive perception. A related task, which explicitly taps into the ability to predict actions based on their internal representations, is action segmentation; the task requires participants to demarcate where one action step is completed and another one begins. It thus benefits from a temporally precise prediction of the current action. Formation and exploitation of these temporal predictions of external events is now closely associated with a network including the basal ganglia and prefrontal cortex. Because decline of dopaminergic innervation leads to impaired function of the basal ganglia and prefrontal cortex in Parkinson's disease (PD), we hypothesised that PD patients would show increased temporal variability in the action segmentation task, especially under medication withdrawal (hypothesis 1). Another crucial aspect of action segmentation is its reliance on a semantic representation of actions. There is no evidence to suggest that action representations are substantially altered, or cannot be accessed, in non-demented PD patients. We therefore expected action segmentation judgments to follow the same overall patterns in PD patients and healthy controls (hypothesis 2), resulting in comparable segmentation profiles. Both hypotheses were tested with a novel classification approach. We present evidence for both hypotheses in the present study: classifier performance was slightly decreased when it was tested for its ability to predict the identity of movies segmented by PD patients, and a measure of normativity of response behaviour was decreased when patients segmented movies under medication withdrawal without access to an episodic memory of the sequence. This pattern of results is consistent with hypothesis 1. However, the classifier analysis also revealed that responses given by patients and controls create very similar action-specific patterns, thus delivering evidence in favour of hypothesis 2. In terms of methodology, the use of classifiers in the present study allowed us to establish similarity of behaviour across groups (hypothesis 2). This approach opens up a new avenue that standard statistical methods often fail to provide and is discussed in terms of its merits for measuring hypothesised similarities across study populations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Biostatistics Series Module 2: Overview of Hypothesis Testing.
Hazra, Avijit; Gogtay, Nithya
2016-01-01
Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. While this may be of utility in highlighting different aspects of the problem, merely reapplying different tests to the same issue in the hope of finding a P < 0.05 is a wrong use of statistics. Finally, it is becoming the norm that an estimate of the size of any effect, expressed with its 95% confidence interval, is required for meaningful interpretation of results. A large study is likely to have a small (and therefore "statistically significant") P value, but a "real" estimate of the effect would be provided by the 95% confidence interval. If the intervals overlap between two interventions, then the difference between them is not so clear-cut even if P < 0.05. The two approaches are now considered complementary to one another.
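The closing point, that the P value and the 95% confidence interval are complementary summaries, is easy to illustrate; the sketch below uses simulated samples and standard SciPy calls (all numbers illustrative).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = rng.normal(0.5, 1, 40), rng.normal(0.0, 1, 40)  # illustrative samples

print(f"P value: {stats.ttest_ind(a, b).pvalue:.4f}")  # significance summary

diff = a.mean() - b.mean()
se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
lo, hi = stats.t.interval(0.95, df=len(a) + len(b) - 2, loc=diff, scale=se)
print(f"effect: {diff:.2f}, 95% CI: ({lo:.2f}, {hi:.2f})")  # estimation summary
```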
A critique of statistical hypothesis testing in clinical research
Raha, Somik
2011-01-01
Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs for all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from a practical and philosophical perspective. In the philosophical critique, the two main worldviews of probability are that of the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning with different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability to an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of Aspirin on heart attacks in a sample population of doctors. Because a major reason for the prevalence of RCTs in academia is legislation requiring them, the ethics of legislating the use of statistical methods for clinical research is also examined. PMID:22022152
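For the kind of Bayesian decision framing advocated here, a beta-binomial analysis of two-arm event counts takes a few lines. The counts below follow the widely reported figures from the physicians' aspirin trial, but they serve purely as an illustration of the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# (events, participants) per arm; figures as commonly reported, illustrative.
aspirin, placebo = (104, 11_037), (189, 11_034)

def posterior_rate(k, n, draws=200_000, a=1, b=1):
    # Beta(1, 1) prior on the event rate; conjugate Beta posterior.
    return rng.beta(a + k, b + n - k, draws)

p_asp, p_plc = posterior_rate(*aspirin), posterior_rate(*placebo)

# Decision-relevant summaries instead of a reject/accept dichotomy:
print("P(aspirin rate < placebo rate) =", np.mean(p_asp < p_plc))
print("posterior mean risk reduction  =", np.mean(p_plc - p_asp))
```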
ERIC Educational Resources Information Center
Ngu, Bing Hiong; Yeung, Alexander Seeshing
2012-01-01
Holyoak and Koh (1987) and Holyoak (1984) propose four critical tasks for analogical transfer to occur in problem solving. A study was conducted to test this hypothesis by comparing a multiple components (MC) approach against worked examples (WE) in helping students to solve algebra word problems in chemistry classes. The MC approach incorporated…
Testing jumps via false discovery rate control.
Yen, Yu-Min
2013-01-01
Many recently developed nonparametric jump tests can be viewed as multiple hypothesis testing problems. For such multiple hypothesis tests, it is well known that controlling the type I error often produces a large proportion of erroneous rejections, and this situation becomes even worse when the jump occurrence is a rare event. To obtain more reliable results, we aim to control the false discovery rate (FDR), an efficient compound error measure for erroneous rejections in multiple testing problems. We perform the test via the Barndorff-Nielsen and Shephard (BNS) test statistic, and control the FDR with the Benjamini and Hochberg (BH) procedure. We provide asymptotic results for the FDR control. From simulations, we examine relevant theoretical results and demonstrate the advantages of controlling the FDR. The hybrid approach is then applied to empirical analysis on two benchmark stock indices with high frequency data.
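The BNS statistic itself is not reproduced here, but the FDR-controlling step is short; the sketch below assumes a vector of p-values from any collection of jump tests and applies the Benjamini-Hochberg step-up rule.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Reject the hypotheses with the k smallest p-values, where k is the
    largest index with p_(k) <= k * q / m (the BH step-up rule)."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    passes = p[order] <= q * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if passes.any():
        k = np.nonzero(passes)[0].max()  # largest index under the BH line
        reject[order[:k + 1]] = True
    return reject
```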
Rethinking developmental toxicity testing: Evolution or revolution?
Scialli, Anthony R; Daston, George; Chen, Connie; Coder, Prägati S; Euling, Susan Y; Foreman, Jennifer; Hoberman, Alan M; Hui, Julia; Knudsen, Thomas; Makris, Susan L; Morford, LaRonda; Piersma, Aldert H; Stanislaus, Dinesh; Thompson, Kary E
2018-06-01
Current developmental toxicity testing adheres largely to protocols suggested in 1966 involving the administration of test compound to pregnant laboratory animals. After more than 50 years of embryo-fetal development testing, are we ready to consider a different approach to human developmental toxicity testing? A workshop was held under the auspices of the Developmental and Reproductive Toxicology Technical Committee of the ILSI Health and Environmental Sciences Institute to consider how we might design developmental toxicity testing if we started over with 21st century knowledge and techniques (revolution). We first consider what changes to the current protocols might be recommended to make them more predictive for human risk (evolution). The evolutionary approach includes modifications of existing protocols and can include humanized models, disease models, more accurate assessment and testing of metabolites, and informed approaches to dose selection. The revolution could start with hypothesis-driven testing where we take what we know about a compound or close analog and answer specific questions using targeted experimental techniques rather than a one-protocol-fits-all approach. Central to the idea of hypothesis-driven testing is the concept that testing can be done at the level of mode of action. It might be feasible to identify a small number of key events at a molecular or cellular level that predict an adverse outcome and for which testing could be performed in vitro or in silico or, rarely, using limited in vivo models. Techniques for evaluating these key events exist today or are in development. Opportunities exist for refining and then replacing current developmental toxicity testing protocols using techniques that have already been developed or are within reach. © 2018 The Authors. Birth Defects Research Published by Wiley Periodicals, Inc.
Repeated Challenge Studies: A Comparison of Union-Intersection Testing with Linear Modeling.
ERIC Educational Resources Information Center
Levine, Richard A.; Ohman, Pamela A.
1997-01-01
Challenge studies can be used to see whether there is a causal relationship between an agent of interest and a response. An approach based on union-intersection testing is presented that allows researchers to examine observations on a single subject and test the hypothesis of interest. An application using psychological data is presented. (SLD)
Is "g" an Entity? A Japanese Twin Study Using Syllogisms and Intelligence Tests
ERIC Educational Resources Information Center
Shikishima, Chizuru; Hiraishi, Kai; Yamagata, Shinji; Sugimoto, Yutaro; Takemura, Ryo; Ozaki, Koken; Okada, Mitsuhiro; Toda, Tatsushi; Ando, Juko
2009-01-01
Using a behavioral genetic approach, we examined the validity of the hypothesis concerning the singularity of human general intelligence, the "g" theory, by analyzing data from two tests: the first consisted of 100 syllogism problems and the second a full-scale intelligence test. The participants were 448 Japanese young adult twins (167…
ERIC Educational Resources Information Center
Steacy, Laura M.; Elleman, Amy M.; Lovett, Maureen W.; Compton, Donald L.
2016-01-01
In English, gains in decoding skill do not map directly onto increases in word reading. However, beyond the Self-Teaching Hypothesis, little is known about the transfer of decoding skills to word reading. In this study, we offer a new approach to testing the transfer of specific decoding elements to word reading. To illustrate, we modeled word-reading…
The Latent Variable Approach as Applied to Transitive Reasoning
ERIC Educational Resources Information Center
Bouwmeester, Samantha; Vermunt, Jeroen K.; Sijtsma, Klaas
2012-01-01
We discuss the limitations of hypothesis testing using (quasi-) experiments in the study of cognitive development and suggest latent variable modeling as a viable alternative to experimentation. Latent variable models allow testing a theory as a whole, incorporating individual differences with respect to developmental processes or abilities in the…
Sample Size Estimation: The Easy Way
ERIC Educational Resources Information Center
Weller, Susan C.
2015-01-01
This article presents a simple approach to making quick sample size estimates for basic hypothesis tests. Although there are many sources available for estimating sample sizes, methods are not often integrated across statistical tests, levels of measurement of variables, or effect sizes. A few parameters are required to estimate sample sizes and…
Yang, Xiaowei; Nie, Kun
2008-03-15
Longitudinal data sets in biomedical research often consist of large numbers of repeated measures. In many cases, the trajectories do not look globally linear or polynomial, making it difficult to summarize the data or test hypotheses using standard longitudinal data analysis based on various linear models. An alternative approach is to apply the approaches of functional data analysis, which directly target the continuous nonlinear curves underlying discretely sampled repeated measures. For the purposes of data exploration, many functional data analysis strategies have been developed based on various schemes of smoothing, but fewer options are available for making causal inferences regarding predictor-outcome relationships, a common task seen in hypothesis-driven medical studies. To compare groups of curves, two testing strategies with good power have been proposed for high-dimensional analysis of variance: the Fourier-based adaptive Neyman test and the wavelet-based thresholding test. Using a smoking cessation clinical trial data set, this paper demonstrates how to extend the strategies for hypothesis testing into the framework of functional linear regression models (FLRMs) with continuous functional responses and categorical or continuous scalar predictors. The analysis procedure consists of three steps: first, apply the Fourier or wavelet transform to the original repeated measures; then fit a multivariate linear model in the transformed domain; and finally, test the regression coefficients using either adaptive Neyman or thresholding statistics. Since a FLRM can be viewed as a natural extension of the traditional multiple linear regression model, the development of this model and computational tools should enhance the capacity of medical statistics for longitudinal data.
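A sketch of the three-step procedure under stated simplifications: only the real parts of the Fourier coefficients are kept, the "model" is a two-group mean comparison, and the adaptive Neyman statistic appears in its basic max-over-truncation form (its null distribution would be calibrated in practice, e.g., by permutation). Function names are ours.

```python
import numpy as np

def adaptive_neyman(z):
    # Adaptive Neyman statistic for standardized coefficients z, roughly
    # N(0, 1) under the null: maximize over truncation points m the
    # normalized partial sums of (z_i^2 - 1).
    cum = np.cumsum(z**2 - 1.0)
    m = np.arange(1, len(z) + 1)
    return np.max(cum / np.sqrt(2.0 * m))

def two_group_curve_test(A, B):
    # A, B: (n_subjects, n_timepoints) repeated measures per group.
    fa = np.fft.rfft(A, axis=1).real   # step 1: transform to frequency domain
    fb = np.fft.rfft(B, axis=1).real   # (real parts only, for simplicity)
    se = np.sqrt(fa.var(axis=0, ddof=1) / len(A) +
                 fb.var(axis=0, ddof=1) / len(B))
    z = (fa.mean(axis=0) - fb.mean(axis=0)) / se  # step 2: coefficient-wise model
    return adaptive_neyman(z)                     # step 3: test the coefficients
```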
Observation-Oriented Modeling: Going beyond "Is It All a Matter of Chance"?
ERIC Educational Resources Information Center
Grice, James W.; Yepez, Maria; Wilson, Nicole L.; Shoda, Yuichi
2017-01-01
An alternative to null hypothesis significance testing is presented and discussed. This approach, referred to as observation-oriented modeling, is centered on model building in an effort to explicate the structures and processes believed to generate a set of observations. In terms of analysis, this novel approach complements traditional methods…
A risk-based approach to flood management decisions in a nonstationary world
NASA Astrophysics Data System (ADS)
Rosner, Ana; Vogel, Richard M.; Kirshen, Paul H.
2014-03-01
Traditional approaches to flood management in a nonstationary world begin with a null hypothesis test of "no trend" and its likelihood, with little or no attention given to the likelihood that we might ignore a trend if it really existed. Concluding a trend exists when it does not, or rejecting a trend when it exists, are known as type I and type II errors, respectively. Decision-makers are poorly served by statistical and/or decision methods that do not carefully consider both over- and under-preparation errors. Similarly, little attention is given to how to integrate uncertainty in our ability to detect trends into a flood management decision context. We show how trend hypothesis test results can be combined with an adaptation's infrastructure costs and damages avoided to provide a rational decision approach in a nonstationary world. The criterion of expected regret is shown to be a useful metric that integrates the statistical, economic, and hydrological aspects of the flood management problem in a nonstationary world.
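The regret calculation itself is elementary once costs are on a common scale; the sketch below scores the rule "adapt if and only if the trend test rejects", with all probabilities and costs illustrative rather than taken from the paper.

```python
def expected_regret(p_trend, power, alpha, cost_adapt, damage_unprepared):
    """Expected regret of adapting iff the trend test rejects: a type II
    error leaves us unprepared (damages net of the unspent adaptation cost),
    a type I error wastes the adaptation cost. Consistent, arbitrary units."""
    under = p_trend * (1 - power) * (damage_unprepared - cost_adapt)
    over = (1 - p_trend) * alpha * cost_adapt
    return under + over

# Weigh both error types across alpha levels (assumed power at each level),
# instead of fixing alpha = 0.05 by convention.
for alpha, power in [(0.01, 0.45), (0.05, 0.60), (0.10, 0.70)]:
    r = expected_regret(0.4, power, alpha, cost_adapt=10.0,
                        damage_unprepared=50.0)
    print(f"alpha={alpha:.2f}  expected regret={r:.2f}")
```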
An objective Bayesian analysis of a crossover design via model selection and model averaging.
Li, Dandan; Sivaganesan, Siva
2016-11-10
Inference about the treatment effect in a crossover design has received much attention over time owing to the uncertainty in the existence of the carryover effect and its impact on the estimation of the treatment effect. Adding to this uncertainty is that the existence of the carryover effect and its size may depend on the presence of the treatment effect and its size. We consider estimation and hypothesis testing about the treatment effect in a two-period crossover design, assuming a normally distributed response variable, and use an objective Bayesian approach to test the hypothesis about the treatment effect and to estimate its size when it exists, while accounting for the uncertainty about the presence of the carryover effect as well as the treatment and period effects. We evaluate and compare the performance of the proposed approach with a standard frequentist approach using simulated and real data. Copyright © 2016 John Wiley & Sons, Ltd.
Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie
2016-01-01
The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
On the widespread use of the Corrsin hypothesis in diffusion theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tautz, R. C.; Shalchi, A.
2010-12-15
In the past four decades, several nonlinear theories have been developed to describe (i) the motion of charged test particles through a turbulent magnetized plasma and (ii) the random walk of magnetic field lines. In many such theories, the so-called Corrsin independence hypothesis has been applied to enforce analytical tractability. In this note, it is shown that the Corrsin hypothesis is part of most nonlinear diffusion theories. In some cases, the Corrsin approximation is somewhat hidden, while in other cases a different name is used for the same approach. It is shown that even the researchers who criticized the application of this hypothesis have used it in their nonlinear diffusion theories. It is hoped that the present article will eliminate the recent confusion about the applicability and validity of the Corrsin hypothesis.
Crowell, Adrienne; Schmeichel, Brandon J
2016-01-01
Inspired by the elaborated intrusion theory of desire, the current research tested the hypothesis that persons higher in trait approach motivation process positive stimuli deeply, which enhances memory for them. Ninety-four undergraduates completed a measure of trait approach motivation, viewed positive or negative image slideshows in the presence or absence of a cognitive load, and one week later completed an image memory test. Higher trait approach motivation predicted better memory for the positive slideshow, but this memory boost disappeared under cognitive load. Approach motivation did not influence memory for the negative slideshow. The current findings support the idea that individuals higher in approach motivation spontaneously devote limited resources to processing positive stimuli.
[Dilemma of the null hypothesis in experimental tests of ecological hypotheses].
Li, Ji
2016-06-01
Experimental testing is one of the major ways of testing ecological hypotheses, though there are many arguments over the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis deduction model from Platt (1964) and thus stated that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P)'s non-decisivity prevent statistical null hypotheses from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and the alternative hypothesis H1′ (α′=1, β′=0) in ecological processes are different from classic physics, the ecological null hypothesis cannot be strictly tested experimentally either. These dilemmas of the null hypothesis could be relieved via the reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and two-tailed tests. However, statistical null hypothesis significance testing (NHST) should not be equated with the causal logical test of an ecological hypothesis. Hence, the findings and conclusions of methodological studies and experimental tests based on NHST are not always logically reliable.
A General Class of Test Statistics for Van Valen’s Red Queen Hypothesis
Wiltshire, Jelani; Huffer, Fred W.; Parker, William C.
2014-01-01
Van Valen’s Red Queen hypothesis states that within a homogeneous taxonomic group the age is statistically independent of the rate of extinction. The case of the Red Queen hypothesis being addressed here is when the homogeneous taxonomic group is a group of similar species. Since Van Valen’s work, various statistical approaches have been used to address the relationship between taxon age and the rate of extinction. We propose a general class of test statistics that can be used to test for the effect of age on the rate of extinction. These test statistics allow for a varying background rate of extinction and attempt to remove the effects of other covariates when assessing the effect of age on extinction. No model is assumed for the covariate effects. Instead we control for covariate effects by pairing or grouping together similar species. Simulations are used to compare the power of the statistics. We apply the test statistics to data on Foram extinctions and find that age has a positive effect on the rate of extinction. A derivation of the null distribution of one of the test statistics is provided in the supplementary material. PMID:24910489
2010-09-30
…planktonic ecosystems. OBJECTIVES: Our objectives in this work are to 1) visualize and quantify herbivorous copepod feeding in the laboratory, and 2) apply these methods in the field to observe the dynamics of copepod feeding in situ. In particular we intend to test the "feeding sorties" hypothesis vs. the "in situ feeding" hypothesis regarding the location and timing of copepod feeding and vertical migration. APPROACH: Previous…
Data-Driven Learning of Speech Acts Based on Corpora of DVD Subtitles
ERIC Educational Resources Information Center
Kitao, S. Kathleen; Kitao, Kenji
2013-01-01
Data-driven learning (DDL) is an inductive approach to language learning in which students study examples of authentic language and use them to find patterns of language use. This inductive approach to learning has the advantages of being learner-centered, encouraging hypothesis testing and learner autonomy, and helping develop learning skills.…
Bayesian meta-analysis of Cronbach's coefficient alpha to evaluate informative hypotheses.
Okada, Kensuke
2015-12-01
This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as 'alpha of this test is greater than 0.8' or 'alpha of one form of a test is greater than the others.' The proposed method enables direct evaluation of these informative hypotheses. To this end, a Bayes factor is calculated to evaluate the informative hypothesis against its complement. It allows researchers to summarize the evidence provided by previous studies in favor of their informative hypothesis. The proposed approach can be seen as a natural extension of the Bayesian meta-analysis of coefficient alpha recently proposed in this journal (Brannick and Zhang, 2013). The proposed method is illustrated through two meta-analyses of real data that evaluate different kinds of informative hypotheses about the superpopulation: one is that alpha of a particular test is above the criterion value, and the other is that alphas among different test versions have ordered relationships. Informative hypotheses are supported by the data in both cases, suggesting that the proposed approach is promising for application. Copyright © 2015 John Wiley & Sons, Ltd.
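Once posterior draws of the superpopulation alpha are available from some meta-analytic model, the Bayes factor of an informative hypothesis against its complement reduces to a ratio of posterior to prior odds. The draws and the prior mass below are placeholders, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder posterior for the superpopulation coefficient alpha; in a real
# meta-analysis these draws would come from the fitted Bayesian model.
draws = rng.normal(0.83, 0.02, 100_000)

prior_p = 0.2                  # P(alpha > 0.8) under an assumed uniform(0, 1) prior
post_p = np.mean(draws > 0.8)  # posterior mass on the informative hypothesis

bf = (post_p / (1 - post_p)) / (prior_p / (1 - prior_p))  # H1 vs its complement
print(f"P(alpha > 0.8 | data) = {post_p:.3f}, Bayes factor = {bf:.1f}")
```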
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
NASA Astrophysics Data System (ADS)
Islam, M. T.; Trevorah, R. M.; Appadoo, D. R. T.; Best, S. P.; Chantler, C. T.
2017-04-01
We present methodology for the first FTIR measurements of ferrocene using dilute wax solutions for dispersion and to preserve non-crystallinity; a new method for removal of channel spectra interference for high quality data; and a consistent approach for the robust estimation of a defined uncertainty for advanced structural χr² analysis and mathematical hypothesis testing. While some of these issues have been investigated previously, the combination of novel approaches gives markedly improved results. Methods for addressing these in the presence of a modest signal and how to quantify the quality of the data irrespective of preprocessing for subsequent hypothesis testing are applied to the FTIR spectra of Ferrocene (Fc) and deuterated ferrocene (dFc, Fc-d10) collected at the THz/Far-IR beam-line of the Australian Synchrotron at operating temperatures of 7 K through 353 K.
Testing competing forms of the Milankovitch hypothesis: A multivariate approach
NASA Astrophysics Data System (ADS)
Kaufmann, Robert K.; Juselius, Katarina
2016-02-01
We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical mechanisms postulated to drive glacial cycles. They show that the climate variables are driven partly by solar insolation, determining the timing and magnitude of glaciations and terminations, and partly by internal feedback dynamics, pushing the climate variables away from equilibrium. We argue that the latter is consistent with a weak form of the Milankovitch hypothesis and that it should be restated as follows: internal climate dynamics impose perturbations on glacial cycles that are driven by solar insolation. Our results show that these perturbations are likely caused by slow adjustment between land ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship between solar insolation and a single climate variable are likely to suffer from omitted variable bias.
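As a rough illustration of the model class (not the authors' exact 10-variable specification), statsmodels can fit a cointegrated VAR with exogenous insolation forcing; variable names and dimensions below are placeholders.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(0)
# Placeholder series standing in for climate variables and insolation.
climate = pd.DataFrame(rng.normal(size=(400, 3)),
                       columns=["ice_volume", "co2", "sst"])
insolation = pd.DataFrame(rng.normal(size=(400, 1)), columns=["june_65N"])

# Error-correction form: short-run dynamics in differences plus adjustment
# toward long-run cointegrating relations, with insolation as exogenous driver.
model = VECM(climate, exog=insolation, k_ar_diff=2, coint_rank=1,
             deterministic="co")
res = model.fit()
print(res.alpha)  # speeds of adjustment toward the long-run relations
```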
Bayesian models based on test statistics for multiple hypothesis testing problems.
Ji, Yuan; Lu, Yiling; Mills, Gordon B
2008-04-01
We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check whether our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. Finally, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
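A minimal sketch of the two ingredients, modeling the test statistics directly and controlling a Bayesian FDR, under an assumed two-group normal mixture whose parameters are fixed here rather than estimated (in practice they would be fit, e.g., by EM).

```python
import numpy as np
from scipy import stats

def posterior_null_prob(z, pi0=0.9, mu1=2.5, sd1=1.0):
    # Two-group model: z ~ pi0 * N(0, 1) + (1 - pi0) * N(mu1, sd1).
    # Returns the posterior probability of the null for each statistic.
    f0 = pi0 * stats.norm.pdf(z, 0.0, 1.0)
    f1 = (1 - pi0) * stats.norm.pdf(z, mu1, sd1)
    return f0 / (f0 + f1)

def bayesian_fdr_reject(z, fdr=0.05, **model):
    # Reject the statistics with the smallest posterior null probabilities
    # while their running average (the estimated FDR) stays below `fdr`.
    p0 = posterior_null_prob(np.asarray(z), **model)
    order = np.argsort(p0)
    avg = np.cumsum(p0[order]) / np.arange(1, len(p0) + 1)
    k = int(np.searchsorted(avg > fdr, True))  # first index exceeding fdr
    reject = np.zeros(len(p0), dtype=bool)
    reject[order[:k]] = True
    return reject
```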
Mudge, Joseph F; Penny, Faith M; Houlahan, Jeff E
2012-12-01
Setting optimal significance levels that minimize Type I and Type II errors allows for more transparent and well-considered statistical decision making than the traditional α = 0.05 significance level. We use the optimal α approach to re-assess conclusions reached by three recently published tests of the pace-of-life syndrome hypothesis, which attempts to unify occurrences of different physiological, behavioral, and life-history characteristics under one theory across different scales of biological organization. While some of the conclusions reached using optimal α were consistent with those previously reported using the traditional α = 0.05 threshold, opposing conclusions were also frequently reached. The optimal α approach reduced the probabilities of Type I and Type II errors and ensured that statistical significance was associated with biological relevance. Biologists should consider their choice of α carefully when conducting null hypothesis significance tests, as consistent reliance on the traditional but arbitrary α = 0.05 significance level carries serious disadvantages. Copyright © 2012 WILEY Periodicals, Inc.
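The optimal-α idea can be sketched for a one-sided z-test: choose the α that minimizes the (optionally weighted) sum of Type I and Type II error probabilities for a given effect size and sample size. The effect size, n, and weights below are an invented example, not the authors' analysis:

```python
import numpy as np
from scipy import optimize, stats

def total_error(alpha, effect, n, w1=1.0, w2=1.0):
    # Type II error of a one-sided z-test run at level alpha
    beta = stats.norm.cdf(stats.norm.ppf(1 - alpha) - effect * np.sqrt(n))
    return w1 * alpha + w2 * beta

res = optimize.minimize_scalar(total_error, bounds=(1e-6, 0.5),
                               args=(0.3, 50), method="bounded")
print(f"optimal alpha ~= {res.x:.3f}")   # generally not 0.05
```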
A Gaussian Model-Based Probabilistic Approach for Pulse Transit Time Estimation.
Jang, Dae-Geun; Park, Seung-Hun; Hahn, Minsoo
2016-01-01
In this paper, we propose a new probabilistic approach to pulse transit time (PTT) estimation using a Gaussian distribution model. It is motivated by the hypothesis that PTTs normalized by RR intervals follow a Gaussian distribution. To support the hypothesis, we demonstrate the effects of arterial compliance on the normalized PTTs using the Moens-Korteweg equation, and we observe a Gaussian distribution of the normalized PTTs in real data. To estimate the PTT under this hypothesis, we first assume that R-waves in the electrocardiogram (ECG) are correctly identified. The R-waves limit the search ranges for detecting pulse peaks in the photoplethysmogram (PPG) and synchronize the results with cardiac beats--i.e., the peaks of the PPG are extracted within the corresponding RR interval of the ECG as pulse peak candidates. Their probabilities of being the actual pulse peak are then calculated using a Gaussian probability function. The parameters of the Gaussian function are automatically updated when a new pulse peak is identified; this update makes the probability function adaptive to variations in cardiac cycles. Finally, the pulse peak is identified as the candidate with the highest probability. The proposed approach is tested on a database in which ECG and PPG waveforms were collected simultaneously during a submaximal bicycle ergometer exercise test. The results are promising, suggesting that the method provides simple but more accurate PTT estimation in real applications.
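A highly simplified sketch of the candidate-scoring step; the initial parameters, smoothing factor, and candidate values are assumptions for illustration, and the paper's exact update rule may differ:

```python
# Score pulse-peak candidates by a Gaussian model of RR-normalized PTTs,
# then adaptively update the Gaussian parameters with the winning candidate.
import numpy as np

mu, var = 0.3, 0.01**2   # assumed initial normalized-PTT mean and variance
ALPHA = 0.9              # assumed smoothing factor for the adaptive update

def best_candidate(ptt_candidates, rr_interval):
    global mu, var
    norm = np.asarray(ptt_candidates) / rr_interval
    # Gaussian likelihood of each normalized-PTT candidate
    p = np.exp(-(norm - mu)**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    i = int(np.argmax(p))
    # the update keeps the model tracking cardiac-cycle variation
    mu = ALPHA * mu + (1 - ALPHA) * norm[i]
    var = ALPHA * var + (1 - ALPHA) * (norm[i] - mu)**2
    return ptt_candidates[i]

print(best_candidate([0.24, 0.31, 0.42], rr_interval=1.0))
```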
The evolution of bacterial cell size: the internal diffusion-constraint hypothesis.
Gallet, Romain; Violle, Cyrille; Fromin, Nathalie; Jabbour-Zahab, Roula; Enquist, Brian J; Lenormand, Thomas
2017-07-01
Size is one of the most important biological traits influencing organismal ecology and evolution. However, we know little about the drivers of body size evolution in unicellular organisms. A long-term evolution experiment (Lenski's LTEE), in which Escherichia coli adapts to a simple glucose medium, has shown that not only the growth rate and fitness of the bacterium increase over time but also its cell size. This increase in size contradicts the prominent 'external diffusion constraint' (EDC) theory, which predicts that cell size should have evolved toward smaller cells. Among several scenarios, we propose and test an alternative 'internal diffusion-constraint' (IDC) hypothesis for cell size evolution. A change in cell volume affects metabolite concentrations in the cytoplasm. The IDC states that a higher metabolism can be achieved by reducing the molecular traffic time inside the cell, by increasing its volume. To test this hypothesis, we studied a population from the LTEE. We show that bigger cells with greater growth and CO2 production rates and lower mass-to-volume ratio were selected over time in the LTEE. These results are consistent with the IDC hypothesis. This novel hypothesis offers a promising approach for understanding the evolutionary constraints on cell size.
Shifflett, Benjamin; Huang, Rong; Edland, Steven D
2017-01-01
Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model-independent genotypic χ2 test, the efficiency-robust MAX statistic, which corrects for multiple comparisons but with some loss of power, or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but with some loss of power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ2 and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
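For concreteness, here is a sketch of the Armitage (Cochran-Armitage) test for multiplicative trend with the standard 0/1/2 genotype scores; the genotype counts are invented:

```python
# Cochran-Armitage trend test: T = sum s_i (r_i - n_i R/N), with variance
# (R/N)(1 - R/N) [sum s_i^2 n_i - (sum s_i n_i)^2 / N].
import numpy as np
from scipy import stats

def armitage_trend(cases, controls, scores=(0, 1, 2)):
    cases = np.asarray(cases, float)
    controls = np.asarray(controls, float)
    s = np.asarray(scores, float)
    n = cases + controls                 # genotype totals
    N, R = n.sum(), cases.sum()
    p = R / N
    num = np.sum(s * (cases - n * p))
    var = p * (1 - p) * (np.sum(s**2 * n) - np.sum(s * n)**2 / N)
    z = num / np.sqrt(var)
    return z, 2 * stats.norm.sf(abs(z))

# hypothetical counts for genotypes aa, Aa, AA
z, pval = armitage_trend(cases=[30, 60, 40], controls=[60, 70, 30])
print(f"z = {z:.2f}, p = {pval:.4f}")
```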
Mody, R K; Meyer, S; Trees, E; White, P L; Nguyen, T; Sowadsky, R; Henao, O L; Lafon, P C; Austin, J; Azzam, I; Griffin, P M; Tauxe, R V; Smith, K; Williams, I T
2014-05-01
We investigated an outbreak of 396 Salmonella enterica serotype I 4,5,12:i:- infections to determine the source. After 7 weeks of extensive hypothesis-generation interviews, no refined hypothesis was formed. Nevertheless, a case-control study was initiated. Subsequently, an iterative hypothesis-generation approach used by a single interviewing team identified brand A not-ready-to-eat frozen pot pies as a likely vehicle. The case-control study, modified to assess this new hypothesis, along with product testing indicated that the turkey variety of pot pies was responsible. Review of product labels identified inconsistent language regarding preparation, and the cooking instructions included undefined microwave wattage categories. Surveys found that most patients did not follow the product's cooking instructions and did not know their oven's wattage. The manufacturer voluntarily recalled pot pies and improved the product's cooking instructions. This investigation highlights the value of careful hypothesis-generation and the risks posed by frozen not-ready-to-eat microwavable foods.
Moscoso del Prado Martín, Fermín
2013-12-01
I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Firing the Executive: When an Analytic Approach to Problem Solving Helps and Hurts
ERIC Educational Resources Information Center
Aiello, Daniel A.; Jarosz, Andrew F.; Cushen, Patrick J.; Wiley, Jennifer
2012-01-01
There is a general assumption that a more controlled or more focused attentional state is beneficial for most cognitive tasks. However, there has been a growing realization that creative problem solving tasks, such as the Remote Associates Task (RAT), may benefit from a less controlled solution approach. To test this hypothesis, in a 2x2 design,…
A brain network instantiating approach and avoidance motivation.
Spielberg, Jeffrey M; Miller, Gregory A; Warren, Stacie L; Engels, Anna S; Crocker, Laura D; Banich, Marie T; Sutton, Bradley P; Heller, Wendy
2012-09-01
Research indicates that dorsolateral prefrontal cortex (DLPFC) is important for pursuing goals, and areas of DLPFC are differentially involved in approach and avoidance motivation. Given the complexity of the processes involved in goal pursuit, DLPFC is likely part of a network that includes orbitofrontal cortex (OFC), cingulate, amygdala, and basal ganglia. This hypothesis was tested with regard to one component of goal pursuit, the maintenance of goals in the face of distraction. Examination of connectivity with motivation-related areas of DLPFC supported the network hypothesis. Differential patterns of connectivity suggest a distinct role for DLPFC areas, with one involved in selecting approach goals, one in selecting avoidance goals, and one in selecting goal pursuit strategies. Finally, differences in trait motivation moderated connectivity between DLPFC and OFC, suggesting that this connectivity is important for instantiating motivation. Copyright © 2012 Society for Psychophysiological Research.
Simmons, L W
2003-07-01
The sexy-sperm hypothesis predicts that females obtain indirect benefits for their offspring via polyandry, in the form of increased fertilization success for their sons. I use a quantitative genetic approach to test the sexy-sperm hypothesis using the field cricket Teleogryllus oceanicus. Previous studies of this species have shown considerable phenotypic variation in fertilization success when two or more males compete. There were high broad-sense heritabilities for both paternity and polyandry. Patterns of genotypic variance were consistent with X-linked inheritance and/or maternal effects on these traits. The genetic architecture therefore precludes the evolution of polyandry via a sexy-sperm process. Thus the positive genetic correlation between paternity in sons and polyandry in daughters predicted by the sexy-sperm hypothesis was absent. There was significant heritable variation in the investment by females in ovaries and by males in the accessory gland. Surprisingly, there was a very strong genetic correlation between these two traits. The significance of this genetic correlation for the coevolution of male seminal products and polyandry is discussed.
NASA Technical Reports Server (NTRS)
Phatak, A. V.
1980-01-01
A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that the pilot acceptance level, or opinion rating, of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.
Custodio, Tomas; Garcia, Jose; Markovski, Jasmina; McKay Gifford, James; Hristovski, Kiril D; Olson, Larry W
2017-12-15
The underlying hypothesis of this study was that pseudo-equilibrium and column testing conditions would provide the same sorbent ranking trends, although the values of the sorbents' performance descriptors (e.g. sorption capacity) may vary because of the different kinetics and competition effects induced by the two testing approaches. To address this hypothesis, nano-enabled hybrid media were fabricated and their removal performance was assessed for two model contaminants under multi-point batch pseudo-equilibrium and continuous-flow conditions. Calculation of simultaneous removal capacity (SRC) indices demonstrated that the more resource-demanding continuous-flow tests generate the same performance rankings as those obtained by conducting the simpler pseudo-equilibrium tests. Furthermore, the continuous overlap between the 98% confidence boundaries for each SRC index trend not only validated the hypothesis that both testing conditions provide the same ranking trends, but also indicated that the SRC indices are statistically the same for each medium, regardless of the employed method. In scenarios where rapid screening of new media is required to obtain the best-performing synthesis formulation, use of pseudo-equilibrium tests proved to be reliable. Considering that kinetics-induced effects on sorption capacity must not be neglected, the more resource-demanding column tests could be conducted only with the top-performing media that exhibit the highest sorption capacity. Copyright © 2017 Elsevier B.V. All rights reserved.
Behavioral Treatment of Pseudobulbar Affect: A Case Report.
Perotti, Laurence P; Cummings, Latiba D; Mercado, Janyna
2016-04-01
To determine if it is possible to successfully treat pseudobulbar affect (PBA) using a behavioral approach. Two experiments were conducted, each using a double-reversal design with the same single subject. The first experiment tested the hypothesis that the rate of PBA could be controlled by manipulation of its consequences. The second experiment tested the hypothesis that use of a self-control procedure would control the rate of PBA. Rate of PBA could not be controlled by consequence manipulation, but it could be controlled through use of a self-control procedure. Pending confirmatory research, behavioral interventions utilizing self-control procedures should be considered in patients with PBA. © 2016 Wiley Periodicals, Inc.
Matthews, Luke J.; Tehrani, Jamie J.; Jordan, Fiona M.; Collard, Mark; Nunn, Charles L.
2011-01-01
Background: Archaeologists and anthropologists have long recognized that different cultural complexes may have distinct descent histories, but they have lacked analytical techniques capable of easily identifying such incongruence. Here, we show how Bayesian phylogenetic analysis can be used to identify incongruent cultural histories. We employ the approach to investigate Iranian tribal textile traditions. Methods: We used Bayes factor comparisons in a phylogenetic framework to test two models of cultural evolution: the hierarchically integrated system hypothesis and the multiple coherent units hypothesis. In the hierarchically integrated system hypothesis, a core tradition of characters evolves through descent with modification and characters peripheral to the core are exchanged among contemporaneous populations. In the multiple coherent units hypothesis, a core tradition does not exist. Rather, there are several cultural units consisting of sets of characters that have different histories of descent. Results: For the Iranian textiles, the Bayesian phylogenetic analyses supported the multiple coherent units hypothesis over the hierarchically integrated system hypothesis. Our analyses suggest that pile-weave designs represent a distinct cultural unit that has a different phylogenetic history compared to other textile characters. Conclusions: The results from the Iranian textiles are consistent with the available ethnographic evidence, which suggests that the commercial rug market has influenced pile-rug designs but not the techniques or designs incorporated in the other textiles produced by the tribes. We anticipate that Bayesian phylogenetic tests for inferring cultural units will be of great value for researchers interested in studying the evolution of cultural traits including language, behavior, and material culture. PMID:21559083
Modeling Cross-Situational Word–Referent Learning: Prior Questions
Yu, Chen; Smith, Linda B.
2013-01-01
Both adults and young children possess powerful statistical computation capabilities—they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of associative learning. This article describes a series of simulation studies and analyses designed to understand the different learning mechanisms posited by the 2 classes of models and their relation to each other. Variants of a hypothesis-testing model and a simple or dumb associative mechanism were examined under different specifications of information selection, computation, and decision. Critically, these 3 components of the models interact in complex ways. The models illustrate a fundamental tradeoff between amount of data input and powerful computations: With the selection of more information, dumb associative models can mimic the powerful learning that is accomplished by hypothesis-testing models with fewer data. However, because of the interactions among the component parts of the models, the associative model can mimic various hypothesis-testing models, producing the same learning patterns but through different internal components. The simulations argue for the importance of a compositional approach to human statistical learning: the experimental decomposition of the processes that contribute to statistical learning in human learners and models with the internal components that can be evaluated independently and together. PMID:22229490
Raggad, Bechir
2018-05-01
This study investigates the existence of a long-run relationship between CO2 emissions, economic growth, energy use, and urbanization in Saudi Arabia over the period 1971-2014. The autoregressive distributed lag (ARDL) approach with structural breaks, where the breaks are identified with the recently developed impulse saturation break tests, is applied to conduct the analysis. The bounds test result supports the existence of a long-run relationship among the variables. The existence of the environmental Kuznets curve (EKC) hypothesis has also been tested. The results reveal the non-validity of the EKC hypothesis for Saudi Arabia, as the relationship between GDP and pollution is positive in both the short and the long run. Moreover, energy use increases pollution in both the short and the long run in the country. In contrast, the results show a negative and significant impact of urbanization on carbon emissions in Saudi Arabia, which means that urban development is not an obstacle to the improvement of environmental quality. Consequently, policy-makers in Saudi Arabia should consider efficiency enhancement, frugality in energy consumption, and especially increasing the share of renewable energies in the total energy mix.
Quantification and Visualization of Variation in Anatomical Trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amenta, Nina; Datar, Manasi; Dirksen, Asger
This paper presents two approaches to quantifying and visualizing variation in datasets of trees. The first approach localizes subtrees in which significant population differences are found through hypothesis testing and sparse classifiers on subtree features. The second approach visualizes the global metric structure of datasets through low-distortion embedding into hyperbolic planes in the style of multidimensional scaling. A case study is made on a dataset of airway trees in relation to Chronic Obstructive Pulmonary Disease.
The structure of affective action representations: temporal binding of affective response codes.
Eder, Andreas B; Müsseler, Jochen; Hommel, Bernhard
2012-01-01
Two experiments examined the hypothesis that preparing an action with a specific affective connotation involves the binding of this action to an affective code reflecting this connotation. This integration into an action plan should lead to a temporary occupation of the affective code, which should impair the concurrent representation of affectively congruent events, such as the planning of another action with the same valence. This hypothesis was tested with a dual-task setup that required a speeded choice between approach- and avoidance-type lever movements after having planned and before having executed an evaluative button press. In line with the code-occupation hypothesis, slower lever movements were observed when the lever movement was affectively compatible with the prepared evaluative button press than when the two actions were affectively incompatible. Lever movements related to approach and avoidance and evaluative button presses thus seem to share a code that represents affective meaning. A model of affective action control that is based on the theory of event coding is discussed.
The role of control groups in mutagenicity studies: matching biological and statistical relevance.
Hauschke, Dieter; Hothorn, Torsten; Schäfer, Juliane
2003-06-01
The statistical test of the conventional hypothesis of "no treatment effect" is commonly used in the evaluation of mutagenicity experiments. Failing to reject the hypothesis often leads to the conclusion in favour of safety. The major drawback of this indirect approach is that what is controlled by a prespecified level alpha is the probability of erroneously concluding hazard (producer risk). However, the primary concern of safety assessment is the control of the consumer risk, i.e. limiting the probability of erroneously concluding that a product is safe. In order to restrict this risk, safety has to be formulated as the alternative, and hazard, i.e. the opposite, has to be formulated as the hypothesis. The direct safety approach is examined for the case when the corresponding threshold value is expressed either as a fraction of the population mean for the negative control, or as a fraction of the difference between the positive and negative controls.
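The direct safety approach can be sketched as a shifted-null one-sided test: hazard is the null, so H0: mu_t - mu_c >= theta is tested against H1: mu_t - mu_c < theta, with theta a fraction of the negative-control mean. The simulated data, threshold fraction, and pooled-df shortcut below are assumptions, not the paper's worked example:

```python
# Direct safety test: reject the hazard null only when the observed
# treatment-control difference falls convincingly below the threshold theta,
# so the consumer risk (falsely concluding safety) is controlled at alpha.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
neg_control = rng.normal(100, 10, 20)
treatment = rng.normal(103, 10, 20)
theta = 0.10 * neg_control.mean()        # assumed safety threshold

diff = treatment.mean() - neg_control.mean()
se = np.sqrt(treatment.var(ddof=1) / 20 + neg_control.var(ddof=1) / 20)
t = (diff - theta) / se
p = stats.t.cdf(t, df=38)                # simple pooled-df approximation
print(f"t = {t:.2f}, one-sided p = {p:.3f}  (small p => conclude safety)")
```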
The Global Phylogeography of Lyssaviruses - Challenging the 'Out of Africa' Hypothesis
Fooks, Anthony R.; Marston, Denise A.; Garcia-R, Juan C.
2016-01-01
Rabies virus kills tens of thousands of people globally each year, especially in resource-limited countries. Yet, there are genetically- and antigenically-related lyssaviruses, all capable of causing the disease rabies, circulating globally among bats without causing conspicuous disease outbreaks. The species richness and greater genetic diversity of African lyssaviruses, along with the lack of antibody cross-reactivity among them, has led to the hypothesis that Africa is the origin of lyssaviruses. This hypothesis was tested using a probabilistic phylogeographical approach. The nucleoprotein gene sequences from 153 representatives of 16 lyssavirus species, collected between 1956 and 2015, were used to develop a phylogenetic tree which incorporated relevant geographic and temporal data relating to the viruses. In addition, complete genome sequences from all 16 (putative) species were analysed. The most probable ancestral distribution for the internal nodes was inferred using three different approaches and was confirmed by analysis of complete genomes. These results support a Palearctic origin for lyssaviruses (posterior probability = 0.85), challenging the ‘out of Africa’ hypothesis, and suggest three independent transmission events to the Afrotropical region, representing the three phylogroups that form the three major lyssavirus clades. PMID:28036390
Ensembles vs. information theory: supporting science under uncertainty
NASA Astrophysics Data System (ADS)
Nearing, Grey S.; Gupta, Hoshin V.
2018-05-01
Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot - even partially - quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method - in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty. We conclude by offering some suggestions about how this proposed philosophy of science suggests new ways to conceptualize and construct simulation models of complex, dynamical systems.
Current-State Constrained Filter Bank for Wald Testing of Spacecraft Conjunctions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2012-01-01
We propose a filter bank consisting of an ordinary current-state extended Kalman filter, and two similar but constrained filters: one is constrained by a null hypothesis that the miss distance between two conjuncting spacecraft is inside their combined hard body radius at the predicted time of closest approach, and one is constrained by an alternative complementary hypothesis. The unconstrained filter is the basis of an initial screening for close approaches of interest. Once the initial screening detects a possibly risky conjunction, the unconstrained filter also governs measurement editing for all three filters, and predicts the time of closest approach. The constrained filters operate only when conjunctions of interest occur. The computed likelihoods of the innovations of the two constrained filters form a ratio for a Wald sequential probability ratio test. The Wald test guides risk mitigation maneuver decisions based on explicit false alarm and missed detection criteria. Since only current-state Kalman filtering is required to compute the innovations for the likelihood ratio, the present approach does not require the mapping of probability density forward to the time of closest approach. Instead, the hard-body constraint manifold is mapped to the filter update time by applying a sigma-point transformation to a projection function. Although many projectors are available, we choose one based on Lambert-style differential correction of the current-state velocity. We have tested our method using a scenario based on the Magnetospheric Multi-Scale mission, scheduled for launch in late 2014. This mission involves formation flight in highly elliptical orbits of four spinning spacecraft equipped with antennas extending 120 meters tip-to-tip. Eccentricities range from 0.82 to 0.91, and close approaches generally occur in the vicinity of perigee, where rapid changes in geometry may occur. Testing the method using two 12,000-case Monte Carlo simulations, we found the method achieved a missed detection rate of 0.1%, and a false alarm rate of 2%.
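The decision stage can be sketched as a classical Wald SPRT driven by a stream of per-update log-likelihood ratios (here simulated); the error rates echo the 2% false alarm and 0.1% missed detection figures reported above, but the code is illustrative, not the flight software:

```python
# Wald SPRT: thresholds are set directly by the desired false-alarm (p_fa)
# and missed-detection (p_md) rates; accumulate log-likelihood ratios until
# one threshold is crossed.
import numpy as np

def wald_sprt(log_lr_stream, p_fa=0.02, p_md=0.001):
    upper = np.log((1 - p_md) / p_fa)    # accept H1 at or above this
    lower = np.log(p_md / (1 - p_fa))    # accept H0 at or below this
    s = 0.0
    for k, llr in enumerate(log_lr_stream, start=1):
        s += llr
        if s >= upper:
            return "accept H1", k
        if s <= lower:
            return "accept H0", k
    return "undecided", len(log_lr_stream)

rng = np.random.default_rng(3)
llrs = rng.normal(0.2, 1.0, 500)         # simulated per-update LLRs under H1
print(wald_sprt(llrs))
```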
A new modeling and inference approach for the Systolic Blood Pressure Intervention Trial outcomes.
Yang, Song; Ambrosius, Walter T; Fine, Lawrence J; Bress, Adam P; Cushman, William C; Raj, Dominic S; Rehman, Shakaib; Tamariz, Leonardo
2018-06-01
Background/aims: In clinical trials with time-to-event outcomes, usually the significance tests and confidence intervals are based on a proportional hazards model. Thus, the temporal pattern of the treatment effect is not directly considered. This could be problematic if the proportional hazards assumption is violated, as such violation could impact both interim and final estimates of the treatment effect. Methods: We describe the application of inference procedures developed recently in the literature for time-to-event outcomes when the treatment effect may or may not be time-dependent. The inference procedures are based on a new model which contains the proportional hazards model as a sub-model. The temporal pattern of the treatment effect can then be expressed and displayed. The average hazard ratio is used as the summary measure of the treatment effect. The test of the null hypothesis uses adaptive weights that often lead to improvement in power over the log-rank test. Results: Without needing to assume proportional hazards, the new approach yields results consistent with previously published findings in the Systolic Blood Pressure Intervention Trial. It provides a visual display of the time course of the treatment effect. At four of the five scheduled interim looks, the new approach yields smaller p values than the log-rank test. The average hazard ratio and its confidence interval indicates a treatment effect nearly a year earlier than a restricted mean survival time-based approach. Conclusion: When the hazards are proportional between the comparison groups, the new methods yield results very close to the traditional approaches. When the proportional hazards assumption is violated, the new methods continue to be applicable and can potentially be more sensitive to departure from the null hypothesis.
Hypothesis testing for differentially correlated features.
Sheng, Elisa; Witten, Daniela; Zhou, Xiao-Hua
2016-10-01
In a multivariate setting, we consider the task of identifying features whose correlations with the other features differ across conditions. Such correlation shifts may occur independently of mean shifts, or differences in the means of the individual features across conditions. Previous approaches for detecting correlation shifts consider features simultaneously, by computing a correlation-based test statistic for each feature. However, since correlations involve two features, such approaches do not lend themselves to identifying which feature is the culprit. In this article, we instead consider a serial testing approach, by comparing columns of the sample correlation matrix across two conditions, and removing one feature at a time. Our method provides a novel perspective and favorable empirical results compared with competing approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
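One illustrative way to compare a feature's correlation column across two conditions is Fisher's z-transform of each pairwise correlation; this sketch is not the authors' exact statistic, and the serial (remove-one-feature-and-repeat) loop is omitted:

```python
# Aggregate evidence that feature j's correlations with all other features
# differ between conditions, via Fisher z-transformed correlation columns.
import numpy as np

def column_shift_stat(X1, X2, j):
    n1, n2 = len(X1), len(X2)
    r1 = np.corrcoef(X1, rowvar=False)[j]
    r2 = np.corrcoef(X2, rowvar=False)[j]
    mask = np.arange(r1.size) != j            # exclude self-correlation
    z = (np.arctanh(r1[mask]) - np.arctanh(r2[mask])) \
        / np.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    return np.mean(z**2)                      # large => correlation shift at j

rng = np.random.default_rng(4)
X1 = rng.normal(size=(60, 5))
X2 = rng.normal(size=(60, 5))
X2[:, 0] += 0.8 * X2[:, 1]                    # induce a shift at feature 0
print(column_shift_stat(X1, X2, 0))
```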
Explorations in statistics: hypothesis tests and P values.
Curran-Everett, Douglas
2009-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of Explorations in Statistics delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what we observe in the experiment to what we expect to see if the null hypothesis is true. The P value associated with the magnitude of that test statistic answers this question: if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got? Although statisticians continue to stress the limitations of hypothesis tests, there are two realities we must acknowledge: hypothesis tests are ingrained within science, and the simple test of a null hypothesis can be useful. As a result, it behooves us to explore the notions of hypothesis tests, test statistics, and P values.
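The exploration invited here is easy to run: compute a t-like statistic from observed data, then ask how often a null world produces something at least as extreme. All numbers below are simulated for illustration:

```python
# The P value as a proportion: fraction of null-world test statistics at
# least as extreme as the observed one.
import numpy as np

rng = np.random.default_rng(5)
sample = rng.normal(0.4, 1, 30)                     # "observed" data
t_obs = sample.mean() / (sample.std(ddof=1) / np.sqrt(30))

null_t = np.empty(10000)
for i in range(null_t.size):                        # null world: true mean 0
    s = rng.normal(0, 1, 30)
    null_t[i] = s.mean() / (s.std(ddof=1) / np.sqrt(30))

p = np.mean(np.abs(null_t) >= abs(t_obs))
print(f"t = {t_obs:.2f}, simulated two-sided P = {p:.4f}")
```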
Evaluation of the Air Void Analyzer
2013-07-01
... lack of measurement would help explain the difference in values shown. Brief descriptions of other unpublished testing (Wang et al. 2008, CTL Group) ... structure measurements taken from the controlled laboratory mixtures. A three-phase approach was used to evaluate the machine. First, a global ... method. Hypothesis testing using t-statistics was performed to increase understanding of the data collected globally in terms of the processes used for ...
ERIC Educational Resources Information Center
Vasu, Ellen S.; Elmore, Patricia B.
The effects of the violation of the assumption of normality, coupled with the condition of multicollinearity, upon the outcome of testing the hypothesis Beta equals zero in the two-predictor regression equation are investigated. A Monte Carlo approach was utilized in which three different distributions were sampled for two sample sizes over…
Genome-wide detection of intervals of genetic heterogeneity associated with complex traits
Llinares-López, Felipe; Grimm, Dominik G.; Bodenham, Dean A.; Gieraths, Udo; Sugiyama, Mahito; Rowan, Beth; Borgwardt, Karsten
2015-01-01
Motivation: Genetic heterogeneity, the fact that several sequence variants give rise to the same phenotype, is a phenomenon that is of the utmost interest in the analysis of complex phenotypes. Current approaches for finding regions in the genome that exhibit genetic heterogeneity suffer from at least one of two shortcomings: (i) they require the definition of an exact interval in the genome that is to be tested for genetic heterogeneity, potentially missing intervals of high relevance, or (ii) they suffer from an enormous multiple hypothesis testing problem due to the large number of potential candidate intervals being tested, which results in either many false positives or a lack of power to detect true intervals. Results: Here, we present an approach that overcomes both problems: it allows one to automatically find all contiguous sequences of single nucleotide polymorphisms in the genome that are jointly associated with the phenotype. It also solves both the inherent computational efficiency problem and the statistical problem of multiple hypothesis testing, which are both caused by the huge number of candidate intervals. We demonstrate on Arabidopsis thaliana genome-wide association study data that our approach can discover regions that exhibit genetic heterogeneity and would be missed by single-locus mapping. Conclusions: Our novel approach can contribute to the genome-wide discovery of intervals that are involved in the genetic heterogeneity underlying complex phenotypes. Availability and implementation: The code can be obtained at: http://www.bsse.ethz.ch/mlcb/research/bioinformatics-and-computational-biology/sis.html. Contact: felipe.llinares@bsse.ethz.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072488
An Herbal Derivative as the Basis for a New Approach to Treating Post-Traumatic Osteoarthritis
2017-09-01
AWARD NUMBER: W81XWH-15-1-0397. ... responsible for charging tRNAs with the amino acid proline. The goal of this grant is to test the hypothesis that EPRS inhibitors will provide the basis for a new approach to treating post-traumatic osteoarthritis.
NASA Astrophysics Data System (ADS)
Leonard, William H.; Cavana, Gordon R.; Lowery, Lawrence F.
Discretion - the exercise of independent judgment - was observed to be lacking in most commercially available laboratory investigations for high school biology. An Extended Discretion (ED) laboratory approach was developed and tested experimentally against the BSCS Green Version laboratory program, using ten classes of 10th-grade biology in a suburban California high school. Five teachers were each assigned one experimental and one control group. The primary differences between the two approaches were that the BSCS was more prescriptive and directive than the ED approach and the ED approach increased discretionary demands upon the student over the school year. A treatment verification procedure showed statistically significant differences between the two approaches. The hypothesis under test was that when high school biology students are taught laboratory concepts under comparatively high discretionary demands, they would perform as well as or better than a similar group of students taught with BSCS Green Version investigations. A second hypothesis was that teachers would prefer to use the ED approach over the BSCS approach for their future classes. A t analysis between experimental and control groups for each teacher was employed. There were significant differences in favor of the ED group on laboratory report scores for three teachers and no differences for two teachers. There were significant differences in favor of the ED group on laboratory concepts quiz scores for three teachers, no differences for one teacher, and significant differences in favor of the BSCS group for only one teacher. A t analysis of teacher evaluation of the two approaches showed a significant teacher preference overall for the ED approach. Both experimental hypotheses were accepted. The ED approach was observed to be difficult for students at first, but it was found to be a workable and productive means of teaching laboratory concepts in biology which also required extensive use of individual student discretion.
Metastatic melanoma moves on: translational science in the era of personalized medicine.
Levesque, Mitchell P; Cheng, Phil F; Raaijmakers, Marieke I G; Saltari, Annalisa; Dummer, Reinhard
2017-03-01
Progress in understanding and treating metastatic melanoma is the result of decades of basic and translational research as well as the development of better in vitro tools for modeling the disease. Here, we review the latest therapeutic options for metastatic melanoma and the known genetic and non-genetic mechanisms of resistance to these therapies, as well as the in vitro toolbox that has provided the greatest insights into melanoma progression. These include next-generation sequencing technologies and more complex 2D and 3D cell culture models to functionally test the data generated by genomics approaches. The combination of hypothesis generating and hypothesis testing paradigms reviewed here will be the foundation for the next phase of metastatic melanoma therapies in the coming years.
ERIC Educational Resources Information Center
Marsh, Herbert W.; Hau, Kit-Tai; Wen, Zhonglin
2004-01-01
Goodness-of-fit (GOF) indexes provide "rules of thumb" - recommended cutoff values for assessing fit in structural equation modeling. Hu and Bentler (1999) proposed a more rigorous approach to evaluating decision rules based on GOF indexes and, on this basis, proposed new and more stringent cutoff values for many indexes. This article discusses…
On Some Assumptions of the Null Hypothesis Statistical Testing
ERIC Educational Resources Information Center
Patriota, Alexandre Galvão
2017-01-01
Bayesian and classical statistical approaches are based on different types of logical principles. In order to avoid mistaken inferences and misguided interpretations, the practitioner must respect the inference rules embedded into each statistical method. Ignoring these principles leads to the paradoxical conclusions that the hypothesis…
Delay discounting moderates the effect of food reinforcement on energy intake among non-obese women
Rollins, Brandi Y.; Dearing, Kelly K.; Epstein, Leonard H.
2011-01-01
Recent theoretical approaches to food intake hypothesize that eating represents a balance between reward-driven motivation to eat and inhibitory executive-function processes; however, this hypothesis remains to be tested. The objective of the current study was to test the hypothesis that the motivation to eat, operationalized by the relative reinforcing value (RRV) of food, and inhibitory processes, assessed by delay discounting (DD), interact to influence energy intake in an ad libitum eating task. Female subjects (n = 24) completed a DD of money procedure, an RRV task, and an ad libitum eating task in counterbalanced sessions. The RRV of food predicted total energy intake; however, the effect of the RRV of food on energy intake was moderated by DD. Women higher in DD and RRV of food consumed greater total energy, whereas women higher in RRV of food but lower in DD consumed less total energy. Our findings support the hypothesis that reinforcing value and executive-function-mediated processes interactively influence food consumption. PMID:20678532
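The moderation analysis described maps onto an ordinary regression with an interaction term. A minimal sketch with simulated data; the variable names `rrv`, `dd`, and `intake` and all coefficients are assumptions, not the study's data:

```python
# Moderation as an interaction term: a significant rrv:dd coefficient means
# the effect of RRV on intake depends on the level of DD.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 24
df = pd.DataFrame({"rrv": rng.normal(size=n), "dd": rng.normal(size=n)})
df["intake"] = 500 + 80 * df.rrv + 60 * df.rrv * df.dd + rng.normal(0, 50, n)

model = smf.ols("intake ~ rrv * dd", data=df).fit()
print(model.params)    # inspect the rrv:dd interaction estimate
```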
New Approaches to Robust Confidence Intervals for Location: A Simulation Study.
1984-06-01
... obtain a denominator for the test statistic. Those statistics based on location estimates derived from Hampel's redescending influence function or ... defined an influence function for a test in terms of the behavior of its P-values when the data are sampled from a model distribution modified by point ... proposal could be used for interval estimation as well as hypothesis testing; the extension is immediate. Once an influence function has been defined ...
1990-09-12
... electronics reading to the next. To test this hypothesis and the suitability of EBL for acquiring schemas, I have implemented an automated reader/learner as ... used. For example, testing the utility of a kidnapping schema using several readings about kidnapping can only go so far toward establishing the ... the cost of carrying the new rules while processing unrelated material will be underestimated. The present research tests the utility of new schemas in ...
Testing a single regression coefficient in high dimensional linear models
Zhong, Ping-Shou; Li, Runze; Wang, Hansheng; Tsai, Chih-Ling
2017-01-01
In linear regression models with high dimensional data, the classical z-test (or t-test) for testing the significance of each single regression coefficient is no longer applicable. This is mainly because the number of covariates exceeds the sample size. In this paper, we propose a simple and novel alternative by introducing the Correlated Predictors Screening (CPS) method to control for predictors that are highly correlated with the target covariate. Accordingly, the classical ordinary least squares approach can be employed to estimate the regression coefficient associated with the target covariate. In addition, we demonstrate that the resulting estimator is consistent and asymptotically normal even if the random errors are heteroscedastic. This enables us to apply the z-test to assess the significance of each covariate. Based on the p-value obtained from testing the significance of each covariate, we further conduct multiple hypothesis testing by controlling the false discovery rate at the nominal level. Then, we show that the multiple hypothesis testing achieves consistent model selection. Simulation studies and empirical examples are presented to illustrate the finite sample performance and the usefulness of the proposed method, respectively. PMID:28663668
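A schematic version of the CPS idea (the choice of k, the screening rule, and the data are assumptions; the paper's procedure is more careful): control for the k predictors most correlated with the target covariate, fit ordinary least squares, and take a z-test p-value, which can then feed Benjamini-Hochberg FDR control across covariates:

```python
# CPS-style p-value for one target covariate in a p >> n regression.
import numpy as np
from scipy import stats

def cps_pvalue(X, y, j, k=5):
    corr = np.abs(np.corrcoef(X, rowvar=False)[j])
    corr[j] = -np.inf                        # exclude the target itself
    ctrl = np.argsort(corr)[-k:]             # k most-correlated predictors
    Z = np.column_stack([np.ones(len(X)), X[:, ctrl], X[:, j]])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    sigma2 = resid @ resid / (len(y) - Z.shape[1])
    cov = sigma2 * np.linalg.inv(Z.T @ Z)
    z = beta[-1] / np.sqrt(cov[-1, -1])      # z-test for the target coef
    return 2 * stats.norm.sf(abs(z))

rng = np.random.default_rng(7)
n, p = 100, 200
X = rng.normal(size=(n, p))
y = 0.5 * X[:, 0] + rng.normal(size=n)
print(cps_pvalue(X, y, j=0))   # small p expected; all-j p-values -> BH/FDR
```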
Protecting Privacy Using k-Anonymity
El Emam, Khaled; Dankar, Fida Kamal
2008-01-01
Objective: There is increasing pressure to share health information and even make it publicly available. However, such disclosures of personal health information raise serious privacy concerns. To alleviate such concerns, it is possible to anonymize the data before disclosure. One popular anonymization approach is k-anonymity. There have been no evaluations of the actual re-identification probability of k-anonymized data sets. Design: Through a simulation, we evaluated the re-identification risk of k-anonymization and three different improvements on three large data sets. Measurement: Re-identification probability is measured under two different re-identification scenarios. Information loss is measured by the commonly used discernability metric. Results: For one of the re-identification scenarios, k-anonymity consistently over-anonymizes data sets, with this over-anonymization being most pronounced with small sampling fractions. Over-anonymization results in excessive distortions to the data (i.e., high information loss), making the data less useful for subsequent analysis. We found that a hypothesis testing approach provided the best control over re-identification risk and reduced the extent of information loss compared to baseline k-anonymity. Conclusion: Guidelines are provided on when to use the hypothesis testing approach instead of baseline k-anonymity. PMID:18579830
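As background, checking whether a released table is k-anonymous over a set of quasi-identifiers is a small computation: every combination of quasi-identifier values must occur at least k times. The data frame, column names, and k below are invented:

```python
# k-anonymity check: the smallest equivalence class over the
# quasi-identifiers must have at least k records.
import pandas as pd

def is_k_anonymous(df, quasi_identifiers, k):
    return df.groupby(list(quasi_identifiers)).size().min() >= k

df = pd.DataFrame({
    "age_band":  ["30-39", "30-39", "40-49", "40-49", "40-49"],
    "zip3":      ["021",   "021",   "021",   "021",   "021"],
    "diagnosis": ["A",     "B",     "A",     "C",     "B"],
})
print(is_k_anonymous(df, ["age_band", "zip3"], k=2))   # True
```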
Hardy, Nate B.; Otto, Sarah P.
2014-01-01
Evolutionary biologists have often assumed that ecological generalism comes at the expense of less intense exploitation of specific resources and that this trade-off will promote the evolution of ecologically specialized daughter species. Using a phylogenetic comparative approach with butterflies as a model system, we test hypotheses that incorporate changes in niche breadth and location into explanations of the taxonomic diversification of insect herbivores. Specifically, we compare the oscillation hypothesis, where speciation is driven by host-plant generalists giving rise to specialist daughter species, to the musical chairs hypothesis, where speciation is driven by host-plant switching, without changes in niche breadth. Contrary to the predictions of the oscillation hypothesis, we recover a negative relationship between host-plant breadth and diversification rate and find that changes in host breadth are seldom coupled to speciation events. By contrast, we present evidence for a positive relationship between rates of host switching and butterfly diversification, consonant with the musical chairs hypothesis. These results suggest that the costs of trophic generalism in plant-feeding insects may have been overvalued and that transitions from generalists to ecological specialists may not be an important driver of speciation in general. PMID:25274368
A functional perspective on social marketing: insights from Israel's bicycle helmet campaign.
Ressler, W H; Toledo, E
1997-01-01
This article examines the functional approach to attitudes for its potential contribution to improving models of attitude-behavior consistency and to demonstrate its potential application to social marketing. To this end, a study of children's attitudes toward bicycle helmets is reported on and its results examined. The study was undertaken to plan Israel's first-ever media campaign to encourage the use of helmets by children. Responses of the 783 Israeli children (ages 7 to 14 years) who participated in the study are analyzed to test the hypothesis generated by this application of functional theory--that children's attitudes toward wearing bicycle helmets serve primarily an expressive function. The results suggest cautious support for the functional hypothesis. In conclusion, possible extensions of this approach to other areas of social marketing are discussed.
Chen, Yvonne Y; Caplan, Jeremy B
2017-01-01
During study trials of a recognition memory task, alpha (∼10 Hz) oscillations decrease, and concurrently, theta (4-8 Hz) oscillations increase when later memory is successful versus unsuccessful (subsequent memory effect). Likewise, at test, reduced alpha and increased theta activity are associated with successful memory (retrieval success effect). Here we take an individual-differences approach to test three hypotheses about theta and alpha oscillations in verbal, old/new recognition, measuring the difference in oscillations between hit trials and miss trials. First, we test the hypothesis that theta and alpha oscillations have a moderately mutually exclusive relationship; but no support for this hypothesis was found. Second, we test the hypothesis that theta oscillations explain not only memory effects within participants, but also individual differences. Supporting this prediction, durations of theta (but not alpha) oscillations at study and at test correlated significantly with d' across participants. Third, we test the hypothesis that theta and alpha oscillations reflect familiarity and recollection processes by comparing oscillation measures to ERPs that are implicated in familiarity and recollection. The alpha-oscillation effects correlated with some ERP measures, but inversely, suggesting that the actions of alpha oscillations on memory processes are distinct from the roles of familiarity- and recollection-linked ERP signals. The theta-oscillation measures, despite differentiating hits from misses, did not correlate with any ERP measure; thus, theta oscillations may reflect elaborative processes not tapped by recollection-related ERPs. Our findings are consistent with alpha oscillations reflecting visual inattention, which can modulate memory, and with theta oscillations supporting recognition memory in ways that complement the most commonly studied ERPs.
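For reference, the sensitivity measure d′ used in the individual-differences correlations can be computed from hit and false-alarm counts; the correction scheme and counts below are assumptions for illustration:

```python
# d' = z(hit rate) - z(false-alarm rate), with a simple correction so that
# rates of exactly 0 or 1 do not produce infinite z-scores.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    n_old = hits + misses
    n_new = false_alarms + correct_rejections
    h = (hits + 0.5) / (n_old + 1)            # corrected hit rate
    fa = (false_alarms + 0.5) / (n_new + 1)   # corrected false-alarm rate
    return norm.ppf(h) - norm.ppf(fa)

print(d_prime(hits=40, misses=10, false_alarms=12, correct_rejections=38))
```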
2017-04-06
... user community and of accommodating advancing software applications by the vendors. Research Design: My approach to this project was to conduct ... design descriptions, requirements specifications, test documentation, interface requirement specifications, product specifications, and software ...
Faculty Development for Educators: A Realist Evaluation
ERIC Educational Resources Information Center
Sorinola, Olanrewaju O.; Thistlethwaite, Jill; Davies, David; Peile, Ed
2015-01-01
The effectiveness of faculty development (FD) activities for educators in UK medical schools remains underexplored. This study used a realist approach to evaluate FD and to test the hypothesis that motivation, engagement and perception are key mechanisms of effective FD activities. The authors observed and interviewed 33 course participants at one…
Movement Regulation of Handsprings on Vault
ERIC Educational Resources Information Center
Heinen, Thomas; Vinken, Pia M.; Jeraj, Damian; Velentzas, Konstantinos
2013-01-01
Purpose: Visual information is utilized in gymnastics vaulting. The question remains as to which informational sources are used to regulate handspring performance. The purpose of this study was to examine springboard and vaulting table position as informational sources in gymnastics vaulting. The hypothesis tested was that the approach-run and…
Three Strategies for the Critical Use of Statistical Methods in Psychological Research
ERIC Educational Resources Information Center
Campitelli, Guillermo; Macbeth, Guillermo; Ospina, Raydonal; Marmolejo-Ramos, Fernando
2017-01-01
We present three strategies to replace the null hypothesis statistical significance testing approach in psychological research: (1) visual representation of cognitive processes and predictions, (2) visual representation of data distributions and choice of the appropriate distribution for analysis, and (3) model comparison. The three strategies…
Application of Transformations in Parametric Inference
ERIC Educational Resources Information Center
Brownstein, Naomi; Pensky, Marianna
2008-01-01
The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…
Performance Indicators: Information in Search of a Valid and Reliable Use.
ERIC Educational Resources Information Center
Carrigan, Sarah D.; Hackett, E. Raymond
1998-01-01
Examined the usefulness of performance indicators in campus decision making at 20 institutions with Carnegie Baccalaureate II classification using hypothesis testing and case-study approaches. Performance measures most commonly cited as measures of financial viability were of limited use for specific policy development, but were most useful within…
Testing the Turing Test — do Men Pass It?
NASA Astrophysics Data System (ADS)
Adam, Ruth; Hershberg, Uri; Schul, Yaacov; Solomon, Sorin
We are fascinated by the idea of giving life to the inanimate. The fields of Artificial Life and Artificial Intelligence (AI) attempt to use a scientific approach to pursue this desire. The first steps in this approach hark back to Turing and his suggestion of an imitation game as an alternative answer to the question "can machines think?" [1]. To test his hypothesis, Turing formulated the Turing test [1] to detect human behavior in computers. But how do humans pass such a test? What would you say if you were to learn that they do not pass it well? What would it mean for our understanding of human behavior? What would it mean for our design of tests of the success of artificial life? We report below an experiment in which men consistently failed the Turing test.
A Bayesian framework to estimate diversification rates and their variation through time and space
2011-01-01
Background: Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results: We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions: Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling. PMID:22013891
Diaz, Francisco J.; McDonald, Peter R.; Pinter, Abraham; Chaguturu, Rathnam
2018-01-01
Biomolecular screening research frequently searches for the chemical compounds that are most likely to make a biochemical or cell-based assay system produce a strong continuous response. Several doses are tested with each compound and it is assumed that, if there is a dose-response relationship, the relationship follows a monotonic curve, usually a version of the median-effect equation. However, the null hypothesis of no relationship cannot be statistically tested using this equation. We used a linearized version of this equation to define a measure of pharmacological effect size, and used this measure to rank the investigated compounds in order of their overall capability to produce strong responses. The null hypothesis that none of the examined doses of a particular compound produced a strong response can be tested with this approach. The proposed approach is based on a new statistical model of the important concept of response detection limit, a concept that is usually neglected in the analysis of dose-response data with continuous responses. The methodology is illustrated with data from a study searching for compounds that neutralize the infection of brain glioblastoma cells by a human immunodeficiency virus. PMID:24905187
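The linearized median-effect equation can be sketched as a simple regression: log(fa/(1 - fa)) = m·log(D) - m·log(Dm), so the slope m and the median-effect dose Dm fall out of an ordinary least-squares fit. The snippet below is a hedged illustration with invented doses and responses; the paper's detection-limit model and effect-size measure are not reproduced here.

```python
# Minimal sketch of the linearized median-effect fit, with invented data.
import numpy as np
from scipy import stats

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0])     # tested doses (hypothetical units)
fa = np.array([0.05, 0.15, 0.42, 0.71, 0.90])   # fraction affected at each dose

x = np.log(dose)
y = np.log(fa / (1 - fa))                       # logit of the fraction affected
res = stats.linregress(x, y)                    # slope = m, intercept = -m*log(Dm)

m_hat = res.slope
Dm_hat = np.exp(-res.intercept / m_hat)         # dose producing a 50% effect
print(f"m = {m_hat:.2f}, Dm = {Dm_hat:.2f}, p(slope=0) = {res.pvalue:.4f}")
```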
Multi-modal Social Networks: A MRF Learning Approach
2016-06-20
Performing organization: University of Texas at Austin, 101 East 27th Street, Suite 5.300, Austin, TX 78712-1532. Related publications: Proceedings of ACM Sigmetrics, Austin, TX, June 2014 (17% acceptance); "Topic Modeling from Network Spread," A. Ray, S. Sanghavi and S. Shakkottai, Proceedings of ACM Sigmetrics (poster paper), Austin, TX, June 2014. Conclusions: Our approach based on hypothesis testing on graphs provides a…
The glial growth factors deficiency and synaptic destabilization hypothesis of schizophrenia
Moises, Hans W; Zoega, Tomas; Gottesman, Irving I
2002-01-01
Background A systems approach to understanding the etiology of schizophrenia requires a theory which is able to integrate genetic as well as neurodevelopmental factors. Presentation of the hypothesis Based on a co-localization of loci approach and a large amount of circumstantial evidence, we here propose that a functional deficiency of glial growth factors and of growth factors produced by glial cells are among the distal causes in the genotype-to-phenotype chain leading to the development of schizophrenia. These factors include neuregulin, insulin-like growth factor I, insulin, epidermal growth factor, neurotrophic growth factors, erbB receptors, phosphatidylinositol-3 kinase, growth arrest specific genes, neuritin, tumor necrosis factor alpha, glutamate, NMDA and cholinergic receptors. A genetically and epigenetically determined low baseline of glial growth factor signaling and synaptic strength is expected to increase the vulnerability for additional reductions (e.g., by viruses such as HHV-6 and JC virus infecting glial cells). This should lead to a weakening of the positive feedback loop between the presynaptic neuron and its targets, and below a certain threshold to synaptic destabilization and schizophrenia. Testing the hypothesis Supported by informed conjectures and empirical facts, the hypothesis makes an attractive case for a large number of further investigations. Implications of the hypothesis The hypothesis suggests glial cells as the locus of the genes-environment interactions in schizophrenia, with glial asthenia as an important factor for the genetic liability to the disorder, and an increase of prolactin and/or insulin as possible working mechanisms of traditional and atypical neuroleptic treatments. PMID:12095426
Seeking health information on the web: positive hypothesis testing.
Kayhan, Varol Onur
2013-04-01
The goal of this study is to investigate positive hypothesis testing among consumers of health information when they search the Web. After demonstrating the extent of positive hypothesis testing using Experiment 1, we conduct Experiment 2 to test the effectiveness of two debiasing techniques. A total of 60 undergraduate students searched a tightly controlled online database developed by the authors to test the validity of a hypothesis. The database had four abstracts that confirmed the hypothesis and three abstracts that disconfirmed it. Findings of Experiment 1 showed that the majority of participants (85%) exhibited positive hypothesis testing. In Experiment 2, we found that the recommendation technique was not effective in reducing positive hypothesis testing, since none of the participants assigned to this server could retrieve disconfirming evidence. Experiment 2 also showed that the incorporation technique successfully reduced positive hypothesis testing, since 75% of the participants could retrieve disconfirming evidence. Positive hypothesis testing on the Web is an understudied topic. More studies are needed to validate the effectiveness of the debiasing techniques discussed in this study and develop new techniques. Search engine developers should consider developing new options for users so that both confirming and disconfirming evidence can be presented in search results as users test hypotheses using search engines. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2018-01-01
Recurrence networks and the associated statistical measures have become important tools in the analysis of time series data. In this work, we test how effective the recurrence network measures are in analyzing real-world data involving two main types of noise, white noise and colored noise. We use two prominent network measures as the discriminating statistic for hypothesis testing using surrogate data, for the specific null hypothesis that the data are derived from a linear stochastic process. We show that the characteristic path length is especially efficient as a discriminating measure, with the conclusions reasonably accurate even with a limited number of data points in the time series. We also highlight an additional advantage of the network approach in identifying the dimensionality of the system underlying the time series through a convergence measure derived from the probability distribution of the local clustering coefficients. As examples of real-world data, we use the light curves from a prominent black hole system and show that a combined analysis using three primary network measures can provide vital information regarding the nature of temporal variability of light curves from different spectroscopic classes.
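A surrogate-data test of this kind can be sketched as follows: build a recurrence network from the delay-embedded series, take the characteristic path length of its giant component as the discriminating statistic, and compare it with the values from phase-randomized surrogates that satisfy the linear-stochastic null. The embedding parameters, the 5% recurrence rate, and the toy series below are assumptions, not the study's settings.

```python
import numpy as np
import networkx as nx

def recurrence_network(x, dim=3, tau=1):
    # time-delay embedding, then link points closer than a distance threshold
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    eps = np.quantile(d[d > 0], 0.05)            # fix the recurrence rate near 5%
    A = (d < eps) & ~np.eye(n, dtype=bool)       # adjacency matrix, no self-loops
    return nx.from_numpy_array(A.astype(int))

def char_path_length(g):
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return nx.average_shortest_path_length(giant)

def phase_surrogate(x, rng):
    # randomize Fourier phases; preserves the linear correlation structure
    f = np.fft.rfft(x)
    ph = rng.uniform(0, 2 * np.pi, f.size)
    ph[0] = ph[-1] = 0.0                         # keep DC and Nyquist components real
    return np.fft.irfft(np.abs(f) * np.exp(1j * ph), n=len(x))

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(500))          # toy series (a random walk)
cpl = char_path_length(recurrence_network(x))
surr = [char_path_length(recurrence_network(phase_surrogate(x, rng)))
        for _ in range(19)]                      # 19 surrogates: 5% one-sided rank test
print(f"CPL data={cpl:.2f}, surrogate range=({min(surr):.2f}, {max(surr):.2f})")
print("reject null:", cpl > max(surr))
```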
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juanes, Ruben
The overarching goal of this project was to develop a new continuum theory of multiphase flow in porous media. The theory follows a phase-field modeling approach, and therefore has a sound thermodynamic basis. It is a phenomenological theory in the sense that its formulation is driven by macroscopic phenomena, such as viscous instabilities during multifluid displacement. The research agenda was organized around a set of hypotheses on hitherto unexplained behavior of multiphase flow. All these hypotheses are nontrivial, and testable. Indeed, a central aspect of the project was testing each hypothesis by means of carefully-designed laboratory experiments, thereby probing the validity of the proposed theory. The proposed research places an emphasis on the fundamentals of flow physics, but is motivated by important energy-driven applications in earth sciences, as well as microfluidic technology.
Hoven, Hanno; Siegrist, Johannes
2013-01-01
Social inequalities in health persist in modern societies. The contribution of adverse work and employment conditions towards their explanation is analysed by two approaches, mediation and moderation. Yet the relative significance of each approach remains unclear in respective research. We set out to study this question by conducting a systematic literature review. We included all original papers based on prospective observational studies of employed cohorts that were published between January 1980 and October 2012 meeting our search criteria, by using major databases and by observing established quality criteria. 26 reports were included after quality assessment. 17 studies examined the mediation hypothesis and nine studies tested the moderation hypothesis. Moderate support was found for the mediation hypothesis where OR or HR of health according to socioeconomic position (SEP) were reduced in a majority of analyses after introducing work characteristics in multivariate models. Evidence in favour of the moderation hypothesis was found in some studies, demonstrating stronger effects of adverse work on health among people with low SEP. Despite some support in favour of the two hypotheses future research should aim at reducing the heterogeneity in defining and measuring core variables and at applying advanced statistical analyses. Policy recommendations would benefit from a higher degree of consistency of respective research evidence. PMID:23739492
Mayhew, Terry M; Lucocq, John M
2011-03-01
Various methods for quantifying cellular immunogold labelling on transmission electron microscope thin sections are currently available. All rely on sound random sampling principles and are applicable to single immunolabelling across compartments within a given cell type or between different experimental groups of cells. Although methods are also available to test for colocalization in double/triple immunogold labelling studies, so far, these have relied on making multiple measurements of gold particle densities in defined areas or of inter-particle nearest neighbour distances. Here, we present alternative two-step approaches to codistribution and colocalization assessment that merely require raw counts of gold particles in distinct cellular compartments. For assessing codistribution over aggregate compartments, initial statistical evaluation involves combining contingency table and chi-squared analyses to provide predicted gold particle distributions. The observed and predicted distributions allow testing of the appropriate null hypothesis, namely, that there is no difference in the distribution patterns of proteins labelled by different sizes of gold particle. In short, the null hypothesis is that of colocalization. The approach for assessing colabelling recognises that, on thin sections, a compartment is made up of a set of sectional images (profiles) of cognate structures. The approach involves identifying two groups of compartmental profiles that are unlabelled and labelled for one gold marker size. The proportions in each group that are also labelled for the second gold marker size are then compared. Statistical analysis now uses a 2 × 2 contingency table combined with the Fisher exact probability test. Having identified double labelling, the profiles can be analysed further in order to identify characteristic features that might account for the double labelling. In each case, the approach is illustrated using synthetic and/or experimental datasets and can be refined to correct observed labelling patterns to specific labelling patterns. These simple and efficient approaches should be of more immediate utility to those interested in codistribution and colocalization in multiple immunogold labelling investigations.
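In outline, the two steps map onto standard contingency-table machinery. A hedged sketch with invented gold-particle counts, using SciPy's chi2_contingency for the codistribution step and fisher_exact for the colabelling step:

```python
from scipy import stats

# Step 1 (codistribution): counts of two gold sizes across four compartments.
counts = [[120, 45, 30, 25],    # 10-nm gold particles
          [110, 50, 28, 22]]    # 5-nm gold particles
chi2, p, dof, expected = stats.chi2_contingency(counts)
print(f"codistribution: chi2={chi2:.2f}, p={p:.3f}")  # large p: colocalization not rejected

# Step 2 (colabelling): 2x2 table of profiles labelled for marker A, split by marker B.
table = [[18, 7],    # B-labelled profiles:   A-labelled, A-unlabelled
         [12, 33]]   # B-unlabelled profiles: A-labelled, A-unlabelled
odds, p2 = stats.fisher_exact(table)
print(f"colabelling: odds ratio={odds:.2f}, p={p2:.4f}")
```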
How Children’s Mentalistic Theory Widens their Conception of Pictorial Possibilities
Gilli, Gabriella M.; Ruggi, Simona; Gatti, Monica; Freeman, Norman H.
2016-01-01
An interpretative theory of mind enables young children to grasp that people fulfill varying intentions when making pictures. We tested the hypothesis that in middle childhood a unifunctional conception of artists' intention to produce a picture widens to include artists' intention to display their pictures to others. Children aged between 5 and 10 years viewed a brief video of an artist deliberately hiding her picture, but her intention was thwarted when her picture was discovered and displayed. By 8 years of age children were almost unanimous that a picture-producer without an intention to show her work to others cannot be considered to be an artist. Further exploratory studies centered on aspects of picture-display involving normal public display as well as the contrary intentions of hiding an original picture and of deceitfully displaying a forgery. Interviews suggested that the concept of exhibition widened to take others' minds into account: viewers' critical judgments and the effects of forgeries on viewers' minds. The approach of interpolating probes of typical possibilities between atypical intentions generated evidence that in middle childhood the foundations are laid for a conception of communication between artists' minds and viewers' minds via pictorial display. The combination of hypothesis-testing and exploratory opening-up of the area generates a new testable hypothesis about how an increasingly mentalistic approach enables children to understand diverse possibilities in the pictorial domain. PMID:26955360
Testing goodness of fit in regression: a general approach for specified alternatives.
Solari, Aldo; le Cessie, Saskia; Goeman, Jelle J
2012-12-10
When fitting generalized linear models or the Cox proportional hazards model, it is important to have tools to test for lack of fit. Because lack of fit comes in all shapes and sizes, distinguishing among different types of lack of fit is of practical importance. We argue that an adequate diagnosis of lack of fit requires a specified alternative model. Such specification identifies the type of lack of fit the test is directed against, so that if we reject the null hypothesis, we know the direction of the departure from the model. The goodness-of-fit approach of this paper makes it possible to treat different types of lack of fit within a unified general framework and to consider many existing tests as special cases. Connections with penalized likelihood and random effects are discussed, and the application of the proposed approach is illustrated with medical examples. Tailored functions for goodness-of-fit testing have been implemented in the R package globaltest. Copyright © 2012 John Wiley & Sons, Ltd.
Working memory in children predicts performance on a gambling task.
Audusseau, Jean; Juhel, Jacques
2015-01-01
The authors investigated whether working memory (WM) plays a significant role in the development of decision making in children, operationalized by the Children's Gambling Task (CGT). A total of 105 children aged 6-7, 8-9, and 10-11 years old carried out the CGT. Children aged 6-7 years old were found to have a lower performance than older children, which shows that the CGT is sensitive to participants' age. The hypothesis that WM plays a significant role in decision making was then tested following two approaches: (a) an experimental approach, comparing between groups the performance on the CGT in a control condition (the CGT only was administered) to that in a dual-task condition (participants had to carry out a recall task in addition to the CGT); (b) an interindividual approach, probing the relationship between CGT performance and performance on tasks measuring WM efficiency. The between-groups approach evidenced a better performance in the control group. Moreover, the interindividual approach showed that the higher the participants' WM efficiency was, the higher their performance in the CGT was. Taken together, these two approaches yield converging results that support the hypothesis that WM plays a significant role in decision making in children.
Independent test assessment using the extreme value distribution theory.
Almeida, Marcio; Blondell, Lucy; Peralta, Juan M; Kent, Jack W; Jun, Goo; Teslovich, Tanya M; Fuchsberger, Christian; Wood, Andrew R; Manning, Alisa K; Frayling, Timothy M; Cingolani, Pablo E; Sladek, Robert; Dyer, Thomas D; Abecasis, Goncalo; Duggirala, Ravindranath; Blangero, John
2016-01-01
The new generation of whole genome sequencing platforms offers great possibilities and challenges for dissecting the genetic basis of complex traits. With a very high number of sequence variants, a naïve multiple hypothesis threshold correction hinders the identification of reliable associations by the overreduction of statistical power. In this report, we examine 2 alternative approaches to improve the statistical power of a whole genome association study to detect reliable genetic associations. The approaches were tested using the Genetic Analysis Workshop 19 (GAW19) whole genome sequencing data. The first tested method estimates the real number of effective independent tests actually being performed in a whole genome association project by the use of an extreme value distribution and a set of phenotype simulations. Given the familial nature of the GAW19 data and the finite number of pedigree founders in the sample, the number of correlations between genotypes is greater than in a set of unrelated samples. Using our procedure, we estimate that the effective number represents only 15 % of the total number of independent tests performed. However, even using this corrected significance threshold, no genome-wide significant association could be detected for systolic and diastolic blood pressure traits. The second approach implements a biological-relevance-driven hypothesis test, exploiting prior computational predictions on the effect of nonsynonymous genetic variants detected in a whole genome sequencing association study. This guided testing approach was able to identify 2 promising single-nucleotide polymorphisms (SNPs), 1 for each trait, targeting biologically relevant genes that could help shed light on the genesis of human hypertension. The first gene, PHF14, associated with systolic blood pressure, interacts directly with genes involved in calcium-channel formation, and the second gene, MAP4, encodes a microtubule-associated protein and had already been detected by previous genome-wide association study experiments conducted in an Asian population. Our results highlight the necessity of developing alternative approaches to improve the efficiency of detecting reasonable candidate associations in whole genome sequencing studies.
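The first strategy can be sketched in simplified form: simulate null phenotypes, record the minimum p-value of each scan, and fit the number M in the extreme-value relation min-p ~ Beta(1, M) by maximum likelihood. This is a stripped-down stand-in for the paper's extreme-value-distribution procedure; the LD-block simulation below is invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_sims, n_blocks, block_size = 500, 1000, 10     # 10,000 tests, ~1,000 independent

p_min = np.empty(n_sims)
for s in range(n_sims):
    z = np.repeat(rng.standard_normal(n_blocks), block_size)  # perfect LD within blocks
    p = 2 * stats.norm.sf(np.abs(z))
    p_min[s] = p.min()

# MLE for M in  min-p ~ Beta(1, M):  M_hat = -n / sum(log(1 - p_min))
m_eff = -len(p_min) / np.log1p(-p_min).sum()
print(f"effective number of tests ~ {m_eff:.0f} of {n_blocks * block_size}")
print(f"corrected alpha = {0.05 / m_eff:.2e}")   # threshold uses M_eff, not the raw count
```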
A Systems Approach to the Physiology of Weightlessness
NASA Technical Reports Server (NTRS)
White, Ronald J.; Leonard, Joel I.; Rummel, John A.; Leach, Carolyn S.
1991-01-01
A systems approach to the unraveling of the complex response pattern of the human subjected to weightlessness is presented. The major goal of this research is to obtain an understanding of the role that each of the major components of the human system plays following the transition to and from space. The cornerstone of this approach is the utilization of a variety of mathematical models in order to pose and test alternative hypotheses concerned with the adaptation process. An integrated hypothesis for the human physiological response to weightlessness is developed.
A systems approach to the physiology of weightlessness
NASA Technical Reports Server (NTRS)
White, Ronald J.; Leonard, Joel I.; Rummel, John A.; Leach, Carolyn S.
1991-01-01
A general systems approach to conducting and analyzing research on the human adaptation to weightlessness is presented. The research is aimed at clarifying the role that each of the major components of the human system plays following the transition to and from space. The approach utilizes a variety of mathematical models in order to pose and test alternative hypotheses concerned with the adaptation process. Certain aspects of the problem of fluid and electrolyte shifts in weightlessnes are considered, and an integrated hypothesis based on numerical simulation studies and experimental data is presented.
Career Development, Collective Efficacy, and Individual Task Performance
ERIC Educational Resources Information Center
Kellett, Janet B.; Humphrey, Ronald H.; Sleeth, Randall G.
2009-01-01
Purpose: The purpose of this paper is to test the hypothesis that perceived collective efficacy would mediate the effects of self-efficacy on individual task performance. Design/methodology/approach: An assessment center design with 147 participants in 49 three-person groups was used. Findings: It is found that for individuals working on an…
ERIC Educational Resources Information Center
Coyle, Emily F.; Liben, Lynn S.
2016-01-01
Gender schema theory (GST) posits that children approach opportunities perceived as gender appropriate, avoiding those deemed gender inappropriate, in turn affecting gender-differentiated career trajectories. To test the hypothesis that children's gender salience filters (GSF--tendency to attend to gender) moderate these processes, 62 preschool…
A Cybernetic Approach To Study the Learnability of the LOGO Turtle World.
ERIC Educational Resources Information Center
Ippel, Martin J.; Meulemans, Caroline J. M.
1998-01-01
This study of second- and third-grade students in the Netherlands tests the hypothesis that simplification of the semantic structure will facilitate semantic understanding and acquisition of syntax knowledge within the LOGO Turtle World. Two microworlds were designed applying the theory of automata and abstract languages. (Author/LRW)
Perceived Parental Bonding, Fear of Failure and Stress during Class Presentations
ERIC Educational Resources Information Center
Sideridis, Georgios D.; Kafetsios, Konstantinos
2008-01-01
The purpose of the present studies was to test the hypothesis that students' perceptions of parental bonding may be predictive of how individuals approach achievement situations. It was hypothesized that reports of parental overprotection would be predictive of elevated fears and subsequent stress and low achievement compared to perceived parental…
Predictors of the Perceived Importance of Food Skills of Home Economics Teachers
ERIC Educational Resources Information Center
Fordyce-Voorham, Sandra P.
2016-01-01
Purpose: The purpose of this paper is to test the hypothesis that teachers' personal orientations toward food preparation, nutrition and environmental issues would be related to their perceived importance of food skills. Design/methodology/approach: Little research has been conducted on home economics teachers' views on the importance of the food…
ERIC Educational Resources Information Center
Hull, Frank M.; And Others
1982-01-01
Presents new evidence on the effect of technology on worker alienation, using data from the organizational level as well as the individual level. In the latter approach, the impact of automation on the work of newspaper printers is examined. (SK)
Students' Reasoning about p-Values
ERIC Educational Resources Information Center
Aquilonius, Birgit C.; Brenner, Mary E.
2015-01-01
Results from a study of 16 community college students are presented. The research question concerned how students reasoned about p-values. Students' approach to p-values in hypothesis testing was procedural. Students viewed p-values as something that one compares to alpha values in order to arrive at an answer and did not attach much meaning to…
Oral Language and Reading Success: A Structural Equation Modeling Approach
ERIC Educational Resources Information Center
Beron, Kurt J.; Farkas, George
2004-01-01
Oral language skills and habits may serve as important resources for success or failure in school-related tasks such as learning to read. This article tests this hypothesis utilizing a unique data set, the original Woodcock-Johnson Psycho-Educational Battery-Revised norming sample. This article assesses the importance of oral language by focusing…
High Performance Visualization using Query-Driven Visualizationand Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E. Wes; Campbell, Scott; Dart, Eli
2006-06-15
Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. The new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
Keune, Philipp M; Wiedemann, Eva; Schneidt, Alexander; Schönenberg, Michael
2015-04-01
Attention-deficit/hyperactivity disorder (ADHD) involves motivational dysfunction, characterized by excessive behavioral approach tendencies. Frontal brain asymmetry in the alpha band (8-13 Hz) in resting-state electroencephalogram (EEG) represents a neural correlate of global motivational tendencies, and abnormal asymmetry, indicating elevated approach motivation, was observed in pediatric and adult patients. To date, the relation between ADHD symptoms, depression and alpha asymmetry, its temporal metric properties and putative gender-specificity remain to be explored. Adult ADHD patients (n=52) participated in two resting-state EEG recordings, two weeks apart. Asymmetry measures were aggregated across recordings to increase trait specificity. Putative region-specific associations between asymmetry, ADHD symptoms and depression, its gender-specificity and test-retest reliability were examined. ADHD symptoms were associated with approach-related asymmetry (stronger relative right-frontal alpha power). Approach-related asymmetry was pronounced in females, and also associated with depression. The latter association was mediated by ADHD symptoms. Test-retest reliability was sufficient. The association between reliably assessable alpha asymmetry and ADHD symptoms supports the motivational dysfunction hypothesis. ADHD symptoms mediating an atypical association between asymmetry and depression may be attributed to depression arising secondary to ADHD. Gender-specific findings require replication. Frontal alpha asymmetry may represent a new reliable marker of ADHD symptoms. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
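The asymmetry measure itself is straightforward to compute. A hedged sketch: estimate alpha-band (8-13 Hz) power at left and right frontal channels with Welch's method and take the difference of log powers. The channel roles (F3/F4), sampling rate, and toy signals below are assumptions, not the study's recording setup.

```python
import numpy as np
from scipy.signal import welch

fs = 250.0                                    # sampling rate in Hz (assumed)
rng = np.random.default_rng(3)
t = np.arange(0, 120, 1 / fs)                 # two minutes of simulated resting EEG
f3 = 1.2 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)  # left frontal
f4 = 0.9 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)  # right frontal

def alpha_power(x):
    f, pxx = welch(x, fs=fs, nperseg=int(4 * fs))   # PSD from 4-second windows
    band = (f >= 8) & (f <= 13)
    return pxx[band].sum() * (f[1] - f[0])          # approximate alpha band power

# Higher ln(right) - ln(left) = relatively stronger right-frontal alpha,
# read as approach-related in the study summarized above.
asym = np.log(alpha_power(f4)) - np.log(alpha_power(f3))
print(f"frontal alpha asymmetry (ln F4 - ln F3): {asym:.3f}")
```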
Trait motivation moderates neural activation associated with goal pursuit
Spielberg, Jeffrey M.; Miller, Gregory A.; Warren, Stacie L.; Engels, Anna S.; Crocker, Laura D.; Sutton, Bradley P.; Heller, Wendy
2012-01-01
Research has indicated that regions of left and right dorsolateral prefrontal cortex (DLPFC) are involved in integrating the motivational and executive function processes related to, respectively, approach and avoidance goals. Given that sensitivity to pleasant and unpleasant stimuli is an important feature of conceptualizations of approach and avoidance motivation, it is possible that these regions of DLPFC are preferentially activated by valenced stimuli. The present study tested this hypothesis by using a task in which goal pursuit was threatened by distraction from valenced stimuli while functional magnetic resonance imaging data were collected. The analyses examined whether the impact of trait approach and avoidance motivation on the neural processes associated with executive function differed depending on the valence or arousal level of the distractor stimuli. The present findings support the hypothesis that the regions of DLPFC under investigation are involved in integrating motivational and executive function processes, and they also indicate the involvement of a number of other brain areas in maintaining goal pursuit. However, DLPFC did not display differential sensitivity to valence. PMID:22460723
Two-condition within-participant statistical mediation analysis: A path-analytic framework.
Montoya, Amanda K; Hayes, Andrew F
2017-03-01
Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in both of 2 different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. By so doing, it is apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths: the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
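The path-analytic estimator can be sketched compactly: with difference scores, a is the mean mediator difference, b is the coefficient of the mediator difference in a regression of the outcome difference on the mediator difference and the mean-centered mediator average, and the indirect effect a·b gets a percentile bootstrap interval. The code below is a hedged illustration on simulated data, not the authors' MEMORE macro.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 80
m1 = rng.normal(0, 1, n)
m2 = m1 + 0.8 + rng.normal(0, 1, n)          # the manipulation raises the mediator
y1 = 0.5 * m1 + rng.normal(0, 1, n)
y2 = 0.5 * m2 + 0.2 + rng.normal(0, 1, n)    # Y follows M, plus a direct effect

def indirect(idx):
    md = (m2 - m1)[idx]                      # mediator difference scores
    mavg = ((m1 + m2) / 2)[idx]              # mediator averages (covariate)
    yd = (y2 - y1)[idx]                      # outcome difference scores
    X = np.column_stack([np.ones(len(idx)), md, mavg - mavg.mean()])
    coef, *_ = np.linalg.lstsq(X, yd, rcond=None)
    return md.mean() * coef[1]               # indirect effect = a * b

idx = np.arange(n)
boots = np.array([indirect(rng.choice(idx, n, replace=True)) for _ in range(5000)])
lo, hi = np.percentile(boots, [2.5, 97.5])   # percentile bootstrap interval
print(f"indirect = {indirect(idx):.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```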
Estimating equivalence with quantile regression
Cade, B.S.
2011-01-01
Equivalence testing and corresponding confidence interval estimates are used to provide more enlightened statistical statements about parameter estimates by relating them to intervals of effect sizes deemed to be of scientific or practical importance rather than just to an effect size of zero. Equivalence tests and confidence interval estimates are based on a null hypothesis that a parameter estimate is either outside (inequivalence hypothesis) or inside (equivalence hypothesis) an equivalence region, depending on the question of interest and assignment of risk. The former approach, often referred to as bioequivalence testing, is often used in regulatory settings because it reverses the burden of proof compared to a standard test of significance, following a precautionary principle for environmental protection. Unfortunately, many applications of equivalence testing focus on establishing average equivalence by estimating differences in means of distributions that do not have homogeneous variances. I discuss how to compare equivalence across quantiles of distributions using confidence intervals on quantile regression estimates that detect differences in heterogeneous distributions missed by focusing on means. I used one-tailed confidence intervals based on inequivalence hypotheses in a two-group treatment-control design for estimating bioequivalence of arsenic concentrations in soils at an old ammunition testing site and bioequivalence of vegetation biomass at a reclaimed mining site. Two-tailed confidence intervals based both on inequivalence and equivalence hypotheses were used to examine quantile equivalence for negligible trends over time for a continuous exponential model of amphibian abundance. © 2011 by the Ecological Society of America.
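One way to sketch the idea: fit quantile regressions of the response on a treatment indicator at several quantiles and compare confidence bounds with an equivalence region, TOST-style. The ±0.5-unit equivalence bound and the heteroscedastic toy data below are assumptions, chosen to show how tails can fail equivalence even when medians pass.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
group = np.repeat([0, 1], n)                      # control, treatment indicator
y = np.concatenate([rng.normal(10.0, 1, n),       # control
                    rng.normal(10.1, 2, n)])      # same center, wider spread
X = sm.add_constant(group)

for q in (0.1, 0.5, 0.9):
    fit = sm.QuantReg(y, X).fit(q=q)
    lo, hi = fit.conf_int(alpha=0.10)[1]          # 90% CI = two one-sided 5% tests
    equivalent = (lo > -0.5) and (hi < 0.5)       # TOST-style decision
    print(f"q={q}: diff={fit.params[1]:+.2f}, "
          f"90% CI=({lo:+.2f}, {hi:+.2f}), equivalent={equivalent}")
```

With these toy settings the median difference typically passes equivalence while the 0.1 and 0.9 quantiles do not, which is exactly the kind of heterogeneity the abstract argues is missed by mean-based equivalence tests.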
New methods of testing nonlinear hypothesis using iterative NLLS estimator
NASA Astrophysics Data System (ADS)
Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.
2017-11-01
This research paper discusses the method of testing nonlinear hypotheses using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test nonlinear hypotheses using the iterative NLLS estimator. An alternative method for testing nonlinear hypotheses using an iterative NLLS estimator based on nonlinear studentized residuals has also been proposed. In this research article an innovative method of testing nonlinear hypotheses using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained the methods of testing nonlinear hypotheses. This paper uses asymptotic properties of the nonlinear least squares estimator proposed by Jennrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustrations. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
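A generic Wald test of a nonlinear restriction after NLLS estimation (not the modified statistic derived in the paper) can be sketched as follows, with an invented exponential model and the restriction b0·b1 = 1:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(6)
x = np.linspace(0, 2, 100)
y = 2.0 * np.exp(0.5 * x) + rng.normal(0, 0.2, x.size)   # true b0*b1 = 1, so H0 holds

f = lambda x, b0, b1: b0 * np.exp(b1 * x)
beta, cov = optimize.curve_fit(f, x, y, p0=(1.0, 1.0))   # NLLS estimate and covariance

g = beta[0] * beta[1] - 1.0                   # restriction g(beta) = 0
G = np.array([beta[1], beta[0]])              # gradient of g at the estimate (delta method)
W = g**2 / (G @ cov @ G)                      # Wald statistic, ~ chi2(1) under H0
p = stats.chi2.sf(W, df=1)
print(f"W = {W:.2f}, p = {p:.4f}")            # H0 is true here, so p is typically large
```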
Byun, Tara McAllister; Hitchcock, Elaine R.; Ferron, John
2017-01-01
Purpose Single-case experimental designs are widely used to study interventions for communication disorders. Traditionally, single-case experiments follow a response-guided approach, where design decisions during the study are based on participants' observed patterns of behavior. However, this approach has been criticized for its high rate of Type I error. In masked visual analysis (MVA), response-guided decisions are made by a researcher who is blinded to participants' identities and treatment assignments. MVA also makes it possible to conduct a hypothesis test assessing the significance of treatment effects. Method This tutorial describes the principles of MVA, including both how experiments can be set up and how results can be used for hypothesis testing. We then report a case study showing how MVA was deployed in a multiple-baseline across-subjects study investigating treatment for residual errors affecting rhotics. Strengths and weaknesses of MVA are discussed. Conclusions Given their important role in the evidence base that informs clinical decision making, it is critical for single-case experimental studies to be conducted in a way that allows researchers to draw valid inferences. As a method that can increase the rigor of single-case studies while preserving the benefits of a response-guided approach, MVA warrants expanded attention from researchers in communication disorders. PMID:28595354
NASA Technical Reports Server (NTRS)
Hess, R. A.
1977-01-01
A brief review of some of the more pertinent applications of analytical pilot models to the prediction of aircraft handling qualities is undertaken. The relative ease with which multiloop piloting tasks can be modeled via the optimal control formulation makes the use of optimal pilot models particularly attractive for handling qualities research. To this end, a rating hypothesis is introduced which relates the numerical pilot opinion rating assigned to a particular vehicle and task to the numerical value of the index of performance resulting from an optimal pilot modeling procedure as applied to that vehicle and task. This hypothesis is tested using data from piloted simulations and is shown to be reasonable. An example concerning a helicopter landing approach is introduced to outline the predictive capability of the rating hypothesis in multiaxis piloting tasks.
Petróczi, Andrea; Naughton, Declan P
2007-01-01
Background Supplement use by athletes is complex and research supports the alarming notion of misinformed decisions regarding supplements. Hypothesis A frequent divergence between the type of supplements chosen by athletes and the rationale dictating the supplement use is hypothesized. Thus, a potentially dangerous incongruence may exist between rationale and practice. Testing the hypothesis In the continued absence of reliable data on supplement use, an alternative approach of studying the reasons underlying supplement use in athletes is proposed to determine whether there is an incongruence between rationale and practice. Existing data from large scale national surveys can be used to investigate this incongruence. Implications of the hypothesis In this report, analyses of distinctive patterns between the use and rationale for use of supplements among athletes are recommended to explore this potentially dangerous phenomenon. PMID:17535442
The ranking probability approach and its usage in design and analysis of large-scale studies.
Kuo, Chia-Ling; Zaykin, Dmitri
2013-01-01
In experiments with many statistical tests there is need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal α-level such as 0.05 is adjusted by the number of tests, m, i.e., as 0.05/m. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed α-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability P(k) is controlled, defined as the probability of making at least k correct rejections while rejecting hypotheses with the k smallest P-values. The two approaches are statistically related. The probability that the smallest P-value is a true signal (i.e., P(1)) is equal to the power at the level α/m, to a very good approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when m is very large and k is small. Precision is largely restored when three values with the respective abundances are used instead of a single typical effect size value.
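The ranking probability is easy to approximate by simulation. A hedged sketch with invented settings, using a single typical effect size as in the simplification the authors study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
m, m1, effect, k = 1000, 20, 3.5, 5        # tests, true signals, z-shift, rank cutoff
n_sim, hits = 2000, 0
for _ in range(n_sim):
    z = rng.standard_normal(m)
    z[:m1] += effect                        # the first m1 tests are true signals
    p = 2 * stats.norm.sf(np.abs(z))
    top_k = np.argsort(p)[:k]               # indices of the k smallest p-values
    hits += np.all(top_k < m1)              # are all k of them true signals?
print(f"estimated P({k}) ~ {hits / n_sim:.3f}")
```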
NASA Astrophysics Data System (ADS)
Cajueiro, Daniel O.; Tabak, Benjamin M.
2004-11-01
In this paper, the efficient market hypothesis is tested for China, Hong Kong and Singapore by means of the long memory dependence approach. We find evidence suggesting that Hong Kong is the most efficient market followed by Chinese A type shares and Singapore and finally by Chinese B type shares, which suggests that liquidity and capital restrictions may play a role in explaining results of market efficiency tests.
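Long-memory tests of this kind often start from a Hurst-exponent estimate; H near 0.5 indicates no long memory, consistent with weak-form efficiency. A rough rescaled-range (R/S) sketch on simulated i.i.d. returns follows; the cited study's estimator and data differ.

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    rs = []
    for w in window_sizes:
        vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())      # cumulative deviations from the mean
            r = dev.max() - dev.min()              # range of the cumulative deviations
            s = seg.std(ddof=1)                    # scale
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    # slope of log(R/S) against log(window size) estimates H
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(8)
returns = rng.standard_normal(2048)                # i.i.d. returns: H should be near 0.5
print(f"H = {hurst_rs(returns):.2f}")
```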
Modeling Regional Seismic Waves from Underground Nuclear Explosion
1989-05-15
consider primarily the long-period tangential motions in this pilot study because less computational effort is involved compared to modeling the P-SV system...error testing can be a time-consuming endeavor but the basic approach has proven effective in previous studies (Vidale et al., 1985; Helmberger and Vidale...at various depths in a variety of basin models were generated to test the above hypothesis. When the source is situated in the sediments and when the
How decision reversibility affects motivation.
Bullens, Lottie; van Harreveld, Frenk; Förster, Jens; Higgins, Tory E
2014-04-01
The present research examined how decision reversibility can affect motivation. On the basis of extant findings, it was suggested that 1 way it could affect motivation would be to strengthen different regulatory foci, with reversible decision making, compared to irreversible decision making, strengthening prevention-related motivation relatively more than promotion-related motivation. If so, then decision reversibility should have effects associated with the relative differences between prevention and promotion motivation. In 5 studies, we manipulated the reversibility of a decision and used different indicators of regulatory focus motivation to test these predictions. Specifically, Study 1 tested for differences in participants' preference for approach versus avoidance strategies toward a desired end state. In Study 2, we used speed and accuracy performance as indicators of participants' regulatory motivation, and in Study 3, we measured global versus local reaction time performance. In Study 4, we approached the research question in a different way, making use of the value-from-fit hypothesis (Higgins, 2000, 2002). We tested whether a fit between chronic regulatory focus and focus induced by the reversibility of the decision increased participants' subjective positive feelings about the decision outcome. Finally, in Study 5, we tested whether regulatory motivation, induced by decision reversibility, also influenced participants' preference in specific product features. The results generally support our hypothesis, showing that, compared to irreversible decisions, reversible decisions strengthen a prevention focus more than a promotion focus. Implications for research on decision making are discussed.
Time pressure undermines performance more under avoidance than approach motivation.
Roskes, Marieke; Elliot, Andrew J; Nijstad, Bernard A; De Dreu, Carsten K W
2013-06-01
Four experiments were designed to test the hypothesis that performance is particularly undermined by time pressure when people are avoidance motivated. The results supported this hypothesis across three different types of tasks, including those well suited and those ill suited to the type of information processing evoked by avoidance motivation. We did not find evidence that stress-related emotions were responsible for the observed effect. Avoidance motivation is certainly necessary and valuable in the self-regulation of everyday behavior. However, our results suggest that given its nature and implications, it seems best that avoidance motivation is avoided in situations that involve (time) pressure.
Testing for carryover effects after cessation of treatments: a design approach.
Sturdevant, S Gwynn; Lumley, Thomas
2016-08-02
Recently, trials addressing noisy measurements with diagnosis occurring by exceeding thresholds (such as diabetes and hypertension) have been published which attempt to measure carryover: the impact that treatment has on an outcome after cessation. The design of these trials has been criticised, and simulations have been conducted which suggest that the parallel designs used are not adequate to test this hypothesis; two proposed solutions are that either a different parallel design or a cross-over design could allow for diagnosis of carryover. We undertook a systematic simulation study to determine the ability of a cross-over or a parallel-group trial design to detect carryover effects on incident hypertension in a population with prehypertension. We simulated blood pressure and focused on varying criteria to diagnose systolic hypertension. Using the difference in cumulative incidence of hypertension to analyse parallel-group or cross-over trials resulted in none of the designs having an acceptable Type I error rate: under the null hypothesis of no carryover, rejection rates were well above the nominal 5 % error rate. When a treatment is effective during the intervention period, reliable testing for a carryover effect is difficult. Neither parallel-group nor cross-over designs using the difference in cumulative incidence appear to be a feasible approach. Future trials should ensure their design and analysis is validated by simulation.
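The core simulation argument can be sketched in a few lines: give treatment a real on-treatment effect and zero carryover, diagnose by threshold crossing, and check how often a naive post-cessation comparison rejects. The blood-pressure parameters below are invented, and the sketch compares post-cessation incidence among the not-yet-diagnosed rather than reproducing the paper's exact analyses.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

def one_trial(n=300, visits=6, effect=-8.0):
    """True if the naive post-cessation comparison 'detects' carryover."""
    post_tables = []
    for shift in (0.0, effect):                  # control arm, then treatment arm
        base = rng.normal(135, 8, n)             # stable individual BP levels
        diag = np.zeros(n, bool)
        for _ in range(visits):                  # on-treatment visits
            diag |= base + shift + rng.normal(0, 5, n) > 140
        # post-cessation visit: the treatment effect is fully gone (no carryover)
        post = (base + rng.normal(0, 5, n) > 140) & ~diag
        post_tables.append([post.sum(), (~diag).sum() - post.sum()])
    return stats.chi2_contingency(post_tables)[1] < 0.05

rate = np.mean([one_trial() for _ in range(500)])
print(f"rejection rate with zero true carryover: {rate:.2f}")   # far above 0.05
```

The inflation arises because the effective treatment leaves a different (higher-baseline) pool of undiagnosed participants at cessation, so the two arms are no longer comparable, which is the design flaw the abstract describes.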
ERIC Educational Resources Information Center
Ensminger, David C.; Hoyt, Amy E.; Chandrasekhar, Arcot J.; McNulty, John A.
2013-01-01
We tested the hypothesis that medical students change their study strategies when transitioning from basic science courses to clerkships, and that their study practices are associated with performance scores. Factor scores for three approaches to studying (construction, rote, and review) generated from student (n = 150) responses to a…
Acquiring Spanish at the Interfaces: An Integrative Approach to the L2 Acquisition of Psych-Verbs
ERIC Educational Resources Information Center
Gomez Soler, Inmaculada
2012-01-01
This dissertation provides a comprehensive analysis of the L2 acquisition of Spanish psych-verbs (e.g. "gustar" "to like") across four different proficiency levels. In particular, psych-verbs constitute a testing ground for the predictions of the Interface Hypothesis (Sorace and Filiaci, 2006; Tsimpli, Sorace, Heycok &…
ERIC Educational Resources Information Center
Luke, Steven G.; Henderson, John M.; Ferreira, Fernanda
2015-01-01
The lexical quality hypothesis (Perfetti & Hart, 2002) suggests that skilled reading requires high-quality lexical representations. In children, these representations are still developing, and it has been suggested that this development leads to more adult-like eye-movement behavior during the reading of connected text. To test this idea, a…
ERIC Educational Resources Information Center
Gudino, Omar G.; Nadeem, Erum; Kataoka, Sheryl H.; Lau, Anna S.
2012-01-01
Urban Latino youth are exposed to high rates of violence, which increases risk for diverse forms of psychopathology. The current study aims to increase specificity in predicting responses by testing the hypothesis that youths' reinforcement sensitivity--behavioral inhibition (BIS) and behavioral approach (BAS)--is associated with specific clinical…
Effect of Subject Types on the Production of Auxiliary "Is" in Young English-Speaking Children
ERIC Educational Resources Information Center
Guo, Ling-Yu; Owen, Amanda J.; Tomblin, J. Bruce
2010-01-01
Purpose: In this study, the authors tested the unique checking constraint (UCC) hypothesis and the usage-based approach concerning why young children variably use tense and agreement morphemes in obligatory contexts by examining the effect of subject types on the production of auxiliary "is". Method: Twenty typically developing 3-year-olds were…
The Healthy Men Study (HMS) is a prospective multisite community study on drinking water disinfection byproducts (DBPs) and male reproductive health. We are testing whether exposure to DBPs in drinking water may be associated with altered semen quality, a hypothesis derived from...
A Runge-Kutta discontinuous Galerkin approach to solve reactive flows: The hyperbolic operator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Billet, G., E-mail: billet@onera.f; Ryan, J., E-mail: ryan@onera.f
2011-02-20
A Runge-Kutta discontinuous Galerkin method to solve the hyperbolic part of reactive Navier-Stokes equations written in conservation form is presented. Complex thermodynamics laws are taken into account. Particular care has been taken to solve the stiff gaseous interfaces correctly with no restrictive hypothesis. 1D and 2D test cases are presented.
Devin, Alligators, Jellyfish, and Me.
ERIC Educational Resources Information Center
Tsuchiyama, Elaine
1997-01-01
Describes how a first-grade teacher used the "hypothesis-test" approach with Devin, a first grader who struggled as a reader and writer. Points out that, when she started working with Devin, she wanted to understand his difficulties, but by the end, she realized that it was her curriculum, not his difficulties, that needed to be in the foreground.…
ERIC Educational Resources Information Center
Stamovlasis, Dimitrios
2014-01-01
This paper addresses some methodological issues concerning traditional linear approaches and shows the need for a paradigm shift in education research towards the Complexity and Nonlinear Dynamical Systems (NDS) framework. It presents a quantitative piece of research aiming to test the nonlinear dynamical hypothesis in education. It applies…
Hypothesis testing in hydrology: Theory and practice
NASA Astrophysics Data System (ADS)
Kirchner, James; Pfister, Laurent
2017-04-01
Well-posed hypothesis tests have spurred major advances in hydrological theory. However, a random sample of recent research papers suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias - the tendency to value and trust confirmations more than refutations - among both researchers and reviewers. Hypothesis testing is not the only recipe for scientific progress, however: exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.
Cacho, N Ivalú; Kliebenstein, Daniel J; Strauss, Sharon Y
2015-11-01
We explored macroevolutionary patterns of plant chemical defense in Streptanthus (Brassicaceae), tested for evolutionary escalation of defense, as predicted by Ehrlich and Raven's plant-herbivore coevolutionary arms-race hypothesis, and tested whether species inhabiting low-resource or harsh environments invest more in defense, as predicted by the resource availability hypothesis (RAH). We conducted phylogenetically explicit analyses using glucosinolate profiles, soil nutrient analyses, and microhabitat bareness estimates across 30 species of Streptanthus inhabiting varied environments and soils. We found weak to moderate phylogenetic signal in glucosinolate classes and no signal in total glucosinolate production; a trend toward evolutionary de-escalation in the numbers and diversity of glucosinolates, accompanied by an evolutionary increase in the proportion of aliphatic glucosinolates; some support for the RAH relative to soil macronutrients, but not relative to serpentine soil use; and that the number of glucosinolates increases with microhabitat bareness, which is associated with increased herbivory and drought. Weak phylogenetic signal in chemical defense has been observed in other plant systems. A more holistic approach incorporating other forms of defense might be necessary to confidently reject escalation of defense. That defense increases with microhabitat bareness supports the hypothesis that habitat bareness is an underappreciated selective force on plants in harsh environments. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
Davis, O S P; Kovas, Y; Harlaar, N; Busfield, P; McMillan, A; Frances, J; Petrill, S A; Dale, P S; Plomin, R
2008-06-01
A key translational issue for neuroscience is to understand how genes affect individual differences in brain function. Although it is reasonable to suppose that genetic effects on specific learning abilities, such as reading and mathematics, as well as general cognitive ability (g), will overlap very little, the counterintuitive finding emerging from multivariate genetic studies is that the same genes affect these diverse learning abilities: a Generalist Genes hypothesis. To conclusively test this hypothesis, we exploited the widespread access to inexpensive and fast Internet connections in the UK to assess 2541 pairs of 10-year-old twins for reading, mathematics and g, using a web-based test battery. Heritabilities were 0.38 for reading, 0.49 for mathematics and 0.44 for g. Multivariate genetic analysis showed substantial genetic correlations between learning abilities: 0.57 between reading and mathematics, 0.61 between reading and g, and 0.75 between mathematics and g, providing strong support for the Generalist Genes hypothesis. If genetic effects on cognition are so general, the effects of these genes on the brain are also likely to be general. In this way, generalist genes may prove invaluable in integrating top-down and bottom-up approaches to the systems biology of the brain.
Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC
Templeton, Alan R.
2009-01-01
Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error, creating pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC yields only local probabilities, which in turn make it impossible to distinguish between a well-fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but the convergence of the approximations used in NCPA is well defined, whereas that of ABC is not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of the tested hypothesis is known in NCPA, but not in ABC. As a consequence, the “probabilities” generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. PMID:19192182
A Hypothesis-Driven Approach to Site Investigation
NASA Astrophysics Data System (ADS)
Nowak, W.
2008-12-01
Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task against the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contaminations. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test; otherwise, the resulting answer would convey false confidence. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as the new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data-worth context versus a monetary penalty term in the hypothesis-driven context. The basic principle is discussed and illustrated on the case of a hypothetical contaminant spill and the exceedance of critical contaminant levels at a downstream location. A tempting and important side question is whether site investigation could be tweaked towards a yes or no answer in maliciously biased campaigns by unfair formulation of the optimization objective.
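As a cartoon of this recasting, the sketch below turns "will the well exceed the legal limit?" into a one-sided decision at significance level alpha, using a Monte Carlo ensemble of predicted concentrations. Everything here (the lognormal ensemble, the limit, the level) is an invented illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical predictive ensemble: peak contaminant concentration (mg/L)
# at the drinking-water well, e.g. from a geostatistical transport model.
concentration = rng.lognormal(mean=-1.0, sigma=0.8, size=10_000)

LEGAL_LIMIT = 0.5   # mg/L, hypothetical regulatory threshold
ALPHA = 0.05        # significance level of the one-sided test

# H0: the limit will be exceeded. We answer "no exceedance" only if the
# estimated exceedance probability falls below the significance level, so
# a false "no" is issued with chance at most ~ALPHA (Monte Carlo error aside).
p_exceed = np.mean(concentration > LEGAL_LIMIT)

if p_exceed < ALPHA:
    print(f"P(exceedance) = {p_exceed:.3f} < {ALPHA}: answer 'no' (reject H0)")
else:
    print(f"P(exceedance) = {p_exceed:.3f}: cannot reject H0, answer remains 'yes/unclear'")
```

In the hypothesis-driven design loop, additional samples would be chosen to shrink this error probability as fast as possible.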
Foverskov, Else; Holm, Anders
2016-02-01
Despite social inequality in health being well documented, it is still debated which causal mechanism best explains the negative association between socioeconomic position (SEP) and health. This paper is concerned with testing the explanatory power of three widely proposed causal explanations for social inequality in health in adulthood: the social causation hypothesis (SEP determines health), the health selection hypothesis (health determines SEP) and the indirect selection hypothesis (no causal relationship). We employ dynamic data on respondents aged 30 to 60 from the last nine waves of the British Household Panel Survey. Household income and location on the Cambridge Scale are included as measures of different dimensions of SEP, and health is measured as a latent factor score. The causal hypotheses are tested using a time-based Granger approach by estimating dynamic fixed-effects panel regression models following the method suggested by Anderson and Hsiao. We propose using this method to estimate the associations over time since it allows one to control for all unobserved time-invariant factors and hence lowers the chances of biased estimates due to unobserved heterogeneity. The results showed no support for the social causation hypothesis over a one- to five-year period, and limited support for the health selection hypothesis was seen only for men, in relation to household income. These findings were robust in multiple sensitivity analyses. We conclude that the indirect selection hypothesis may be the most important in explaining social inequality in health in adulthood, indicating that the well-known cross-sectional correlations between health and SEP in adulthood seem not to be driven by a causal relationship, but instead by dynamics and influences in place before the respondents turn 30 years old that affect both their health and SEP onwards. The conclusion is limited in that we do not consider the effects of specific diseases, and causal relationships in adulthood may be present over a timespan longer than five years. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dahlin, Christine R; Wright, Timothy F
2012-01-01
The question of why animals participate in duets is an intriguing one, as many such displays appear to be more costly to produce than individual signals. Mated pairs of yellow-naped amazons, Amazona auropalliata, give duets on their nesting territories. We investigated the function of those duets with a playback experiment. We tested two hypotheses for the function of those duets: the joint territory defense hypothesis and the mate-guarding hypothesis, by presenting territorial pairs with three types of playback treatments: duets, male solos, and female solos. The joint territory defense hypothesis suggests that individuals engage in duets because they appear more threatening than solos and are thus more effective for the establishment, maintenance and/or defense of territories. It predicts that pairs will be coordinated in their response (pair members approach speakers and vocalize together) and will either respond more strongly (more calls and/or more movement) to duet treatments than to solo treatments, or respond equally to all treatments. Alternatively, the mate-guarding hypothesis suggests that individuals participate in duets because they allow them to acoustically guard their mate, and predicts uncoordinated responses by pairs, with weak responses to duet treatments and stronger responses by individuals to solos produced by the same sex. Yellow-naped amazon pairs responded to all treatments in an equivalently aggressive and coordinated manner by rapidly approaching speakers and vocalizing more. These responses generally support the joint territory defense hypothesis and further suggest that all intruders are viewed as a threat by resident pairs.
Hypothesis Testing as an Act of Rationality
NASA Astrophysics Data System (ADS)
Nearing, Grey
2017-04-01
Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic - namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) calculus of logic is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.
Adaptive seamless designs: selection and prospective testing of hypotheses.
Jennison, Christopher; Turnbull, Bruce W
2007-01-01
There is a current trend towards clinical protocols which involve an initial "selection" phase followed by a hypothesis testing phase. The selection phase may involve a choice between competing treatments or different dose levels of a drug, between different target populations, between different endpoints, or between a superiority and a non-inferiority hypothesis. Clearly there can be benefits in elapsed time and economy in organizational effort if both phases can be designed up front as one experiment, with little downtime between phases. Adaptive designs have been proposed as a way to handle these selection/testing problems. They offer flexibility and allow final inferences to depend on data from both phases, while maintaining control of overall false positive rates. We review and critique the methods, give worked examples and discuss the efficiency of adaptive designs relative to more conventional procedures. Where gains are possible using the adaptive approach, a variety of logistical, operational, data handling and other practical difficulties remain to be overcome if adaptive, seamless designs are to be effectively implemented.
NASA Astrophysics Data System (ADS)
Razali, Radzuan; Khan, Habib; Shafie, Afza; Hassan, Abdul Rahman
2016-11-01
The objective of this paper is to examine the short-run and long-run dynamic causal relationship between energy consumption and income per capita, in both bivariate and multivariate frameworks, over the period 1971-2014 in the case of Malaysia [1]. The study applies the ARDL bounds test procedure for long-run cointegration and the Granger causality test to investigate the causal link between the variables. The ARDL bounds test confirms the existence of a long-run cointegration relationship between the variables. The causality tests support a feedback hypothesis (bidirectional causality) between income per capita and energy consumption over the period in the case of Malaysia.
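For readers unfamiliar with the second step, here is a minimal sketch of pairwise Granger causality testing using statsmodels, on simulated stand-ins for the two series (the data-generating process and all numbers are invented for illustration). A feedback hypothesis corresponds to rejecting the no-causality null in both directions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)

# Simulated stand-ins for the paper's series, with two-way (feedback)
# dependence deliberately built into the data-generating process.
n = 200
energy = np.zeros(n)
income = np.zeros(n)
for t in range(1, n):
    income[t] = 0.6 * income[t - 1] + 0.3 * energy[t - 1] + rng.normal(scale=0.5)
    energy[t] = 0.5 * energy[t - 1] + 0.3 * income[t - 1] + rng.normal(scale=0.5)

df = pd.DataFrame({"income": income, "energy": energy})

# statsmodels tests H0: the SECOND column does not Granger-cause the first.
grangercausalitytests(df[["income", "energy"]], maxlag=2)  # energy -> income?
grangercausalitytests(df[["energy", "income"]], maxlag=2)  # income -> energy?
```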
Hua, Xia
2016-07-27
Being invoked as one of the candidate mechanisms for the latitudinal patterns in biodiversity, Janzen's hypothesis states that the limited seasonal temperature variation in the tropics generates greater temperature stratification across elevations, which makes tropical species adapted to narrower ranges of temperatures and have lower effective dispersal across elevations than species in temperate regions. Numerous empirical studies have documented latitudinal patterns in species elevational ranges and thermal niche breadths that are consistent with the hypothesis, but the theoretical underpinnings remain unclear. This study presents the first mathematical model to examine the evolutionary processes that could back up Janzen's hypothesis and assess the effectiveness of limited seasonal temperature variation to promote speciation along elevation in the tropics. Results suggest that trade-offs in thermal tolerances provide a mechanism for Janzen's hypothesis. Limited seasonal temperature variation promotes gradient speciation not due to the reduction in gene flow that is associated with narrow thermal niche, but due to the pleiotropic effects of more stable divergent selection of thermal tolerance on the evolution of reproductive incompatibility. The proposed modelling approach also provides a potential way to test a speciation model against genetic data. © 2016 The Author(s).
Test of the Constancy - Velocity Hypothesis: Navy Unit Functioning and Performance over 12 Years.
1988-01-31
[OCR residue from the report documentation page and table of contents; recoverable subject terms: Enlistment Rate, Change of Velocity, Climate Change, Upgrade Rate; recoverable section titles: "Velocity, Climate Change, and Upgrade Rate", "Joint Effects of Culture/Climate and Velocity", "Conclusions about the Role Played by Velocity". Recoverable abstract fragment: a study which (a) examined change in organizational systems over time, (b) systematically tested different methodological approaches to organizational…]
Kilborn, Joshua P; Jones, David L; Peebles, Ernst B; Naar, David F
2017-04-01
Clustering data continues to be a highly active area of data analysis, and resemblance profiles are being incorporated into ecological methodologies as a hypothesis testing-based approach to clustering multivariate data. However, these new clustering techniques have not been rigorously tested to determine the performance variability based on the algorithm's assumptions or any underlying data structures. Here, we use simulation studies to estimate the statistical error rates for the hypothesis test for multivariate structure based on dissimilarity profiles (DISPROF). We concurrently tested a widely used algorithm that employs the unweighted pair group method with arithmetic mean (UPGMA) to estimate the proficiency of clustering with DISPROF as a decision criterion. We simulated unstructured multivariate data from different probability distributions with increasing numbers of objects and descriptors, and grouped data with increasing overlap, overdispersion for ecological data, and correlation among descriptors within groups. Using simulated data, we measured the resolution and correspondence of clustering solutions achieved by DISPROF with UPGMA against the reference grouping partitions used to simulate the structured test datasets. Our results highlight the dynamic interactions between dataset dimensionality, group overlap, and the properties of the descriptors within a group (i.e., overdispersion or correlation structure) that are relevant to resemblance profiles as a clustering criterion for multivariate data. These methods are particularly useful for multivariate ecological datasets that benefit from distance-based statistical analyses. We propose guidelines for using DISPROF as a clustering decision tool that will help future users avoid potential pitfalls during the application of methods and the interpretation of results.
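DISPROF itself has no standard SciPy implementation, but the clustering half of the pipeline is easy to sketch: UPGMA is simply average linkage on a dissimilarity matrix. The toy below uses simulated abundance data and Bray-Curtis dissimilarities; the two-cluster cut is a placeholder for the DISPROF decision criterion the paper evaluates.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)

# Hypothetical ecological abundance matrix: 30 sites x 12 species, with
# two built-in groups of sites that differ in species means.
group_a = rng.poisson(lam=5.0, size=(15, 12))
group_b = rng.poisson(lam=np.r_[np.full(6, 9.0), np.full(6, 2.0)], size=(15, 12))
X = np.vstack([group_a, group_b]).astype(float)

# Bray-Curtis dissimilarities, then UPGMA (= 'average' linkage).
d = pdist(X, metric="braycurtis")
tree = linkage(d, method="average")

# Cut the tree into two clusters; in the paper this stopping decision is
# made by the DISPROF permutation test rather than a fixed cluster count.
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)
```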
Lee, Kendall H.; Blaha, Charles D.; Garris, Paul A.; Mohseni, Pedram; Horne, April E.; Bennet, Kevin E.; Agnesi, Filippo; Bledsoe, Jonathan M.; Lester, Deranda B.; Kimble, Chris; Min, Hoon-Ki; Kim, Young-Bo; Cho, Zang-Hee
2010-01-01
Deep Brain Stimulation (DBS) provides therapeutic benefit for several neuropathologies including Parkinson’s disease (PD), epilepsy, chronic pain, and depression. Despite well-established clinical efficacy, the mechanism(s) of DBS remains poorly understood. In this review we begin by summarizing the current understanding of the DBS mechanism. Using this knowledge as a framework, we then explore a specific hypothesis regarding DBS of the subthalamic nucleus (STN) for the treatment of PD. This hypothesis states that therapeutic benefit is provided, at least in part, by activation of surviving nigrostriatal dopaminergic neurons, subsequent striatal dopamine release, and resumption of striatal target cell control by dopamine. While highly controversial, we present preliminary data that are consistent with specific predictions of this hypothesis. We additionally propose that developing new technologies, e.g., human electrometer and closed-loop smart devices, for monitoring dopaminergic neurotransmission during STN DBS will further advance this treatment approach. PMID:20657744
Noreika, Valdas
2011-06-01
A number of differences between the dreams of schizophrenia patients and those of healthy participants have been linked to changes in waking life that schizophrenia may cause. In this way, the "continuity hypothesis" has become a standard way to relate dreaming and waking experiences in schizophrenia. Nevertheless, some of the findings in the dream literature are not compatible with the continuity hypothesis and suggest other ways in which dream content and waking experiences could interact. Conceptually, the continuity hypothesis could be sharpened into the "waking-to-dreaming" and the "dreaming-to-waking" hypotheses, whereas a less explored type of "discontinuity" could embrace the "compensated waking" and the "compensated dreaming" hypotheses. Careful consideration and empirical testing of each of these hypotheses may reveal the multiplicity of ways in which dreaming and waking life interact in schizophrenia. Copyright © 2010 Elsevier Inc. All rights reserved.
Stagewise cognitive development: an application of catastrophe theory.
van der Maas, H L; Molenaar, P C
1992-07-01
In this article an overview is given of traditional methodological approaches to stagewise cognitive developmental research. These approaches are evaluated and integrated on the basis of catastrophe theory. In particular, catastrophe theory specifies a set of common criteria for testing the discontinuity hypothesis proposed by Piaget. Separate criteria correspond to distinct methods used in cognitive developmental research. Such criteria are, for instance, the detection of spurts in development, bimodality of test scores, and increased variability of responses during transitional periods. When a genuine stage transition is present, these criteria are expected to be satisfied. A revised catastrophe model accommodating these criteria is proposed for the stage transition in cognitive development from the preoperational to the concrete operational stage.
One-way ANOVA based on interval information
NASA Astrophysics Data System (ADS)
Hesamian, Gholamreza
2016-08-01
This paper deals with extending one-way analysis of variance (ANOVA) to the case where the observed data are represented by closed intervals rather than real numbers. In this approach, a notion of interval random variable is first introduced. In particular, a normal distribution with interval parameters is introduced to investigate hypotheses about the equality of interval means or to test the assumption of homogeneity of interval variances. Moreover, the least significant difference (LSD) method for multiple comparisons of interval means is developed for use when the null hypothesis about the equality of means is rejected. Then, at a given interval significance level, an index is applied to compare the interval test statistic with the related interval critical value as a criterion to accept or reject the null interval hypothesis of interest. Finally, the decision-making method yields degrees to which the interval hypotheses are accepted or rejected. An applied example illustrates the performance of this method.
A web-portal for interactive data exploration, visualization, and hypothesis testing
Bartsch, Hauke; Thompson, Wesley K.; Jernigan, Terry L.; Dale, Anders M.
2014-01-01
Clinical research studies generate data that need to be shared and statistically analyzed by their participating institutions. The distributed nature of research and the different domains involved present major challenges to data sharing, exploration, and visualization. The Data Portal infrastructure was developed to support ongoing research in the areas of neurocognition, imaging, and genetics. Researchers benefit from the integration of data sources across domains, the explicit representation of knowledge from domain experts, and user interfaces providing convenient access to project specific data resources and algorithms. The system provides an interactive approach to statistical analysis, data mining, and hypothesis testing over the lifetime of a study and fulfills a mandate of public sharing by integrating data sharing into a system built for active data exploration. The web-based platform removes barriers for research and supports the ongoing exploration of data. PMID:24723882
P value and the theory of hypothesis testing: an explanation for new researchers.
Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël
2010-03-01
In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
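To make the distinction concrete, the sketch below (simulated data; all numbers illustrative) computes a Fisher-style p value for a two-sample t test and then applies the Neyman-Pearson recipe of a preset alpha and critical region to the same statistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
control = rng.normal(loc=0.0, scale=1.0, size=40)
treated = rng.normal(loc=0.5, scale=1.0, size=40)

# Fisher's view: the p value measures evidence against H0 (no effect).
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Neyman-Pearson view: fix alpha in advance, define a critical region for
# the test statistic, and reject H0 whenever the statistic falls inside it.
alpha = 0.05
df = len(control) + len(treated) - 2
t_crit = stats.t.ppf(1 - alpha / 2, df)   # two-sided critical value
print("reject H0" if abs(t_stat) > t_crit else "accept H0")
```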
ERIC Educational Resources Information Center
Oshima, Jun; Oshima, Ritsuko; Murayama, Isao; Inagaki, Shigenori; Takenaka, Makiko; Nakayama, Hayashi; Yamaguchi, Etsuji
2004-01-01
This paper reports design experiments on two Japanese elementary science lesson units in a sixth-grade classroom supported by computer support for collaborative learning (CSCL) technology as a collaborative reflection tool. We took different approaches in the experiments depending on their instructional goals. In the unit 'air and how things…
ERIC Educational Resources Information Center
Mclaren, Patrick J.; Hyde, Melissa K.; White, Katherine M.
2012-01-01
Increasing the number of bone marrow (BM) donors is important to ensure sufficient diversity on BM registries to meet the needs of patients. This study used an experimental approach to test the hypothesis that providing information about the risks of BM donation to allay unsubstantiated fears would reduce male and female participants' perceptions…
Testing an Asset-Building Approach for Young People: Early Access to Savings Predicts Later Savings
ERIC Educational Resources Information Center
Friedline, Terri; Elliott, William; Chowa, Gina A. N.
2013-01-01
A major hypothesis of asset-building is that early access to savings accounts leads to continued and improved educational and economic outcomes over time. This study asks whether or not young adults (ages 18-22) in 2007, particularly among lower income households, are significantly more likely to own savings accounts and to accumulate more savings…
ERIC Educational Resources Information Center
Parr, Brian A.; Edwards, M. Craig; Leising, James G.
2008-01-01
The purpose of this study was to empirically test the hypothesis that students who participated in a contextualized, mathematics-enhanced high school agricultural power and technology curriculum and aligned instructional approach would not experience significant diminishment in acquisition of technical skills related to agricultural power and…
Taking a systems approach to ecological systems
Grace, James B.
2015-01-01
Increasingly, there is interest in a systems-level understanding of ecological problems, which requires the evaluation of more complex, causal hypotheses. In this issue of the Journal of Vegetation Science, Soliveres et al. use structural equation modeling to test a causal network hypothesis about how tree canopies affect understorey communities. Historical analysis suggests structural equation modeling has been under-utilized in ecology.
ERIC Educational Resources Information Center
Maddox, W. Todd; Ing, A. David
2005-01-01
W. T. Maddox, F. G. Ashby, and C. J. Bohil (2003) found that delayed feedback adversely affects information-integration but not rule-based category learning in support of a multiple-systems approach to category learning. However, differences in the number of stimulus dimensions relevant to solving the task and perceptual similarity failed to rule…
A Bayesian bird's eye view of ‘Replications of important results in social psychology’
Schönbrodt, Felix D.; Yao, Yuling; Gelman, Andrew; Wagenmakers, Eric-Jan
2017-01-01
We applied three Bayesian methods to reanalyse the preregistered contributions to the Social Psychology special issue ‘Replications of Important Results in Social Psychology’ (Nosek & Lakens, 2014 Registered reports: a method to increase the credibility of published results. Soc. Psychol. 45, 137–141. (doi:10.1027/1864-9335/a000192)). First, individual-experiment Bayesian parameter estimation revealed that for directed effect size measures, only three out of 44 central 95% credible intervals did not overlap with zero and fell in the expected direction. For undirected effect size measures, only four out of 59 credible intervals contained values greater than 0.10 (10% of variance explained) and only 19 intervals contained values larger than 0.05. Second, a Bayesian random-effects meta-analysis for all 38 t-tests showed that only one out of the 38 hierarchically estimated credible intervals did not overlap with zero and fell in the expected direction. Third, a Bayes factor hypothesis test was used to quantify the evidence for the null hypothesis against a default one-sided alternative. Only seven out of 60 Bayes factors indicated non-anecdotal support in favour of the alternative hypothesis (BF10 > 3), whereas 51 Bayes factors indicated at least some support for the null hypothesis. We hope that future analyses of replication success will embrace a more inclusive statistical approach by adopting a wider range of complementary techniques. PMID:28280547
Debates—Hypothesis testing in hydrology: Theory and practice
NASA Astrophysics Data System (ADS)
Pfister, Laurent; Kirchner, James W.
2017-03-01
The basic structure of the scientific method—at least in its idealized form—is widely championed as a recipe for scientific progress, but the day-to-day practice may be different. Here, we explore the spectrum of current practice in hypothesis formulation and testing in hydrology, based on a random sample of recent research papers. This analysis suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias—the tendency to value and trust confirmations more than refutations—among both researchers and reviewers. Nonetheless, as several examples illustrate, hypothesis tests have played an essential role in spurring major advances in hydrological theory. Hypothesis testing is not the only recipe for scientific progress, however. Exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.
Bayesian inference for psychology. Part II: Example applications with JASP.
Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D
2018-02-01
Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP (http://www.jasp-stats.org), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
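JASP's default analyses are not reproduced here, but the flavor of a Bayes factor is easy to sketch with the BIC approximation BF10 ≈ exp((BIC0 - BIC1)/2), which implicitly assumes a unit-information prior. The one-sample example below is simulated, and its numbers will not match JASP's default-prior Bayes factors exactly.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.normal(loc=0.3, scale=1.0, size=50)
n = len(x)

# Model 0 (H0): mean = 0, sigma free.  Model 1 (H1): mean and sigma free.
ll0 = stats.norm.logpdf(x, loc=0.0, scale=np.sqrt(np.mean(x**2))).sum()
ll1 = stats.norm.logpdf(x, loc=x.mean(), scale=x.std()).sum()

bic0 = -2 * ll0 + 1 * np.log(n)   # one free parameter (sigma)
bic1 = -2 * ll1 + 2 * np.log(n)   # two free parameters (mean, sigma)

# BIC approximation to the Bayes factor in favor of H1.
bf10 = np.exp((bic0 - bic1) / 2)
print(f"BF10 (BIC approximation) = {bf10:.2f}")
```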
Teaching Hypothesis Testing by Debunking a Demonstration of Telepathy.
ERIC Educational Resources Information Center
Bates, John A.
1991-01-01
Discusses a lesson designed to demonstrate hypothesis testing to introductory college psychology students. Explains that a psychology instructor demonstrated apparent psychic abilities to students. Reports that students attempted to explain the instructor's demonstrations through hypothesis testing and revision. Provides instructions on performing…
Unification of field theory and maximum entropy methods for learning probability densities
NASA Astrophysics Data System (ADS)
Kinney, Justin B.
2015-09-01
The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
Ang, Ching-Seng; Nice, Edouard C
2010-09-03
Colorectal cancer (CRC) is the second most common cause of cancer-related deaths in both men and women. The fecal occult blood test is currently the first-line method for CRC screening but has an unacceptably low sensitivity and specificity. Improved screening tests are therefore urgently required for early-stage CRC screening. We have described a hypothesis-driven approach for a rapid biomarker discovery process whereby selected proteins previously implicated as colorectal cancer-associated proteins (CCAPs), which can potentially be shed into the feces from a colorectal tumor, are targeted for excision from 1D SDS-PAGE based on their predicted molecular weight, followed by directed identification and relative quantification using multiple reaction monitoring (MRM). This approach can significantly reduce the time for clinical assay development, with the added advantage that many proteins will have been validated by previous in vitro and/or in vivo studies. Sixty potential CCAPs were selected from the literature and appropriate MRM conditions were established for measurement of proteotypic peptides. Nineteen of these proteins were detected in the feces of a patient with colorectal cancer. Relative quantitation of these 19 CCAPs across 5 CRC patients and 5 healthy volunteers was carried out, revealing hemoglobin, myeloperoxidase, S100A9, filamin A and L-plastin to be present only in the feces of CRC patients.
Carter, Cindy L; Onicescu, Georgiana; Cartmell, Kathleen B; Sterba, Katherine R; Tomsic, James; Alberg, Anthony J
2012-08-01
Physical activity benefits cancer survivors, but the comparative effectiveness of a team-based delivery approach remains unexplored. The hypothesis tested was that a team-based physical activity intervention delivery approach has added physical and psychological benefits compared to a group-based approach. A team-based sport accessible to survivors is dragon boating, which requires no previous experience and allows for diverse skill levels. In a non-randomized trial, cancer survivors chose between two similarly structured 8-week programs, a dragon boat paddling team (n = 68) or group-based walking program (n = 52). Three separate intervention rounds were carried out in 2007-2008. Pre-post testing measured physical and psychosocial outcomes. Compared to walkers, paddlers had significantly greater (all p < 0.01) team cohesion, program adherence/attendance, and increased upper-body strength. For quality-of-life outcomes, both interventions were associated with pre-post improvements, but with no clear-cut pattern of between-intervention differences. These hypothesis-generating findings suggest that a short-term, team-based physical activity program (dragon boat paddling) was associated with increased cohesion and adherence/attendance. Improvements in physical fitness and psychosocial benefits were comparable to a traditional, group-based walking program. Compared to a group-based intervention delivery format, the team-based intervention delivery format holds promise for promoting physical activity program adherence/attendance in cancer survivors.
Visualization-based analysis of multiple response survey data
NASA Astrophysics Data System (ADS)
Timofeeva, Anastasiia
2017-11-01
During a survey, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because multiple response variables must be processed. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, similarity and association matrices are suggested. Aggregate indicators of dissimilarity (similarity) are proposed, based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources; this information is very important for the allocation of the advertising budget. The differences between target groups of advertising sources are also of interest, and to identify such differences the hypotheses of homogeneity and independence are tested. Recent approaches to this problem are briefly reviewed and compared, and an alternative procedure is suggested. It is based on partitioning a consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between the population proportions. It turned out to be more suitable for the real problem being solved.
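A minimal sketch of the matrix view (synthetic tick data; all proportions invented): the cross-product of the 0/1 response matrix gives the pairwise co-selection counts that pie and bar charts discard, and a chi-square test probes independence of any two options.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(19)

# Hypothetical multiple-response data: 500 respondents x 3 advertising
# sources, 1 = source ticked. Sources A and B are made to co-occur.
n = 500
a = rng.binomial(1, 0.4, size=n)
b = np.where(a == 1, rng.binomial(1, 0.6, size=n), rng.binomial(1, 0.2, size=n))
c = rng.binomial(1, 0.3, size=n)
X = np.column_stack([a, b, c])

# Association matrix: co-selection counts per pair of options
# (diagonal entries are the per-option totals).
print(X.T @ X)

# Test independence of ticking A and ticking B via a 2x2 contingency table.
table = np.array([[np.sum((a == 1) & (b == 1)), np.sum((a == 1) & (b == 0))],
                  [np.sum((a == 0) & (b == 1)), np.sum((a == 0) & (b == 0))]])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.4g}")
```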
Lash, Ayhan Aytekin; Plonczynski, Donna J; Sehdev, Amikar
2011-01-01
To compare the inclusion and the influences of selected variables on hypothesis testing during the 1980s and 1990s. In spite of the emphasis on conducting inquiry consistent with the tenets of logical positivism, there have been no studies investigating the frequency and patterns of hypothesis testing in nursing research. The sample was obtained from Nursing Research, the research journal with the highest circulation during the period under study. All quantitative studies published during the two decades, including briefs and historical studies, were included in the analyses. A retrospective design was used to select the sample: five years each from the 1980s and the 1990s were randomly selected from the journal. Of the 582 studies, 517 met inclusion criteria. Findings suggest that there was a decline in the use of hypothesis testing in the last decades of the 20th century. Further research is needed to identify the factors that influence the conduct of research with hypothesis testing. Hypothesis testing in nursing research showed a steady decline from the 1980s to the 1990s. Research purposes of explanation and prediction/control increased the likelihood of hypothesis testing. Hypothesis testing strengthens the quality of quantitative studies, increases the generality of findings, and provides dependable knowledge. This is particularly true for quantitative studies that aim to explore, explain, and predict/control phenomena and/or test theories. The findings also have implications for doctoral programmes, the research preparation of nurse-investigators, and theory testing.
Intra-fraction motion of the prostate is a random walk
NASA Astrophysics Data System (ADS)
Ballhausen, H.; Li, M.; Hegemann, N.-S.; Ganswindt, U.; Belka, C.
2015-01-01
A random walk model for intra-fraction motion has been proposed, where at each step the prostate moves a small amount from its current position in a random direction. Online tracking data from perineal ultrasound is used to validate or reject this model against alternatives. Intra-fraction motion of a prostate was recorded by 4D ultrasound (Elekta Clarity system) during 84 fractions of external beam radiotherapy of six patients. In total, the center of the prostate was tracked for 8 h in intervals of 4 s. Maximum likelihood model parameters were fitted to the data. The null hypothesis of a random walk was tested with the Dickey-Fuller test. The null hypothesis of stationarity was tested by the Kwiatkowski-Phillips-Schmidt-Shin test. The increase of variance in prostate position over time and the variability in motility between fractions were analyzed. Intra-fraction motion of the prostate was best described as a stochastic process with an auto-correlation coefficient of ρ = 0.92 ± 0.13. The random walk hypothesis (ρ = 1) could not be rejected (p = 0.27). The static noise hypothesis (ρ = 0) was rejected (p < 0.001). The Dickey-Fuller test rejected the null hypothesis ρ = 1 in 25% to 32% of cases. On average, the Kwiatkowski-Phillips-Schmidt-Shin test rejected the null hypothesis ρ = 0 with a probability of 93% to 96%. The variance in prostate position increased linearly over time (r2 = 0.9 ± 0.1). Variance kept increasing and did not settle at a maximum as would be expected from a stationary process. There was substantial variability in motility between fractions and patients with maximum aberrations from isocenter ranging from 0.5 mm to over 10 mm in one patient alone. In conclusion, evidence strongly suggests that intra-fraction motion of the prostate is a random walk and neither static (like inter-fraction setup errors) nor stationary (like a cyclic motion such as breathing, for example). The prostate tends to drift away from the isocenter during a fraction, and this variance increases with time, such that shorter fractions are beneficial to the problem of intra-fraction motion. As a consequence, fixed safety margins (which would over-compensate at the beginning and under-compensate at the end of a fraction) cannot optimally account for intra-fraction motion. Instead, online tracking and position correction on-the-fly should be considered as the preferred approach to counter intra-fraction motion.
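The two tests named above are available in statsmodels; here is a small sketch on a simulated random-walk track (the step size and sampling interval are invented stand-ins for the ultrasound data). Under a true random walk, the Dickey-Fuller test should usually fail to reject its unit-root null while KPSS rejects stationarity, mirroring the paper's pattern.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, kpss

rng = np.random.default_rng(5)

# Hypothetical 1D track of prostate displacement sampled every 4 s:
# a random walk with ~0.04 mm steps (illustrative numbers only).
steps = rng.normal(scale=0.04, size=1800)
position = np.cumsum(steps)

# Augmented Dickey-Fuller: H0 is a unit root (random walk). Failing to
# reject is consistent with the random-walk conclusion.
adf_stat, adf_p = adfuller(position)[:2]
print(f"ADF p = {adf_p:.3f} (large -> cannot reject random walk)")

# KPSS: H0 is stationarity; rejection argues against static/cyclic models.
kpss_stat, kpss_p = kpss(position, regression="c", nlags="auto")[:2]
print(f"KPSS p = {kpss_p:.3f} (small -> reject stationarity)")
```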
2011-01-01
Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high-throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform appropriate statistical tests on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses waiting for high-throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases with semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQL queries to reflect the semantic structures of the hypotheses, and (3) performing statistical tests with the result sets returned by the SPARQL queries. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high-throughput biological hypothesis testing. We believe that preliminary investigation before performing highly controlled experiments can benefit from this approach. PMID:21342584
Testing for nonlinearity in time series: The method of surrogate data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theiler, J.; Galdrikian, B.; Longtin, A.
1991-01-01
We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.
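The most common surrogate generator for the "linearly correlated Gaussian noise" null is phase randomization: keep the power spectrum, scramble the Fourier phases. A minimal sketch follows, with a toy signal and a toy discriminating statistic; 199 surrogates give a rank-based p value on a 1/200 grid.

```python
import numpy as np

rng = np.random.default_rng(11)

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum (hence the same linear
    autocorrelation) as x, but with randomized Fourier phases."""
    n = len(x)
    fx = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, size=len(fx))
    spectrum = np.abs(fx) * np.exp(1j * phases)
    spectrum[0] = fx[0]                    # keep the mean untouched
    if n % 2 == 0:
        spectrum[-1] = np.abs(fx[-1])      # keep the Nyquist bin real
    return np.fft.irfft(spectrum, n=n)

# Toy data and a simple discriminating statistic (time-reversal asymmetry,
# which is zero in expectation for linear Gaussian processes).
x = np.sin(np.linspace(0, 40, 1024)) + 0.5 * rng.normal(size=1024)
stat = lambda s: np.mean((s[1:] - s[:-1]) ** 3)

surrogates = [phase_randomized_surrogate(x, rng) for _ in range(199)]
null_stats = np.array([stat(s) for s in surrogates])

# Rank-based, two-sided p value of the original statistic under H0.
p = (1 + np.sum(np.abs(null_stats) >= abs(stat(x)))) / (1 + len(null_stats))
print(f"p = {p:.3f}")
```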
cit: hypothesis testing software for mediation analysis in genomic applications.
Millstein, Joshua; Chen, Gary K; Breton, Carrie V
2016-08-01
The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). Contact: joshua.millstein@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Inferring microhabitat preferences of Lilium catesbaei (Liliaceae).
Sommers, Kristen Penney; Elswick, Michael; Herrick, Gabriel I; Fox, Gordon A
2011-05-01
Microhabitat studies use varied statistical methods, some treating site occupancy as a dependent and others as an independent variable. Using the rare Lilium catesbaei as an example, we show why approaches that test hypotheses of differences between occupied and unoccupied sites can lead to erroneous conclusions about habitat preferences. Predictive approaches like logistic regression can better lead to understanding of habitat requirements. Using 32 lily locations and 30 random locations >2 m from a lily (complete data: 31 lily and 28 random spots), we measured physical conditions (photosynthetically active radiation (PAR), canopy cover, litter depth, distance to and height of the nearest shrub, and soil moisture) and the number and identity of neighboring plants. Twelve lilies were used to estimate a photosynthetic assimilation curve. Analyses used logistic regression, discriminant function analysis (DFA), (multivariate) analysis of variance, and resampled Wilcoxon tests. Logistic regression and DFA found identical predictors of presence (PAR, canopy cover, distance to shrub, litter), but the hypothesis tests pointed to a different set (PAR, litter, canopy cover, height of the nearest shrub). Lilies occur mainly in high-PAR spots, often close to light saturation. By contrast, PAR in random spots was often near the lily light compensation point. Lilies were found near Serenoa repens less often than expected at random; otherwise, neighbor identity had no significant effect. Predictive methods are more useful in this context than the hypothesis tests. Light availability plays a large role in lily presence, which may help to explain increases in flowering and emergence after fire and roller-chopping.
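In the predictive framing, presence/absence is the dependent variable. A sketch with simulated covariates (coefficients, units, and sample sizes invented, not the study's data) using statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(13)

# Hypothetical survey mimicking the design: occupied (1) vs random (0)
# spots with a few of the measured covariates (units illustrative).
n = 60
par = rng.uniform(50, 1500, size=n)        # PAR, umol m-2 s-1
litter = rng.uniform(0, 8, size=n)         # litter depth, cm
canopy = rng.uniform(0, 100, size=n)       # canopy cover, %
lin = 0.004 * par - 0.5 * litter - 0.02 * canopy - 1.0
present = rng.binomial(1, 1 / (1 + np.exp(-lin)))

df = pd.DataFrame(dict(present=present, par=par, litter=litter, canopy=canopy))

# Site occupancy as the *dependent* variable: the model predicts presence
# from conditions rather than testing occupied-vs-random differences.
model = smf.logit("present ~ par + litter + canopy", data=df).fit()
print(model.summary())
print(model.predict(df.head()))  # predicted probability of occupancy
```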
Knowledge dimensions in hypothesis test problems
NASA Astrophysics Data System (ADS)
Krishnan, Saras; Idris, Noraini
2012-05-01
The reformation in statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on formulas and calculation procedures. Meanwhile, conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework to describe learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from more connected understanding. This study identifies the factual, procedural and conceptual knowledge dimensions in hypothesis test problems. Hypothesis testing, being an important tool for making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty in understanding the underlying concepts of hypothesis tests. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale for executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts such as sampling distributions and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems in this study, suitable instructional and assessment strategies can be developed in future to enhance students' learning of hypothesis testing as a valuable inferential tool.
Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dale; Selby, Neil
2012-08-14
Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error agrees better with the physical basis: it correctly partitions the error to include model error as a component of variance, and it correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with the better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
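Fisher's and Tippett's combination rules are both one-liners in SciPy; the per-phenomenology p values below are invented placeholders for, say, an Ms:mb screen and an event-depth screen.

```python
from scipy.stats import combine_pvalues

# Hypothetical per-phenomenology screening p values for one event.
p_values = [0.21, 0.04]

for method in ("fisher", "tippett"):
    stat, p_joint = combine_pvalues(p_values, method=method)
    print(f"{method}: combined p = {p_joint:.3f}")
```

Fisher's method sums -2 log p across tests (chi-squared under the joint null), while Tippett's uses the minimum p value; the joint screen rejects when the combined p value falls below the chosen significance level.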
Data-driven approaches in the investigation of social perception
Adolphs, Ralph; Nummenmaa, Lauri; Todorov, Alexander; Haxby, James V.
2016-01-01
The complexity of social perception poses a challenge to traditional approaches to understand its psychological and neurobiological underpinnings. Data-driven methods are particularly well suited to tackling the often high-dimensional nature of stimulus spaces and of neural representations that characterize social perception. Such methods are more exploratory, capitalize on rich and large datasets, and attempt to discover patterns often without strict hypothesis testing. We present four case studies here: behavioural studies on face judgements, two neuroimaging studies of movies, and eyetracking studies in autism. We conclude with suggestions for particular topics that seem ripe for data-driven approaches, as well as caveats and limitations. PMID:27069045
After p Values: The New Statistics for Undergraduate Neuroscience Education.
Calin-Jageman, Robert J
2017-01-01
Statistical inference is a methodological cornerstone for neuroscience education. For many years this has meant inculcating neuroscience majors into null hypothesis significance testing with p values. There is increasing concern, however, about the pervasive misuse of p values. It is time to start planning statistics curricula for neuroscience majors that replaces or de-emphasizes p values. One promising alternative approach is what Cumming has dubbed the "New Statistics", an approach that emphasizes effect sizes, confidence intervals, meta-analysis, and open science. I give an example of the New Statistics in action and describe some of the key benefits of adopting this approach in neuroscience education.
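As a taste of the "New Statistics" in a teaching setting, the sketch below (simulated scores, all numbers invented) reports a standardized effect size and a 95% confidence interval for the raw mean difference instead of a bare p value.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
a = rng.normal(10.0, 2.0, size=30)   # e.g. control group scores
b = rng.normal(11.2, 2.0, size=30)   # e.g. treatment group scores

# Effect size: Cohen's d from the pooled standard deviation.
na, nb = len(a), len(b)
sp = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2))
d = (b.mean() - a.mean()) / sp

# 95% confidence interval for the raw mean difference.
se = sp * np.sqrt(1 / na + 1 / nb)
t_crit = stats.t.ppf(0.975, na + nb - 2)
diff = b.mean() - a.mean()
print(f"d = {d:.2f}, difference = {diff:.2f} "
      f"[{diff - t_crit * se:.2f}, {diff + t_crit * se:.2f}]")
```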
Understanding the Role of P Values and Hypothesis Tests in Clinical Research.
Mark, Daniel B; Lee, Kerry L; Harrell, Frank E
2016-12-01
P values and hypothesis testing methods are frequently misused in clinical research. Much of this misuse appears to be owing to the widespread, mistaken belief that they provide simple, reliable, and objective triage tools for separating the true and important from the untrue or unimportant. The primary focus in interpreting therapeutic clinical research data should be on the treatment ("oomph") effect, a metaphorical force that moves patients given an effective treatment to a different clinical state relative to their control counterparts. This effect is assessed using 2 complementary types of statistical measures calculated from the data, namely, effect magnitude or size and precision of the effect size. In a randomized trial, effect size is often summarized using constructs, such as odds ratios, hazard ratios, relative risks, or adverse event rate differences. How large a treatment effect has to be to be consequential is a matter for clinical judgment. The precision of the effect size (conceptually related to the amount of spread in the data) is usually addressed with confidence intervals. P values (significance tests) were first proposed as an informal heuristic to help assess how "unexpected" the observed effect size was if the true state of nature was no effect or no difference. Hypothesis testing was a modification of the significance test approach that envisioned controlling the false-positive rate of study results over many (hypothetical) repetitions of the experiment of interest. Both can be helpful but, by themselves, provide only a tunnel vision perspective on study results that ignores the clinical effects the study was conducted to measure.
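For a concrete pairing of effect size and precision, here is the textbook Wald calculation of an odds ratio and its 95% confidence interval from a hypothetical 2x2 trial table (counts invented for illustration).

```python
import numpy as np

# Hypothetical 2x2 trial outcome: rows = treatment/control,
# columns = event/no event.
a, b = 15, 85    # treatment: events, non-events
c, d = 30, 70    # control:   events, non-events

# Effect size (odds ratio) and its precision (Wald 95% CI on the log scale).
log_or = np.log((a * d) / (b * c))
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = np.exp(log_or - 1.96 * se), np.exp(log_or + 1.96 * se)
print(f"OR = {np.exp(log_or):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```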
Pharmacophore Based Virtual Screening Approach to Identify Selective PDE4B Inhibitors
Gaurav, Anand; Gautam, Vertika
2017-01-01
Phosphodiesterase 4 (PDE4) has been established as a promising target in asthma and chronic obstructive pulmonary disease. PDE4B subtype selective inhibitors are known to reduce the dose-limiting adverse effects associated with non-selective PDE4 inhibitors. This makes the development of PDE4B subtype selective inhibitors a desirable research goal. To achieve this goal, a ligand-based pharmacophore modeling approach is employed. Separate pharmacophore hypotheses for PDE4B and PDE4D inhibitors were generated using the HypoGen algorithm and 106 PDE4 inhibitors from the literature having thiopyrano[3,2-d]pyrimidine, 2-arylpyrimidine, and triazine skeletons. Suitable training and test sets were created using the molecules as per the guidelines available for the HypoGen program. The training set was used for hypothesis development while the test set was used for validation. Fisher validation was also used to test the significance of the developed hypotheses. The validated pharmacophore hypotheses for PDE4B and PDE4D inhibitors were used in sequential virtual screening of the ZINC database of drug-like molecules to identify selective PDE4B inhibitors. The hits were screened for their estimated activity and fit value. The top hit was subjected to docking into the active sites of PDE4B and PDE4D to confirm its selectivity for PDE4B. The hits are proposed to be evaluated further using in vitro assays. PMID:29201082
ON THE SUBJECT OF HYPOTHESIS TESTING
Ugoni, Antony
1993-01-01
In this paper, the definition of a statistical hypothesis is discussed, along with the considerations that need to be addressed when testing a hypothesis. In particular, the p-value, significance level, and power of a test are reviewed. Finally, the often quoted confidence interval is given a brief introduction. PMID:17989768
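The quantities the paper reviews (significance level, power, effect size, sample size) are linked by power analysis; a minimal sketch using statsmodels (not part of the paper) illustrates the relationship.

```python
# Relation between significance level, effect size, sample size and power,
# sketched for a two-sample t-test with statsmodels.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Sample size per group needed to detect a medium effect (d = 0.5)
# at alpha = 0.05 with 80% power:
n = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
# Power actually achieved with 30 subjects per group:
power = analysis.power(effect_size=0.5, nobs1=30, alpha=0.05)
print(f"n per group ~ {n:.0f}; power at n=30: {power:.2f}")
```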
Some consequences of using the Horsfall-Barratt scale for hypothesis testing
USDA-ARS?s Scientific Manuscript database
Comparing treatment effects by hypothesis testing is a common practice in plant pathology. Nearest percent estimates (NPEs) of disease severity were compared to Horsfall-Barratt (H-B) scale data to explore whether there was an effect of assessment method on hypothesis testing. A simulation model ba...
Hypothesis Testing in Task-Based Interaction
ERIC Educational Resources Information Center
Choi, Yujeong; Kilpatrick, Cynthia
2014-01-01
Whereas studies show that comprehensible output facilitates L2 learning, hypothesis testing has received little attention in Second Language Acquisition (SLA). Following Shehadeh (2003), we focus on hypothesis testing episodes (HTEs) in which learners initiate repair of their own speech in interaction. In the context of a one-way information gap…
Classroom-Based Strategies to Incorporate Hypothesis Testing in Functional Behavior Assessments
ERIC Educational Resources Information Center
Lloyd, Blair P.; Weaver, Emily S.; Staubitz, Johanna L.
2017-01-01
When results of descriptive functional behavior assessments are unclear, hypothesis testing can help school teams understand how the classroom environment affects a student's challenging behavior. This article describes two hypothesis testing strategies that can be used in classroom settings: structural analysis and functional analysis. For each…
Hypothesis Testing in the Real World
ERIC Educational Resources Information Center
Miller, Jeff
2017-01-01
Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…
Lee, Peter N
2015-03-20
The "gateway hypothesis" usually refers to the possibility that the taking up of habit A, which is considered harmless (or less harmful), may lead to the subsequent taking up of another habit, B, which is considered harmful (or more harmful). Possible approaches to designing and analysing studies to test the hypothesis are discussed. Evidence relating to the use of snus (A) as a gateway for smoking (B) is then evaluated in detail. The importance of having appropriate data available on the sequence of use of A and B and on other potential confounding factors that may lead to the taking up of B is emphasised. Where randomised trials are impractical, the preferred designs include the prospective cohort study in which ever use of A and of B is recorded at regular intervals, and the cross-sectional survey in which time of starting to use A and B is recorded. Both approaches allow time-stratified analytical methods to be used, in which, in each time period, risk of initiating B among never users of B at the start of the interval is compared according to prior use of A. Adjustment in analysis for the potential confounding factors is essential. Of 11 studies of possible relevance conducted in Sweden, Finland or Norway, only one seriously addresses potential confounding by those other factors involved in the initiation of smoking. Furthermore, 5 of the 11 studies are of a design that does not allow proper testing of the gateway hypothesis for various reasons, and the analysis is unsatisfactory, sometimes seriously, in all the remaining six. While better analyses could be attempted for some of the six studies identified as having appropriate design, the issues of confounding remain, and more studies are clearly needed. To obtain a rapid answer, a properly designed cross-sectional survey is recommended.
GeneTools--application for functional annotation and statistical hypothesis testing.
Beisvag, Vidar; Jünge, Frode K R; Bergum, Hallgeir; Jølsum, Lars; Lydersen, Stian; Günther, Clara-Cecilie; Ramampiaro, Heri; Langaas, Mette; Sandvik, Arne K; Laegreid, Astrid
2006-10-24
Modern biology has shifted from "one gene" approaches to methods for genomic-scale analysis like microarray technology, which allow simultaneous measurement of thousands of genes. This has created a need for tools facilitating interpretation of biological data in "batch" mode. However, such tools often leave the investigator with large volumes of apparently unorganized information. To meet this interpretation challenge, gene-set, or cluster, testing has become a popular analytical tool. Many gene-set testing methods and software packages are now available, most of which use a variety of statistical tests to assess the genes in a set for biological information. However, the field is still evolving, and there is a great need for "integrated" solutions. GeneTools is a web service providing access to a database that brings together information from a broad range of resources. The annotation data are updated weekly, guaranteeing that users get the most recently available data. Data submitted by the user are stored in the database, where they can easily be updated, shared between users and exported in various formats. GeneTools provides three different tools: i) the NMC Annotation Tool, which offers annotations from several databases like UniGene, Entrez Gene, SwissProt and Gene Ontology, in both single- and batch-search mode; ii) the GO Annotator Tool, where users can add new gene ontology (GO) annotations to genes of interest, and these user-defined GO annotations can be used in further analysis or exported for public distribution; and iii) eGOn, a tool for visualization and statistical hypothesis testing of GO category representation. As the first GO tool to do so, eGOn supports hypothesis testing for three different situations (the master-target situation, the mutually exclusive target-target situation and the intersecting target-target situation). An important additional function is an evidence-code filter that allows users to select the GO annotations for the analysis. GeneTools is the first "all in one" annotation tool, providing users with rapid extraction of highly relevant gene annotation data for, e.g., thousands of genes or clones at once. It allows a user to define and archive new GO annotations and it supports hypothesis testing related to GO category representations. GeneTools is freely available through www.genetools.no
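Hypothesis tests of GO category representation are commonly built on counting genes in a category; the sketch below shows one common choice, a hypergeometric over-representation test, which is illustrative and not necessarily the exact test eGOn implements. All counts are hypothetical.

```python
# Over-representation of a GO category in a target gene list, sketched with a
# hypergeometric test (one common choice; not necessarily eGOn's exact test).
from scipy.stats import hypergeom

M = 20000   # genes in the reference (master) set, hypothetical
K = 150     # reference genes annotated to the GO category, hypothetical
n = 300     # genes in the target list, hypothetical
k = 12      # target genes annotated to the category, hypothetical

# P(X >= k) under random sampling of n genes from the reference set
p_value = hypergeom.sf(k - 1, M, K, n)
print(f"over-representation p = {p_value:.3g}")
```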
ERIC Educational Resources Information Center
Kwon, Yong-Ju; Jeong, Jin-Su; Park, Yun-Bok
2006-01-01
The purpose of the present study was to test the hypothesis that student's abductive reasoning skills play an important role in the generation of hypotheses on pendulum motion tasks. To test the hypothesis, a hypothesis-generating test on pendulum motion, and a prior-belief test about pendulum motion were developed and administered to a sample of…
ERIC Educational Resources Information Center
LEWIS, EARL N.
An experiment was designed to test the hypothesis that proper use of electro-mechanical aids can relieve the teacher of a great deal of the routine work of teaching foreign languages. He would thus be allowed to extend himself either quantitatively or qualitatively in his work. This experiment uses the qualitative approach. Three groups of…
Mazerolle, M.J.
2006-01-01
In ecology, researchers frequently use observational studies to explain a given pattern, such as the number of individuals in a habitat patch, with a large number of explanatory (i.e., independent) variables. To elucidate such relationships, ecologists have long relied on hypothesis testing to include or exclude variables in regression models, although the conclusions often depend on the approach used (e.g., forward, backward, stepwise selection). Though better tools surfaced in the mid-1970s, they are still underutilized in certain fields, particularly in herpetology. This is the case for the Akaike information criterion (AIC), which is remarkably superior for model selection (i.e., variable selection) to hypothesis-based approaches. It is simple to compute and easy to understand, but more importantly, for a given data set, it provides a measure of the strength of evidence for each model that represents a plausible biological hypothesis relative to the entire set of models considered. Using this approach, one can then compute a weighted average of the estimate and standard error for any given variable of interest across all the models considered. This procedure, termed model-averaging or multimodel inference, yields precise and robust estimates. In this paper, I illustrate the use of the AIC in model selection and inference, as well as the interpretation of results analysed in this framework, with two real herpetological data sets. The AIC and measures derived from it should be routinely adopted by herpetologists. © Koninklijke Brill NV 2006.
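The AIC machinery the paper advocates is indeed simple to compute; below is a minimal sketch of AIC, delta-AIC and Akaike weights for a set of hypothetical least-squares fits (model names, residual sums of squares and parameter counts are invented).

```python
# AIC and Akaike weights for candidate least-squares models, following the
# standard formulas (values below are hypothetical RSS fits).
import numpy as np

n = 50  # observations (hypothetical)
# (model name, residual sum of squares, number of estimated parameters k)
models = [("habitat", 120.0, 3), ("habitat+rain", 110.0, 4), ("global", 108.0, 6)]

# For least squares: AIC = n*log(RSS/n) + 2k
aic = np.array([n * np.log(rss / n) + 2 * k for _, rss, k in models])
delta = aic - aic.min()
weights = np.exp(-delta / 2) / np.exp(-delta / 2).sum()

for (name, _, _), d, w in zip(models, delta, weights):
    print(f"{name:13s} dAIC = {d:5.2f}  weight = {w:.2f}")
```

Model-averaged (multimodel) estimates are then simply per-model estimates combined with these weights.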
Assessment of statistical significance and clinical relevance.
Kieser, Meinhard; Friede, Tim; Gondan, Matthias
2013-05-10
In drug development, it is well accepted that a successful study will demonstrate not only a statistically significant result but also a clinically relevant effect size. Whereas standard hypothesis tests are used to demonstrate the former, it is less clear how the latter should be established. In the first part of this paper, we consider the responder analysis approach and study the performance of locally optimal rank tests when the outcome distribution is a mixture of responder and non-responder distributions. We find that these tests are quite sensitive to their planning assumptions and therefore have no real advantage over standard tests such as the t-test and the Wilcoxon-Mann-Whitney test, which perform well overall and can be recommended for applications. In the second part, we present a new approach to the assessment of clinical relevance based on the so-called relative effect (or probabilistic index) and derive appropriate sample size formulae for the design of studies aiming at demonstrating both a statistically significant and clinically relevant effect. Referring to recent studies in multiple sclerosis, we discuss potential issues in the application of this approach. Copyright © 2012 John Wiley & Sons, Ltd.
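The relative effect (probabilistic index) the authors build on is P(X < Y) + 0.5 P(X = Y); a sketch of estimating it from two samples via the Mann-Whitney U statistic follows (illustrative data, not the paper's).

```python
# The relative effect (probabilistic index), estimated from two samples via
# the Mann-Whitney U statistic. Data are hypothetical.
import numpy as np
from scipy.stats import mannwhitneyu

control = np.array([2.1, 3.4, 2.8, 3.0, 2.5, 3.2])
treated = np.array([3.1, 3.9, 2.9, 4.2, 3.6, 3.8])

u, p = mannwhitneyu(treated, control, alternative="two-sided")
# U/(n1*n2) estimates P(treated > control) + 0.5*P(tie)
relative_effect = u / (len(treated) * len(control))
print(f"estimated relative effect = {relative_effect:.2f} (0.5 = no effect), p = {p:.3f}")
```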
Heterogeneity of Social Approach Behaviour in Williams Syndrome: The Role of Response Inhibition
ERIC Educational Resources Information Center
Little, Katie; Riby, Deborah M.; Janes, Emily; Clark, Fiona; Fleck, Ruth; Rodgers, Jacqui
2013-01-01
The developmental disorder of Williams syndrome (WS) is associated with an overfriendly personality type, including an increased tendency to approach strangers. This atypical social approach behaviour (SAB) has been linked to two potential theories: the amygdala hypothesis and the frontal lobe hypothesis. The current study aimed to investigate…
Making Knowledge Delivery Failsafe: Adding Step Zero in Hypothesis Testing
ERIC Educational Resources Information Center
Pan, Xia; Zhou, Qiang
2010-01-01
Knowledge of statistical analysis is increasingly important for professionals in modern business. For example, hypothesis testing is one of the critical topics for quality managers and team workers in Six Sigma training programs. Delivering the knowledge of hypothesis testing effectively can be an important step for the incapable learners or…
An experiment with spectral analysis of emotional speech affected by orthodontic appliances
NASA Astrophysics Data System (ADS)
Přibil, Jiří; Přibilová, Anna; Ďuračková, Daniela
2012-11-01
The contribution describes the effect of fixed and removable orthodontic appliances on the spectral properties of emotional speech. Spectral changes were analyzed and evaluated by spectrograms and mean Welch's periodograms. This alternative to the standard listening test makes it possible to obtain an objective comparison based on statistical analysis with ANOVA and hypothesis tests. The results of the analysis, performed on short sentences from a female speaker in four emotional states (joyous, sad, angry, and neutral), show that the removable orthodontic appliance in particular affects the spectrograms of the produced speech.
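A rough sketch of the analysis pipeline the abstract describes is given below: Welch periodograms summarized as band power and compared across conditions with one-way ANOVA. The signals, band limits and parameters are invented stand-ins, not the study's settings.

```python
# Comparing spectral content of speech-like signals via Welch periodograms,
# then testing band-power differences with one-way ANOVA. Signals are synthetic.
import numpy as np
from scipy.signal import welch
from scipy.stats import f_oneway

fs = 16000
rng = np.random.default_rng(0)

def band_power(sig, lo=300.0, hi=3400.0):
    f, pxx = welch(sig, fs=fs, nperseg=1024)
    return pxx[(f >= lo) & (f <= hi)].mean()

# Three hypothetical recording conditions, several short "sentences" each
cond_a = [band_power(rng.normal(size=fs)) for _ in range(8)]
cond_b = [band_power(rng.normal(size=fs) * 1.2) for _ in range(8)]
cond_c = [band_power(rng.normal(size=fs) * 0.9) for _ in range(8)]

stat, p = f_oneway(cond_a, cond_b, cond_c)
print(f"ANOVA on band power: F = {stat:.2f}, p = {p:.3g}")
```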
True or false: do 5-year-olds understand belief?
Fabricius, William V; Boyer, Ty W; Weimer, Amy A; Carroll, Kathleen
2010-11-01
In 3 studies (N = 188) we tested the hypothesis that children use a perceptual access approach to reason about mental states before they understand beliefs. The perceptual access hypothesis predicts a U-shaped developmental pattern of performance in true belief tasks, in which 3-year-olds who reason about reality should succeed, 4- to 5-year-olds who use perceptual access reasoning should fail, and older children who use belief reasoning should succeed. The results of Study 1 revealed the predicted pattern in 2 different true belief tasks. The results of Study 2 disconfirmed several alternate explanations based on possible pragmatic and inhibitory demands of the true belief tasks. In Study 3, we compared 2 methods of classifying individuals according to which 1 of the 3 reasoning strategies (reality reasoning, perceptual access reasoning, belief reasoning) they used. The 2 methods gave converging results. Both methods indicated that the majority of children used the same approach across tasks and that it was not until after 6 years of age that most children reasoned about beliefs. We conclude that because most prior studies have failed to detect young children's use of perceptual access reasoning, they have overestimated their understanding of false beliefs. We outline several theoretical implications that follow from the perceptual access hypothesis.
Boland, Elaine M; Stange, Jonathan P; Labelle, Denise R; Shapero, Benjamin G; Weiss, Rachel B; Abramson, Lyn Y; Alloy, Lauren B
2016-05-01
The Behavioral Approach System (BAS)/Reward Hypersensitivity Theory and the Social Zeitgeber Theory are two biopsychosocial theories of bipolar spectrum disorders (BSD) that may work together to explain affective dysregulation. The present study examined whether BAS sensitivity is associated with affective symptoms via a) increased social rhythm disruption in response to BAS-relevant life events, or b) greater exposure to BAS events leading to social rhythm disruption and subsequent symptoms. Results indicated that high BAS individuals were more likely to experience social rhythm disruption following BAS-relevant events. Social rhythm disruption mediated the association between BAS-relevant events and symptoms (hypothesis a). High BAS individuals experienced significantly more BAS-relevant events, which predicted greater social rhythm disruption, which predicted greater levels of affective symptoms (hypothesis b). Individuals at risk for BSD may be sensitive to BAS-relevant stimuli, experience more BAS-relevant events, and experience affective dysregulation due to the interplay of the BAS and circadian rhythms.
Testing of Hypothesis in Equivalence and Non Inferiority Trials-A Concept.
Juneja, Atul; Aggarwal, Abha R; Adhikari, Tulsi; Pandey, Arvind
2016-04-01
Establishing the appropriate hypothesis is one of the important steps in carrying out statistical tests and analyses, and understanding it is important for interpreting the results of statistical analysis. The current communication attempts to convey the concept of hypothesis testing in non-inferiority and equivalence trials, where the null hypothesis is the reverse of that set up for conventional superiority trials. As in superiority trials, it is the rejection of this null hypothesis that establishes the fact the researcher intends to prove. It is important to mention that equivalence or non-inferiority cannot be proved by accepting the null hypothesis of no difference. Hence, establishing the appropriate statistical hypothesis is extremely important for arriving at a meaningful conclusion for the objectives set in research.
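The reversed null hypothesis described here is commonly operationalized by the two one-sided tests (TOST) procedure; a minimal sketch under an assumed equivalence margin follows. The data and margin are hypothetical; the paper presents the concept, not this code.

```python
# Two one-sided tests (TOST) for equivalence: the null is "difference at least
# delta in magnitude" and is rejected only if BOTH one-sided tests reject.
import numpy as np
from scipy import stats

delta = 1.0  # pre-specified equivalence margin (hypothetical)
a = np.array([10.1, 9.8, 10.4, 10.0, 9.9, 10.2, 10.3, 9.7])
b = np.array([10.0, 10.2, 9.9, 10.1, 10.3, 9.8, 10.0, 10.1])

diff = a.mean() - b.mean()
se = np.sqrt(a.var(ddof=1)/len(a) + b.var(ddof=1)/len(b))
df = len(a) + len(b) - 2  # simple approximation; Welch df also common

t_lower = (diff + delta) / se   # tests H0: diff <= -delta
t_upper = (diff - delta) / se   # tests H0: diff >= +delta
p_lower = 1 - stats.t.cdf(t_lower, df)
p_upper = stats.t.cdf(t_upper, df)
p_tost = max(p_lower, p_upper)
print(f"TOST p = {p_tost:.4f}: equivalence {'shown' if p_tost < 0.05 else 'not shown'}")
```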
The 1999 Mw 7.1 Hector Mine, California, earthquake: A test of the stress shadow hypothesis?
Harris, R.A.; Simpson, R.W.
2002-01-01
We test the stress shadow hypothesis for large earthquake interactions by examining the relationship between two large earthquakes that occurred in the Mojave Desert of southern California, the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine earthquakes. We want to determine if the 1999 Hector Mine earthquake occurred at a location where the Coulomb stress was increased (earthquake advance, stress trigger) or decreased (earthquake delay, stress shadow) by the previous large earthquake. Using four models of the Landers rupture and a range of possible hypocentral planes for the Hector Mine earthquake, we discover that most scenarios yield a Landers-induced relaxation (stress shadow) on the Hector Mine hypocentral plane. Although this result would seem to weigh against the stress shadow hypothesis, the results become considerably more uncertain when the effects of a nearby Landers aftershock, the 1992 ML 5.4 Pisgah earthquake, are taken into account. We calculate the combined static Coulomb stress changes due to the Landers and Pisgah earthquakes to range from -0.3 to +0.3 MPa (- 3 to +3 bars) at the possible Hector Mine hypocenters, depending on choice of rupture model and hypocenter. These varied results imply that the Hector Mine earthquake does not provide a good test of the stress shadow hypothesis for large earthquake interactions. We use a simple approach, that of static dislocations in an elastic half-space, yet we still obtain a wide range of both negative and positive Coulomb stress changes. Our findings serve as a caution that more complex models purporting to explain the triggering or shadowing relationship between the 1992 Landers and 1999 Hector Mine earthquakes need to also consider the parametric and geometric uncertainties raised here.
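For reference, the Coulomb stress change used in such calculations is, in its simplest form, the shear stress change plus an effective friction coefficient times the normal stress change (unclamping positive). A trivial sketch, with the friction value assumed for illustration:

```python
# dCFF = d_tau + mu' * d_sigma_n, the standard static Coulomb failure stress
# change; mu' = 0.4 is a commonly assumed effective friction, not this paper's.
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Return dCFF in the same units as the inputs (e.g., MPa)."""
    return d_tau + mu_eff * d_sigma_n

# e.g., a 0.1 MPa shear increase combined with 0.5 MPa of clamping:
print(coulomb_stress_change(0.1, -0.5))  # -> -0.1 MPa, a stress shadow
```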
Meta-Analysis of Rare Binary Adverse Event Data
Bhaumik, Dulal K.; Amatya, Anup; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D.
2013-01-01
We examine the use of fixed-effects and random-effects moment-based meta-analytic methods for analysis of binary adverse event data. Special attention is paid to the case of rare adverse events, which are commonly encountered in routine practice. We study estimation of model parameters and between-study heterogeneity. In addition, we examine traditional approaches to hypothesis testing of the average treatment effect and detection of the heterogeneity of treatment effect across studies. We derive three new methods: a simple (unweighted) average treatment effect estimator, a new heterogeneity estimator, and a parametric bootstrap test for heterogeneity. We then study the statistical properties of both the traditional and new methods via simulation. We find that, in general, moment-based estimators of combined treatment effects and heterogeneity are biased and the degree of bias is proportional to the rarity of the event under study. The new methods eliminate much, but not all, of this bias. The various estimators and hypothesis testing methods are then compared and contrasted using an example dataset on treatment of stable coronary artery disease. PMID:23734068
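As a concrete instance of the traditional moment-based machinery the paper examines, the sketch below pools log odds ratios by inverse variance with a DerSimonian-Laird heterogeneity estimate and a 0.5 continuity correction for zero cells; the study counts are hypothetical, and the paper's own new estimators differ.

```python
# Traditional moment-based random-effects meta-analysis of log odds ratios
# (inverse-variance pooling, DerSimonian-Laird tau^2, 0.5 continuity
# correction for zero cells). Study counts are hypothetical rare-event data.
import numpy as np

# (events_trt, n_trt, events_ctl, n_ctl) per study
studies = [(1, 200, 3, 200), (0, 150, 2, 150), (2, 300, 5, 300)]

log_or, var = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    if 0 in (a, b, c, d):          # continuity correction for zero cells
        a, b, c, d = a + .5, b + .5, c + .5, d + .5
    log_or.append(np.log(a * d / (b * c)))
    var.append(1/a + 1/b + 1/c + 1/d)
log_or, var = np.array(log_or), np.array(var)

w = 1 / var                                    # fixed-effect weights
theta_fe = (w * log_or).sum() / w.sum()
q = (w * (log_or - theta_fe) ** 2).sum()       # Cochran's Q
k = len(studies)
tau2 = max(0.0, (q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))  # DL
w_re = 1 / (var + tau2)
theta_re = (w_re * log_or).sum() / w_re.sum()
print(f"pooled OR (random effects) = {np.exp(theta_re):.2f}, tau^2 = {tau2:.3f}")
```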
Ecomorphology of parasite attachment: experiments with feather lice.
Bush, Sarah E; Sohn, Edward; Clayton, Dale H
2006-02-01
The host specificity of some parasites can be reinforced by morphological specialization for attachment to mobile hosts. For example, ectoparasites with adaptations for attaching to hosts of a particular size might not be able to remain attached to larger or smaller hosts. This hypothesis is suggested by the positive correlation documented between the body sizes of many parasites and their hosts. We adopted an ecomorphological approach to test the attachment hypothesis. We tested the ability of host-specific feather lice (Phthiraptera: Ischnocera) to attach to 6 novel species of pigeons and doves that vary in size by nearly 2 orders of magnitude. Surprisingly, Rock Pigeon lice (Columbicola columbae) remained attached equally well to all 6 novel host species. We tested the relative importance of 3 factors that could facilitate louse attachment: whole-body insertion, tarsal claw use, and mandible use. Insertion, per se, was not necessary for attachment. However, insertion on coarse feathers of large hosts allowed lice to access feather barbules with their mandibles. Mandible use was a key component of attachment regardless of feather size. Attachment constraints do not appear to reinforce host specificity in this system.
Conceptual biology, hypothesis discovery, and text mining: Swanson's legacy.
Bekhuis, Tanja
2006-04-03
Innovative biomedical librarians and information specialists who want to expand their roles as expert searchers need to know about profound changes in biology and parallel trends in text mining. In recent years, conceptual biology has emerged as a complement to empirical biology. This is partly in response to the availability of massive digital resources such as the network of databases for molecular biologists at the National Center for Biotechnology Information. Developments in text mining and hypothesis discovery systems based on the early work of Swanson, a mathematician and information scientist, are coincident with the emergence of conceptual biology. Very little has been written to introduce biomedical digital librarians to these new trends. In this paper, background for data and text mining, as well as for knowledge discovery in databases (KDD) and in text (KDT), is presented, then a brief review of Swanson's ideas, followed by a discussion of recent approaches to hypothesis discovery and testing. 'Testing' in the context of text mining involves partially automated methods for finding evidence in the literature to support hypothetical relationships. Concluding remarks follow regarding (a) the limits of current strategies for evaluation of hypothesis discovery systems and (b) the role of literature-based discovery in concert with empirical research. A report of an informatics-driven literature review for biomarkers of systemic lupus erythematosus is mentioned. Swanson's vision of the hidden value in the literature of science and, by extension, in biomedical digital databases, is still remarkably generative for information scientists, biologists, and physicians.
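Swanson's ABC model can be sketched in a few lines: a candidate hidden relationship is a term C that shares intermediate terms B with a start term A without co-occurring with A directly. The toy term graph below mirrors his famous fish oil and Raynaud's example; it is an illustration, not a real mining system.

```python
# Toy sketch of Swanson-style ABC hypothesis discovery: find terms C linked
# to a start term A only through shared intermediates B. Term lists invented.
cooccur = {
    "fish_oil": {"blood_viscosity", "platelet_aggregation"},
    "blood_viscosity": {"fish_oil", "raynauds_syndrome"},
    "platelet_aggregation": {"fish_oil", "raynauds_syndrome"},
    "raynauds_syndrome": {"blood_viscosity", "platelet_aggregation"},
}

def abc_candidates(a, cooccur):
    candidates = {}
    for b in cooccur[a]:                      # intermediates B
        for c in cooccur.get(b, ()):
            if c != a and c not in cooccur[a]:  # C never co-occurs with A
                candidates.setdefault(c, set()).add(b)
    return candidates  # C -> supporting B terms

print(abc_candidates("fish_oil", cooccur))
# {'raynauds_syndrome': {'blood_viscosity', 'platelet_aggregation'}}
```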
From Shattered Assumptions to Weakened Worldviews: Trauma Symptoms Signal Anxiety Buffer Disruption.
Edmondson, Donald; Chaudoir, Stephenie R; Mills, Mary Alice; Park, Crystal L; Holub, Julie; Bartkowiak, Jennifer M
2011-01-01
The fundamental assertion of worldview-based models of posttraumatic stress disorder is that trauma symptoms result when traumatic experiences cannot be readily assimilated into previously held worldviews. In two studies, we test the anxiety buffer disruption hypothesis, which states that trauma symptoms result from the disruption of normal death anxiety-buffering functions of worldview. In Study 1, participants with trauma symptoms greater than the cutoff for PTSD evinced greater death-thought accessibility than those with sub-clinical or negligible symptoms after a reminder of death. In Study 2, participants with clinically significant trauma symptoms showed no evidence of worldview defense though death-thoughts were accessible. These results support the anxiety buffer disruption hypothesis, and suggest an entirely new approach to experimental PTSD research.
Recent tests of the equilibrium-point hypothesis (lambda model).
Feldman, A G; Ostry, D J; Levin, M F; Gribble, P L; Mitnitski, A B
1998-07-01
The lambda model of the equilibrium-point hypothesis (Feldman & Levin, 1995) is an approach to motor control which, like physics, is based on a logical system coordinating empirical data. The model has gone through an interesting period. On one hand, several nontrivial predictions of the model have been successfully verified in recent studies. In addition, the explanatory and predictive capacity of the model has been enhanced by its extension to multimuscle and multijoint systems. On the other hand, claims have recently appeared suggesting that the model should be abandoned. The present paper focuses on these claims and concludes that they are unfounded. Much of the experimental data that have been used to reject the model are actually consistent with it.
Murray, Kris A; Skerratt, Lee F; Garland, Stephen; Kriticos, Darren; McCallum, Hamish
2013-01-01
The pandemic amphibian disease chytridiomycosis often exhibits strong seasonality in both prevalence and disease-associated mortality once it becomes endemic. One hypothesis that could explain this temporal pattern is that simple weather-driven pathogen proliferation (population growth) is a major driver of chytridiomycosis disease dynamics. Despite various elaborations of this hypothesis in the literature for explaining amphibian declines (e.g., the chytrid thermal-optimum hypothesis), it has not been formally tested on infection patterns in the wild. In this study we developed a simple process-based model to simulate the growth of the pathogen Batrachochytrium dendrobatidis (Bd) under varying weather conditions to provide an a priori test of a weather-linked pathogen proliferation hypothesis for endemic chytridiomycosis. We found strong support for several predictions of the proliferation hypothesis when applied to our model species, Litoria pearsoniana, sampled across multiple sites and years: the weather-driven simulations of pathogen growth potential (represented as a growth index in the 30 days prior to sampling; GI30) were positively related to both the prevalence and intensity of Bd infections, which were themselves strongly and positively correlated. In addition, a machine-learning classifier achieved ~72% success in classifying positive qPCR results when utilising just three informative predictors: 1) GI30, 2) frog body size, and 3) rain on the day of sampling. Hence, while intrinsic traits of the individuals sampled (species, size, sex) and nuisance sampling variables (rainfall when sampling) influenced infection patterns obtained when sampling via qPCR, our results also strongly suggest that weather-linked pathogen proliferation plays a key role in the infection dynamics of endemic chytridiomycosis in our study system. Predictive applications of the model include surveillance design, outbreak preparedness and response, climate change scenario modelling and the interpretation of historical patterns of amphibian decline.
Whiplash and the compensation hypothesis.
Spearing, Natalie M; Connelly, Luke B
2011-12-01
Review article. To explain why the evidence that compensation-related factors lead to worse health outcomes is not compelling, either in general, or in the specific case of whiplash. There is a common view that compensation-related factors lead to worse health outcomes ("the compensation hypothesis"), despite the presence of important, and unresolved sources of bias. The empirical evidence on this question has ramifications for the design of compensation schemes. Using studies on whiplash, this article outlines the methodological problems that impede attempts to confirm or refute the compensation hypothesis. Compensation studies are prone to measurement bias, reverse causation bias, and selection bias. Errors in measurement are largely due to the latent nature of whiplash injuries and health itself, a lack of clarity over the unit of measurement (specific factors, or "compensation"), and a lack of appreciation for the heterogeneous qualities of compensation-related factors and schemes. There has been a failure to acknowledge and empirically address reverse causation bias, or the likelihood that poor health influences the decision to pursue compensation: it is unclear if compensation is a cause or a consequence of poor health, or both. Finally, unresolved selection bias (and hence, confounding) is evident in longitudinal studies and natural experiments. In both cases, between-group differences have not been addressed convincingly. The nature of the relationship between compensation-related factors and health is unclear. Current approaches to testing the compensation hypothesis are prone to several important sources of bias, which compromise the validity of their results. Methods that explicitly test the hypothesis and establish whether or not a causal relationship exists between compensation factors and prolonged whiplash symptoms are needed in future studies.
An Exercise for Illustrating the Logic of Hypothesis Testing
ERIC Educational Resources Information Center
Lawton, Leigh
2009-01-01
Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
A powerful and efficient set test for genetic markers that handles confounders
Listgarten, Jennifer; Lippert, Christoph; Kang, Eun Yong; Xiang, Jing; Kadie, Carl M.; Heckerman, David
2013-01-01
Motivation: Approaches for testing sets of variants, such as a set of rare or common variants within a gene or pathway, for association with complex traits are important. In particular, set tests allow for aggregation of weak signal within a set, can capture interplay among variants and reduce the burden of multiple hypothesis testing. Until now, these approaches did not address confounding by family relatedness and population structure, a problem that is becoming more important as larger datasets are used to increase power. Results: We introduce a new approach for set tests that handles confounders. Our model is based on the linear mixed model and uses two random effects—one to capture the set association signal and one to capture confounders. We also introduce a computational speedup for two random-effects models that makes this approach feasible even for extremely large cohorts. Using this model with both the likelihood ratio test and score test, we find that the former yields more power while controlling type I error. Application of our approach to richly structured Genetic Analysis Workshop 14 data demonstrates that our method successfully corrects for population structure and family relatedness, whereas application of our method to a 15 000 individual Crohn’s disease case–control cohort demonstrates that it additionally recovers genes not recoverable by univariate analysis. Availability: A Python-based library implementing our approach is available at http://mscompbio.codeplex.com. Contact: jennl@microsoft.com or lippert@microsoft.com or heckerma@microsoft.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23599503
Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data
2014-12-01
The surviving abstract fragments describe a surrogate-data approach: surrogate data consistent with the null hypothesis of linearity are used to estimate the distribution of a test statistic that can discriminate between the null and alternative hypotheses. [Figure caption residue: test for nonlinearity; the histogram is generated using the surrogate data, and the statistic of the original time series is represented by the solid line.]
Palmatier, Matthew I; Kellicut, Marissa R; Brianna Sheppard, A; Brown, Russell W; Robinson, Donita L
2014-11-01
Nicotine is a psychomotor stimulant with 'reinforcement enhancing' effects--the actions of nicotine in the brain increase responding for non-nicotine rewards. We hypothesized that this latter effect of nicotine depends on increased incentive properties of anticipatory cues; consistent with this hypothesis, multiple laboratories have reported that nicotine increases sign tracking, i.e. approach to a conditioned stimulus (CS), in Pavlovian conditioned-approach tasks. Incentive motivation and sign tracking are mediated by mesolimbic dopamine (DA) transmission, and nicotine facilitates mesolimbic DA release. Therefore, we hypothesized that the incentive-promoting effects of nicotine would be impaired by DA antagonists. To test this hypothesis, separate groups of rats were injected with nicotine (0.4 mg/kg base) or saline prior to Pavlovian conditioning sessions in which a CS (30-s illumination of a light or presentation of a lever) was immediately followed by a sweet reward delivered in an adjacent location. Both saline- and nicotine-pretreated rats exhibited similar levels of conditioned approach to the reward location (goal tracking), but nicotine pretreatment significantly increased approach to the CS (sign tracking), regardless of type (lever or light). The DA D1 antagonist SCH-23390 and the DA D2/3 antagonist eticlopride reduced conditioned approach in all rats, but specifically reduced goal tracking in the saline-pretreated rats and sign tracking in the nicotine-pretreated rats. The non-selective DA antagonist flupenthixol reduced sign tracking in nicotine rats at all doses tested; however, only the highest dose of flupenthixol reduced goal tracking in both nicotine and saline groups. The reductions in conditioned approach behavior, especially those by SCH-23390, were dissociated from simple motor suppressant effects of the antagonists. These experiments are the first to investigate the effects of dopaminergic drugs on the facilitation of sign tracking engendered by nicotine, and they implicate dopaminergic systems both in conditioned approach and in the incentive-promoting effects of nicotine. Copyright © 2014 Elsevier Inc. All rights reserved.
Parrish, Rudolph S.; Smith, Charles N.
1990-01-01
A quantitative method is described for testing whether model predictions fall within a specified factor of true values. The technique is based on classical theory for confidence regions on unknown population parameters and can be related to hypothesis testing in both univariate and multivariate situations. A capability index is defined that can be used as a measure of predictive capability of a model, and its properties are discussed. The testing approach and the capability index should facilitate model validation efforts and permit comparisons among competing models. An example is given for a pesticide leaching model that predicts chemical concentrations in the soil profile.
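One plausible reading of such a test, checking whether predictions fall within a factor f of true values, is an equivalence-style check on the log prediction/observation ratios; the sketch below is an interpretation under that assumption, not the paper's exact construction, and all numbers are invented.

```python
# Sketch: are predictions within a factor f of observations?
# |log(pred/true)| < log f, checked via a 90% CI on the mean log ratio.
import numpy as np
from scipy import stats

f = 2.0                                # acceptable factor (hypothetical)
pred = np.array([1.2, 0.8, 2.5, 3.1, 0.5, 1.9])
true = np.array([1.0, 1.1, 2.2, 2.8, 0.7, 1.5])

r = np.log(pred / true)
lo, hi = stats.t.interval(0.90, len(r) - 1, loc=r.mean(), scale=stats.sem(r))
within = (lo > -np.log(f)) and (hi < np.log(f))
print(f"90% CI for mean log ratio: ({lo:.3f}, {hi:.3f}); within factor {f}: {within}")
```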
The role of responsibility and fear of guilt in hypothesis-testing.
Mancini, Francesco; Gangemi, Amelia
2006-12-01
Recent theories argue that both perceived responsibility and fear of guilt increase obsessive-like behaviours. We propose that hypothesis-testing might account for this effect. Both perceived responsibility and fear of guilt would influence subjects' hypothesis-testing, by inducing a prudential style. This style implies focusing on and confirming the worst hypothesis, and reiterating the testing process. In our experiment, we manipulated the responsibility and fear of guilt of 236 normal volunteers who executed a deductive task. The results show that perceived responsibility is the main factor that influenced individuals' hypothesis-testing. Fear of guilt has however a significant additive effect. Guilt-fearing participants preferred to carry on with the diagnostic process, even when faced with initial favourable evidence, whereas participants in the responsibility condition only did so when confronted with an unfavourable evidence. Implications for the understanding of obsessive-compulsive disorder (OCD) are discussed.
Gary D. Grossman; Robert E. Ratajczak; C. Michael Wagner; J. Todd Petty
2010-01-01
1. We used information theoretic statistics [Akaikeâs Information Criterion (AIC)] and regression analysis in a multiple hypothesis testing approach to assess the processes capable of explaining long-term demographic variation in a lightly exploited brook trout population in Ball Creek, NC. We sampled a 100-m-long second-order site during both spring and autumn 1991â...
Optimal Sensor Scheduling for Multiple Hypothesis Testing
1981-09-01
Naval Research, under contract N00014-77-0532 is gratpfully acknowledged. 2 Laboratory for Information and Decision Systems , MIT Room 35-213, Cambridge...treat the more general problem [9,10]. However, two common threads connect these approaches: they obtain feedback laws mapping posterior destributions ...objective of a detection or identification algorithm is to produce correct estimates of the true state of a system . It is also bene- ficial if these
ERIC Educational Resources Information Center
Giorgis, Scott; Mahlen, Nancy; Anne, Kirk
2017-01-01
The augmented reality (AR) sandbox bridges the gap between two-dimensional (2D) and three-dimensional (3D) visualization by projecting a digital topographic map onto a sandbox landscape. As the landscape is altered, the map dynamically adjusts, providing an opportunity to discover how to read topographic maps. We tested the hypothesis that the AR…
Synthetic Lethality as a Targeted Approach to Advanced Prostate Cancer
2013-03-01
cell line was derived from primary human prostate epithelial cells by transformation with human papilloma virus. While not tumorigenic, they do...normal cells and tissues has no significant adverse effects. Inhibition of PKCδ in human and murine cells containing an activated Ras protein, however...initiates rapid and profound apoptosis. In this work, we are testing the hypothesis that inhibition or down-regulation of PKCδ in human and murine
Shojaedini, Seyed Vahab; Heydari, Masoud
2014-10-01
Shape and movement features of sperms are important parameters for infertility study and treatment. In this article, a new method is introduced for characterizing sperms in microscopic videos. In this method, a hypothesis framework is first defined to distinguish sperms from other particles in the captured video. A decision about each hypothesis is then made in the following steps: selecting some primary regions as candidates for sperms by watershed-based segmentation, pruning some false candidates across successive frames using graph theory concepts, and finally confirming correct sperms by using their movement trajectories. Performance of the proposed method is evaluated on real captured images belonging to semen with a high density of sperms. The obtained results show the proposed method may detect 97% of sperms in the presence of 5% false detections and track 91% of moving sperms. Furthermore, it can be shown that the better characterization of sperms in the proposed algorithm does not lead to extracting more false sperms compared to some existing approaches.
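The candidate-selection step (watershed-based segmentation of bright particles) can be sketched with the standard scikit-image recipe; the threshold and parameters below are hypothetical stand-ins for the paper's settings, and the random array stands in for a real video frame.

```python
# Sketch of candidate selection: watershed segmentation of bright particles
# in a frame (standard scikit-image recipe; parameters hypothetical).
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

frame = np.random.rand(128, 128)          # stand-in for a video frame
binary = frame > 0.95                     # threshold bright particles

distance = ndi.distance_transform_edt(binary)
coords = peak_local_max(distance, min_distance=3, labels=binary)
markers = np.zeros(distance.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

labels = watershed(-distance, markers, mask=binary)
print(f"{labels.max()} candidate regions")  # candidates to prune across frames
```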
The impact of Lean bundles on hospital performance: does size matter?
Al-Hyari, Khalil; Abu Hammour, Sewar; Abu Zaid, Mohammad Khair Saleem; Haffar, Mohamed
2016-10-10
Purpose The purpose of this paper is to study the effect of the implementation of Lean bundles on hospital performance in private hospitals in Jordan and to evaluate how much the size of the organization can affect the relationship between Lean bundles implementation and hospital performance. Design/methodology/approach The research uses a quantitative method (descriptive and hypothesis testing). Three statistical techniques were adopted to analyse the data: structural equation modeling and multi-group analysis were used to examine the research's hypotheses and to perform the required statistical analysis of the data from the survey, while reliability analysis and confirmatory factor analysis were used to test construct validity, reliability and measurement loadings. Findings Lean bundles have been identified as an effective approach that can dramatically improve the organizational performance of private hospitals in Jordan. The main Lean bundles - just in time, human resource management, and total quality management - are applicable to large, small and medium hospitals without significant differences in advantages that depend on size. Originality/value To the researchers' best knowledge, this is the first research that studies the impact of Lean bundles implementation in the healthcare sector in Jordan. This research also makes a significant contribution for decision makers in healthcare to increase their awareness of Lean bundles.
Mass extinctions drove increased global faunal cosmopolitanism on the supercontinent Pangaea.
Button, David J; Lloyd, Graeme T; Ezcurra, Martín D; Butler, Richard J
2017-10-10
Mass extinctions have profoundly impacted the evolution of life through not only reducing taxonomic diversity but also reshaping ecosystems and biogeographic patterns. In particular, they are considered to have driven increased biogeographic cosmopolitanism, but quantitative tests of this hypothesis are rare and have not explicitly incorporated information on evolutionary relationships. Here we quantify faunal cosmopolitanism using a phylogenetic network approach for 891 terrestrial vertebrate species spanning the late Permian through Early Jurassic. This key interval witnessed the Permian-Triassic and Triassic-Jurassic mass extinctions, the onset of fragmentation of the supercontinent Pangaea, and the origins of dinosaurs and many modern vertebrate groups. Our results recover significant increases in global faunal cosmopolitanism following both mass extinctions, driven mainly by new, widespread taxa, leading to homogenous 'disaster faunas'. Cosmopolitanism subsequently declines in post-recovery communities. These shared patterns in both biotic crises suggest that mass extinctions have predictable influences on animal distribution and may shed light on biodiversity loss in extant ecosystems. Mass extinctions are thought to produce 'disaster faunas', communities dominated by a small number of widespread species. Here, Button et al. develop a phylogenetic network approach to test this hypothesis and find that mass extinctions did increase faunal cosmopolitanism across Pangaea during the late Palaeozoic and early Mesozoic.
NASA Astrophysics Data System (ADS)
Tang, G.; Zhang, M. G.; Liu, C.; Zhou, Z.; Chen, W.; Slik, J. W. F.
2014-05-01
The Tropical Niche Conservatism Hypothesis (TCH) tries to explain the generally observed latitudinal gradient of increasing species diversity towards the tropics. To date, few studies have used phylogenetic approaches to assess its validity, even though such methods are especially suited to detecting changes in niche structure. We test the TCH using modeled distributions of 1898 woody species in Yunnan Province (southwest China) in combination with a family-level phylogeny. Contrary to prediction, species richness and phylogenetic diversity did not show a latitudinal gradient, but identified two high-diversity zones, one in northwest and one in south Yunnan. Despite this, the underlying residual phylogenetic diversity showed a clear decline away from the tropics, while the species composition became progressively more phylogenetically clustered towards the north. These latitudinal changes were strongly associated with more extreme temperature variability and declining precipitation and soil water availability, especially during the dry season. Our results suggest that the climatically more extreme conditions outside the tropics require adaptations for successful colonization, most likely related to the plant hydraulic system, that have been acquired by only a limited number of phylogenetically closely related plant lineages. We emphasize the importance of phylogenetic approaches for testing the TCH.
Strand-seq: a unifying tool for studies of chromosome segregation
Falconer, Ester; Lansdorp, Peter M.
2013-01-01
Non-random segregation of sister chromatids has been implicated in helping to specify daughter cell fate (the Silent Sister Hypothesis [1]) or in protecting the genome of long-lived stem cells (the Immortal Strand Hypothesis [2]). The idea that sister chromatids are non-randomly segregated into specific daughter cells is only marginally supported by data from sporadic and often contradictory studies. As a result, the field has moved forward rather slowly. The advent of the ability to directly label and differentiate sister chromatids in vivo using fluorescence in situ hybridization [3] was a significant advance for such studies. However, this approach is limited by the need for large tracts of unidirectional repeats on chromosomes and the reliance on quantitative imaging of fluorescent probes and rigorous statistical analysis to discern between the two competing hypotheses. A novel method called Strand-seq, which uses next-generation sequencing to assay sister chromatid inheritance patterns independently for each chromosome [4], offers a comprehensive approach to test for non-random segregation. In addition, Strand-seq enables studies on the deposition of chromatin marks in relation to DNA replication. This method is expected to help unify the field by testing previous claims of non-random segregation in an unbiased way in many model systems in vitro and in vivo. PMID:23665005
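The statistical question behind both hypotheses reduces to whether observed strand inheritance deviates from the 50:50 expectation of random segregation; a minimal sketch with a binomial test on hypothetical counts:

```python
# Test whether template-strand inheritance deviates from the 50:50 expectation
# of random segregation (counts are hypothetical, not the paper's data).
from scipy.stats import binomtest

n_cells = 40        # informative cell divisions scored
n_same_strand = 29  # divisions where the same template strand went to one daughter

result = binomtest(n_same_strand, n_cells, p=0.5, alternative="two-sided")
print(f"p = {result.pvalue:.4f}")  # small p suggests non-random segregation
```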
The PMHT: solutions for some of its problems
NASA Astrophysics Data System (ADS)
Wieneke, Monika; Koch, Wolfgang
2007-09-01
Tracking multiple targets in a cluttered environment is a challenging task. Probabilistic Multiple Hypothesis Tracking (PMHT) is an efficient approach for dealing with it. Essentially, PMHT is based on the method of Expectation-Maximization for handling association conflicts. Linearity in the number of targets and measurements is the main motivation for a further development and extension of this methodology. Unfortunately, compared with the Probabilistic Data Association Filter (PDAF), PMHT has not yet shown its superiority in terms of track-loss statistics. Furthermore, the problem of track extraction and deletion is apparently not yet satisfactorily solved within this framework. Four properties of PMHT are responsible for its problems in track maintenance: Non-Adaptivity, Hospitality, Narcissism and Local Maxima.1,2 In this work we present a solution for each of them and derive an improved PMHT by integrating the solutions into the PMHT formalism. The new PMHT is evaluated by Monte-Carlo simulations. A sequential likelihood-ratio (LR) test for track extraction has been developed and already integrated into the framework of traditional Bayesian Multiple Hypothesis Tracking.3 As a multi-scan approach, the PMHT methodology also has the potential for track extraction. In this paper an analogous integration of a sequential LR test into the PMHT framework is proposed. We present an LR formula for track extraction and deletion using the PMHT update formulae. As PMHT provides all required ingredients for a sequential LR calculation, the LR is a by-product of the PMHT iteration process. Therefore the resulting update formula for the sequential LR test supports the development of track-before-detect algorithms for PMHT. The approach is illustrated by a simple example.
Zhu, Huaping; Sun, Yaoru; Zeng, Jinhua; Sun, Hongyu
2011-05-01
Previous studies have suggested that dysfunction of the human mirror neuron system (hMNS) plays an important role in autism spectrum disorder (ASD). In this work, we propose a novel training program from our interdisciplinary research to improve the mirror neuron functions of autistic individuals by using a BCI system with virtual reality technology. This is a promising approach for autistic individuals to learn and develop social communication in a VR environment. A test method for this hypothesis is also provided. Copyright © 2011 Elsevier Ltd. All rights reserved.
Micro-structure and Swelling Behaviour of Compacted Clayey Soils: A Quantitative Approach
NASA Astrophysics Data System (ADS)
Ferber, Valéry; Auriol, Jean-Claude; David, Jean-Pierre
In this paper, the clay aggregate volume and inter-aggregate volume in compacted clayey soils are quantified, on the basis of simple hypotheses, using only their water content and dry density. Swelling tests on a highly plastic clay are then interpreted by describing the influence of the inter-aggregate volume before swelling on the total volume of samples after swelling. This approach leads to a linear relation between these latter parameters. Based on these results, a description of the evolution of the microstructure due to imbibition can be proposed. Moreover, this approach enables a general quantification of the influence of initial water content and dry density on the swelling behaviour of compacted clayey soils.
A statistical test to show negligible trend
Philip M. Dixon; Joseph H.K. Pechmann
2005-01-01
The usual statistical tests of trend are inappropriate for demonstrating the absence of trend. This is because failure to reject the null hypothesis of no trend does not prove that null hypothesis. The appropriate statistical method is based on an equivalence test. The null hypothesis is that the trend is not zero, i.e., outside an a priori specified equivalence region...
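A minimal sketch of such an equivalence test for negligible trend: declare the trend negligible only when the 90% confidence interval for the slope lies entirely inside a pre-specified region (-delta, delta). The data and margin below are invented for illustration.

```python
# Equivalence test for negligible trend: reject H0 "slope outside (-delta, delta)"
# when the 90% CI for the regression slope lies inside the equivalence region.
import numpy as np
from scipy import stats

years = np.arange(1990, 2005)
counts = np.array([52, 55, 49, 53, 51, 54, 50, 52, 53, 51, 50, 54, 52, 51, 53])
delta = 0.5  # a priori negligible slope, animals per year (hypothetical)

res = stats.linregress(years, counts)
t_crit = stats.t.ppf(0.95, len(years) - 2)
lo, hi = res.slope - t_crit * res.stderr, res.slope + t_crit * res.stderr
print(f"slope 90% CI ({lo:.3f}, {hi:.3f}); negligible: {lo > -delta and hi < delta}")
```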
Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2013-01-01
A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.
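Wald's sequential probability ratio test, which underlies this approach, accumulates a log-likelihood ratio and stops at thresholds fixed by the allowed false-alarm and missed-detection risks. The generic sketch below assumes a simple Gaussian observation model, not the document's filter-bank formulation.

```python
# Wald's SPRT in miniature: stop when the accumulated log-likelihood ratio
# crosses thresholds set by alpha (false alarm) and beta (missed detection).
import numpy as np
from scipy.stats import norm

alpha, beta = 0.01, 0.01
upper = np.log((1 - beta) / alpha)   # accept H1 (e.g., "collision risk")
lower = np.log(beta / (1 - alpha))   # accept H0

rng = np.random.default_rng(1)
llr = 0.0
for i, x in enumerate(rng.normal(loc=1.0, size=1000), start=1):  # truth: H1
    llr += norm.logpdf(x, loc=1.0) - norm.logpdf(x, loc=0.0)
    if llr >= upper or llr <= lower:
        break
print(f"decided {'H1' if llr >= upper else 'H0'} after {i} samples")
```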
Demer, Joseph L.
2007-01-01
Background Late in the 20th century, it was recognized that connective tissue structures in the orbit influence the paths of the extraocular muscles and constitute their functional origins. Targeted investigations of these connective tissue "pulleys" led to the formulation of the active pulley hypothesis, which proposes that the pulling directions of the rectus extraocular muscles are actively controlled via connective tissues. Purpose This review rebuts a series of criticisms of the active pulley hypothesis published by Jampel, and by Jampel and Shi, in which these authors have disputed the existence and function of the pulleys. Methods The current paper reviews published evidence for the existence of orbital pulleys, the active pulley hypothesis, and physiologic tests of the active pulley hypothesis. Magnetic resonance imaging in a living subject and histological examination of a human cadaver directly illustrate the relationship of pulleys to extraocular muscles. Results Strong scientific evidence is cited that supports the existence of orbital pulleys and their role in ocular motility. The criticisms have ignored mathematical truisms and strong scientific evidence. Conclusions Actively controlled orbital pulleys play a fundamental role in ocular motility. Pulleys profoundly influence the neural commands required to control eye movements and binocular alignment. Familiarity with the anatomy and physiology of the pulleys is requisite for a rational approach to diagnosing and treating strabismus using emerging methods. Conversely, approaches that deny or ignore the pulleys risk the sorts of errors that arise in geography and navigation from incorrect assumptions such as those of a flat ("platygean") earth. PMID:17022164
A new approach for low-cost noninvasive detection of asymptomatic heart disease at rest.
DeMarzo, Arthur P; Calvin, James E
2007-01-01
It would be useful to have an inexpensive, noninvasive point-of-care test for early detection of asymptomatic heart disease. This study used impedance cardiography (ICG) in a new way to assess heart function that did not use stroke volume or cardiac output. There is a model of the ICG dZ/dt waveform that may be used as a template to represent normal heart function. The hypothesis was that a dZ/dt waveform which deviates from that template should indicate heart dysfunction and therefore heart disease. The objective was to assess the accuracy of this new ICG approach, using echocardiography as the standard. Thirty-four outpatients undergoing echocardiographic testing were tested by ICG while sitting upright and supine. All patients had no symptoms or history of a structural or functional heart disorder. Echocardiographic testing showed 17 patients with abnormalities and 17 as normal. ICG testing yielded 16 true positives for heart dysfunction with 1 false negative (sensitivity = 94%) and 17 true negatives with no false positives (specificity = 100%). Considering that the cost, technical skill, and time required for this ICG test are comparable to those of an electrocardiograph, this new approach has potential as a point-of-care screening test for asymptomatic heart disease.
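The reported accuracy figures follow directly from the confusion counts given in the abstract:

```python
# Sensitivity and specificity from the study's confusion counts.
tp, fn = 16, 1   # echo-abnormal patients: ICG positive / ICG negative
tn, fp = 17, 0   # echo-normal patients:  ICG negative / ICG positive

sensitivity = tp / (tp + fn)   # 16/17 ~ 0.94
specificity = tn / (tn + fp)   # 17/17 = 1.00
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```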
Integrating In Vitro, Modeling, and In Vivo Approaches to Investigate Warfarin Bioequivalence
Wen, H; Fan, J; Vince, B; Li, T; Gao, W; Kinjo, M; Brown, J; Sun, W; Jiang, W; Lionberger, R
2017-01-01
We demonstrate the use of modeling and simulation to investigate bioequivalence (BE) concerns raised about generic warfarin products. To test the hypothesis that the loss of isopropyl alcohol and slow dissolution in acidic pH have a significant impact on the pharmacokinetics of warfarin sodium tablets, we conducted physiologically based pharmacokinetic absorption modeling and simulation using formulation factors or in vitro dissolution profiles as input parameters. Sensitivity analyses indicated that warfarin pharmacokinetics was not sensitive to solubility, particle size, density, or dissolution rate in pH 4.5, but was affected by dissolution rate in pH 6.8 and potency. Virtual BE studies suggested that stressed warfarin sodium tablets with a slow dissolution rate in pH 4.5 but a similar dissolution rate in pH 6.8 would be bioequivalent to the unstressed warfarin sodium tablets. A four-way, crossover, single-dose BE study in healthy subjects was conducted to test the same hypothesis and confirmed the simulation conclusion. PMID:28379643
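A deliberately minimal dissolution-limited absorption sketch (not the authors' PBPK model) illustrates the mechanism at issue: a slower dissolution rate reshapes the simulated plasma profile. All rate constants are hypothetical.

```python
# Minimal compartment chain: solid --k_diss--> dissolved --k_a--> plasma
# --k_e--> eliminated. Shows how dissolution rate shifts Cmax and Tmax.
import numpy as np
from scipy.integrate import odeint

def model(y, t, k_diss, k_a, k_e):
    solid, gut, plasma = y
    return [-k_diss * solid,
            k_diss * solid - k_a * gut,
            k_a * gut - k_e * plasma]

t = np.linspace(0, 48, 200)  # hours
for k_diss in (2.0, 0.2):    # fast vs slow dissolution (1/h), hypothetical
    sol = odeint(model, [1.0, 0.0, 0.0], t, args=(k_diss, 1.0, 0.1))
    cmax, tmax = sol[:, 2].max(), t[sol[:, 2].argmax()]
    print(f"k_diss={k_diss}: Cmax={cmax:.3f} (arb.), Tmax={tmax:.1f} h")
```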
An experimental approach in revisiting the magnetic orientation of cattle.
Weijers, Debby; Hemerik, Lia; Heitkönig, Ignas M A
2018-01-01
In response to the increasing number of observational studies on an apparent south-north orientation in non-homing, non-migrating terrestrial mammals, we experimentally tested the alignment hypothesis by fixing strong neodymium magnets to the collars of individual cattle in Portugal and observing their resting orientation. Contrary to the hypothesis, the 34 cows in the experiment showed no directional preference, either with or without a strong neodymium magnet fixed to their collar. The concurrently performed 2,428 daytime observations (excluding the hottest part of the day) of 659 resting individual cattle did not show a south-north alignment at rest either. The preferred compass orientation of these cows was on average 130 degrees from magnetic north (i.e., southeast). Cow compass orientation correlated significantly with sun direction, but not with wind direction. As far as we can determine, this is the first experimental test of magnetic orientation in larger, non-homing, non-migrating mammals. These experimental and observational findings do not support previously published suggestions of magnetic south-north alignment in these mammals.
Mourning dove hunting regulation strategy based on annual harvest statistics and banding data
Otis, D.L.
2006-01-01
Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for making hunting regulation decisions based on the population growth rates they imply. I present a statistically rigorous approach for regulation decision-making using a hypothesis-testing framework and an assumed framework of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
Molecular profiles of finasteride effects on prostate carcinogenesis.
Li, Jin; Kim, Jeri
2009-06-01
Our inability to distinguish between low-grade prostate cancers that pose no threat and those that can kill compels newly diagnosed early prostate cancer patients to make decisions that may negatively affect their lives needlessly for years afterward. To reliably stratify patients into different risk categories and apply appropriate treatment, we need a better molecular understanding of prostate cancer progression. Androgen ablation therapy and 5-alpha reductase inhibitors reduce dihydrotestosterone levels and increase apoptosis. Because of the differing biological potentials of tumor cells, however, these treatments may, in some cases, worsen outcome by selecting for or inducing adaptation of stronger androgen receptor signaling pathways. Reduced dihydrotestosterone also may be associated with altered survival pathways. Complicating treatment effects further, molecular adaptation may be accelerated by interactions between epithelial and stromal cells. The hypothesis that early prostate cancer cells with differing biological potential may respond differently to finasteride treatment is worth testing. Ongoing studies using a systems biology approach in a preoperative prostate cancer setting are testing this hypothesis toward developing more-rational clinical interventions.
Pata, Ugur Korkut
2018-03-01
This paper examines the dynamic short- and long-term relationship between per capita GDP, per capita energy consumption, financial development, urbanization, industrialization, and per capita carbon dioxide (CO2) emissions within the framework of the environmental Kuznets curve (EKC) hypothesis for Turkey, covering the period from 1974 to 2013. According to the results of the autoregressive distributed lag bounds testing approach, an increase in per capita GDP, per capita energy consumption, financial development, urbanization, and industrialization has a positive effect on per capita CO2 emissions in the long term, and the variables other than urbanization also increase per capita CO2 emissions in the short term. In addition, the findings support the validity of the EKC hypothesis for Turkey in the short and long term. However, the turning points obtained from the long-term regressions lie outside the sample period. Therefore, as per capita GDP increases in Turkey, per capita CO2 emissions continue to increase.
Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.
Vetter, Thomas R; Mascha, Edward J
2018-01-01
Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including their so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal, against the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing independent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) pairwise intergroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction, and post hoc comparisons, as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate; the Wilcoxon-Mann-Whitney test is instead preferred. When making paired comparisons on data that are ordinal, or continuous but nonnormally distributed, the Wilcoxon signed-rank test can be used. In analyzing their data, researchers should consider the continued merits of these simple yet equally valid unadjusted bivariate statistical tests. However, the appropriate use of an unadjusted bivariate test still requires a solid understanding of its utility, assumptions (requirements), and limitations. This understanding will mitigate the risk of misleading findings, interpretations, and conclusions.
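As an illustration of the scenario-to-test mapping described above, the sketch below pairs each situation with a common SciPy call on synthetic data. It is a hedged example only, not a substitute for verifying each test's assumptions on real data.

```python
# Illustrative mapping of the tutorial's scenarios to SciPy calls,
# using synthetic data throughout.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = rng.normal(0, 1, 30), rng.normal(0.5, 1, 30)

# Unpaired t test: two independent groups, continuous outcome.
t, p = stats.ttest_ind(a, b)

# Paired t test: before/after measurements on the same subjects.
t, p = stats.ttest_rel(a, a + rng.normal(0.2, 0.5, 30))

# Pearson chi-square: association between two unpaired categorical variables.
table = np.array([[20, 10], [12, 18]])
chi2, p, dof, expected = stats.chi2_contingency(table)

# Wilcoxon-Mann-Whitney: two groups, ordinal or non-normal outcome.
u, p = stats.mannwhitneyu(a, b)

# Wilcoxon signed-rank: paired, ordinal or non-normal data.
w, p = stats.wilcoxon(a, a + rng.normal(0.2, 0.5, 30))
```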
NASA Astrophysics Data System (ADS)
Kartikasari, A.; Widjajanti, D. B.
2017-02-01
The aim of this study is to explore the effectiveness of a problem-based learning approach based on multiple intelligences in developing students' achievement, mathematical connection ability, and self-esteem. This is an experimental study whose sample was 30 Grade X students of MIA III MAN Yogyakarta III. The learning materials implemented consisted of trigonometry and geometry. For the purpose of this study, the researchers designed an achievement test made up of 44 multiple-choice questions, 24 on trigonometry and 20 on geometry. The researchers also designed a mathematical connection test consisting of 7 essay questions and a 30-item self-esteem questionnaire. The learning approach was said to be effective if the proportion of students who achieved KKM on the achievement test and the proportions of students who achieved at least a high-category score on the mathematical connection test and the self-esteem questionnaire were each greater than or equal to 70%. Based on hypothesis testing at the significance level of 5%, it can be concluded that the learning approach using problem-based learning based on multiple intelligences was effective in terms of students' achievement, mathematical connection ability, and self-esteem.
Longitudinal Dimensionality of Adolescent Psychopathology: Testing the Differentiation Hypothesis
ERIC Educational Resources Information Center
Sterba, Sonya K.; Copeland, William; Egger, Helen L.; Costello, E. Jane; Erkanli, Alaattin; Angold, Adrian
2010-01-01
Background: The differentiation hypothesis posits that the underlying liability distribution for psychopathology is of low dimensionality in young children, inflating diagnostic comorbidity rates, but increases in dimensionality with age as latent syndromes become less correlated. This hypothesis has not been adequately tested with longitudinal…
Machine learning search for variable stars
NASA Astrophysics Data System (ADS)
Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis
2018-04-01
Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
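A minimal sketch of the classification setup described above, with synthetic stand-ins for the 18 variability indices and a single illustrative classifier; the paper's actual OGLE-II features, labels, and hyperparameter tuning are not reproduced here.

```python
# Variability detection as supervised classification: variability indices
# are features, known variables are labels. Data are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 18))  # 18 variability indices per light curve
# Synthetic "variable" flag loosely tied to two of the indices.
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=1000)) > 1.5

clf = make_pipeline(StandardScaler(), RandomForestClassifier(n_estimators=300))
print(cross_val_score(clf, X, y, cv=5, scoring="f1").mean())
```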
Null but not void: considerations for hypothesis testing.
Shaw, Pamela A; Proschan, Michael A
2013-01-30
Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.
Effect of climate-related mass extinctions on escalation in molluscs
NASA Astrophysics Data System (ADS)
Hansen, Thor A.; Kelley, Patricia H.; Melland, Vicky D.; Graham, Scott E.
1999-12-01
We test the hypothesis that escalated species (e.g., those with antipredatory adaptations such as heavy armor) are more vulnerable to extinctions caused by changes in climate. If this hypothesis is valid, recovery faunas after climate-related extinctions should include significantly fewer species with escalated shell characteristics, and escalated species should undergo greater rates of extinction than nonescalated species. This hypothesis is tested for the Cretaceous-Paleocene, Eocene-Oligocene, middle Miocene, and Pliocene-Pleistocene mass extinctions. Gastropod and bivalve molluscs from the U.S. coastal plain were evaluated for 10 shell characters that confer resistance to predators. Of 40 tests, one supported the hypothesis; highly ornamented gastropods underwent greater levels of Pliocene-Pleistocene extinction than did nonescalated species. All remaining tests were nonsignificant. The hypothesis that escalated species are more vulnerable to climate-related mass extinctions is not supported.
On Restructurable Control System Theory
NASA Technical Reports Server (NTRS)
Athans, M.
1983-01-01
The state of stochastic system and control theory as it impacts restructurable control issues is addressed. The multivariable characteristics of the control problem are addressed. The failure detection/identification problem is discussed as a multi-hypothesis testing problem. Control strategy reconfiguration, static multivariable controls, static failure hypothesis testing, dynamic multivariable controls, fault-tolerant control theory, dynamic hypothesis testing, generalized likelihood ratio (GLR) methods, and adaptive control are discussed.
Zhang, Limei
2016-06-01
This study reports on the relationships between test takers' individual differences and their performance on a reading comprehension test. A total of 518 Chinese college students (252 women and 256 men; mean age = 19.26 years, SD = 0.98) answered a questionnaire and sat for a reading comprehension test. The study found that test takers' L2 language proficiency was closely linked to their test performance. Test takers' employment of strategies was significantly and positively associated with their performance on the test. Test takers' motivation was found to be significantly associated with reading test performance. Test anxiety was negatively related to their use of reading strategies and test performance. The results of the study lent support to the threshold hypothesis of language proficiency. Implications for classroom teaching are provided. © The Author(s) 2016.
Revised standards for statistical evidence.
Johnson, Valen E
2013-11-26
Recent advances in Bayesian hypothesis testing have led to the development of uniformly most powerful Bayesian tests, which represent an objective, default class of Bayesian hypothesis tests that have the same rejection regions as classical significance tests. Based on the correspondence between these two classes of tests, it is possible to equate the size of classical hypothesis tests with evidence thresholds in Bayesian tests, and to equate P values with Bayes factors. An examination of these connections suggests that recent concerns over the lack of reproducibility of scientific studies can be attributed largely to the conduct of significance tests at unjustifiably high levels of significance. To correct this problem, evidence thresholds required for the declaration of a significant finding should be increased to 25-50:1, and to 100-200:1 for the declaration of a highly significant finding. In terms of classical hypothesis tests, these evidence standards mandate the conduct of tests at the 0.005 or 0.001 level of significance.
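The order of magnitude of these thresholds can be reproduced with a simple, well-known bound rather than the full uniformly most powerful Bayesian test derivation: for a one-sided normal z-test, the Bayes factor maximized over point alternatives equals exp(z²/2). A minimal sketch, offered as an illustration only:

```python
# Illustration (not Johnson's full UMPBT derivation): for a one-sided normal
# z-test, the Bayes factor against the null, maximized over point
# alternatives, is exp(z**2 / 2). Evaluating it at classical cutoffs shows
# why alpha = 0.005 and 0.001 land near 25-50:1 and 100-200:1 evidence.
from math import exp
from scipy.stats import norm

for alpha in (0.05, 0.005, 0.001):
    z = norm.ppf(1 - alpha)  # one-sided rejection threshold
    print(f"alpha={alpha}: max Bayes factor ~ {exp(z**2 / 2):.1f}:1")
# alpha=0.05  -> ~ 3.9:1
# alpha=0.005 -> ~ 27.6:1
# alpha=0.001 -> ~ 118:1
```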
Anthony, Christopher J; DiPerna, James Clyde; Amato, Paul R
2014-06-01
Data from the Early Childhood Longitudinal Study--Kindergarten Cohort (ECLS-K) were used to test the hypothesis that approaches to learning (ATL) mediates the link between parental divorce and academic achievement. Fixed effects regression was utilized to test for mediation, and subsequent moderation analyses examining gender and age at time of divorce also were conducted. Results indicated that divorce was associated with less growth in test scores and that ATL mediated 18% and 12% of this association in reading and mathematics respectively. Parental divorce also was associated with larger negative effects for children who experienced divorce at an older age as well as for girls' mathematics test scores. These findings contribute to the understanding of the impact of parental divorce on children's academic achievement and underscore the importance of focusing on the variability of child outcomes following parental divorce. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
Gonçalves, Joana; Violante, Inês R; Sereno, José; Leitão, Ricardo A; Cai, Ying; Abrunhosa, Antero; Silva, Ana Paula; Silva, Alcino J; Castelo-Branco, Miguel
2017-01-01
Excitation/inhibition (E/I) imbalance remains a widely discussed hypothesis in autism spectrum disorders (ASD). The presence of such an imbalance may potentially define a therapeutic target for the treatment of cognitive disabilities related to this pathology. Consequently, the study of monogenic disorders related to autism, such as neurofibromatosis type 1 (NF1), represents a promising approach to isolate mechanisms underlying ASD-related cognitive disabilities. However, the NF1 mouse model showed increased γ-aminobutyric acid (GABA) neurotransmission, whereas the human disease showed reduced cortical GABA levels. It is therefore important to clarify whether the E/I imbalance hypothesis holds true. We hypothesize that E/I may depend on distinct pre- and postsynaptic push-pull mechanisms that might be region-dependent. In the current study, we assessed two critical components of E/I regulation: the concentration of neurotransmitters and the levels of GABA(A) receptors. Measurements were performed across the hippocampi, striatum, and prefrontal cortices by combined in vivo magnetic resonance spectroscopy (MRS) and molecular approaches in this ASD-related animal model, the Nf1+/- mouse. Cortical and striatal GABA/glutamate ratios were increased. At the postsynaptic level, very high GABA(A) receptor expression was found in the hippocampus, disproportionate to the small reduction in GABA levels. GABAergic tone (whether through receptor levels or GABA/glutamate ratios) therefore seemed to be enhanced in all regions, although by different mechanisms. Our data provide support for the hypothesis of E/I imbalance in NF1 while showing that pre- and postsynaptic changes are region-specific. All these findings are consistent with our previous physiological evidence of increased inhibitory tone. Such heterogeneity suggests that therapeutic approaches to address neurochemical imbalance in ASD may need to focus on targets where convergent physiological mechanisms can be found.
Influence of Solutocapillary Convection on Macrovoid Defect Formation in Polymeric Membranes
NASA Technical Reports Server (NTRS)
Greenberg, Alan R.; Krantz, William B.; Todd, Paul
2003-01-01
The focus of this research project involved the dry-cast process for polymeric membrane formation, whereby evaporation of solvent from an initially homogeneous polymer/solvent/ nonsolvent solution results in phase separation and the formation of polymer-rich and polymer-lean phases. Under certain conditions the polymer-lean phase gives rise to very large and usually undesirable, tear-drop-shaped pores (size approx. 10 - 50 microns) termed macrovoids (MVs). Although in many cases the presence of MV pores has deleterious effects on membrane performance, there are a number of innovative applications where the presence of such pores is highly desirable. Although researchers have proposed a variety of mechanisms for MV formation over the past three decades, two main hypotheses are currently favored: one asserts that MV growth can be attributed solely to diffusion (the diffusive growth hypothesis), whereas the other states that solutocapillary convection (the SC hypothesis) at the MV interface contributes to growth. The overall goal of this research was to obtain a more comprehensive understanding of the fundamental mechanism of MV growth. This research incorporates a coupled modeling and experimental approach to test a solutocapillary convection hypothesis for the growth of macrovoid pores in polymeric membranes. Specifically, we utilized a modification of the first principles model developed by two of the PIs (ARG and WBK) for dry-cast CA membranes. For the experimental component, two separate and mutually complementary approaches were used to study MV growth. In the first, membranes cast in a zero-g environment aboard the NASA KC-135 aircraft were compared with those cast on the ground to assess the effect of the buoyancy force on membrane morphology and MV size and shape. In the second approach, videomicroscopy flow visualization (VMFV) was utilized to observe MV formation and growth in real time and to assess the effect of surface tension on the MV growth dynamics. As a result of these fundamental studies, our research group advanced a new hypothesis for MV pore development in polymeric membranes.
Proficiency Testing for Evaluating Aerospace Materials Test Anomalies
NASA Technical Reports Server (NTRS)
Hirsch, D.; Motto, S.; Peyton, S.; Beeson, H.
2006-01-01
ASTM G 86 and ASTM G 74 are commonly used to evaluate the susceptibility of materials to ignition in liquid and gaseous oxygen systems. However, the methods have been known for their lack of repeatability. The inherent problems identified with the test logic would not allow precise identification of either the presence or the magnitude of problems related to running the tests, such as lack of consistency of systems performance, lack of adherence to procedures, etc. Excessive variability leads to increasing instances of accepting the null hypothesis erroneously, and so to the false logical deduction that problems are nonexistent when they really do exist. This paper attempts to develop and recommend an approach that could lead to increased accuracy in problem diagnostics by using the 50% reactivity point, which has been shown to be more repeatable. The initial tests conducted indicate that PTFE and Viton A (for pneumatic impact) and Buna S (for mechanical impact) would be good choices for additional testing and consideration for inter-laboratory evaluations. The approach presented could also be used to evaluate variable effects with increased confidence and tolerance optimization.
A Unified Mixed-Effects Model for Rare-Variant Association in Sequencing Studies
Sun, Jianping; Zheng, Yingye; Hsu, Li
2013-01-01
For rare-variant association analysis, due to the extremely low frequencies of these variants, it is necessary to aggregate them by a prior set (e.g., genes and pathways) in order to achieve adequate power. In this paper, we consider hierarchical models to relate a set of rare variants to phenotype by modeling the effects of variants as a function of variant characteristics while allowing for variant-specific effects (heterogeneity). We derive a set of two score statistics, testing the group effect by variant characteristics and the heterogeneity effect. We make a novel modification to these score statistics so that they are independent under the null hypothesis and their asymptotic distributions can be derived. As a result, the computational burden is greatly reduced compared with permutation-based tests. Our approach provides a general testing framework for rare-variant association, which includes many commonly used tests, such as the burden test [Li and Leal, 2008] and the sequence kernel association test [Wu et al., 2011], as special cases. Furthermore, in contrast to these tests, our proposed test has an added capacity to identify which components of variant characteristics and heterogeneity contribute to the association. Simulations under a wide range of scenarios show that the proposed test is valid, robust, and powerful. An application to the Dallas Heart Study illustrates that apart from identifying genes with significant associations, the new method also provides additional information regarding the source of the association. Such information may be useful for generating hypotheses in future studies. PMID:23483651
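The simplest special case named above, the burden test, can be sketched as follows. This is not the authors' unified mixed-effects score statistic; the genotypes and phenotypes are simulated.

```python
# Burden test sketch: collapse the rare variants in a gene-level set into
# one count per subject, then test that count's aggregate effect. Not the
# paper's score statistics; data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, m = 2000, 25                            # subjects, rare variants in the set
G = rng.binomial(2, 0.005, size=(n, m))    # rare-variant genotype matrix
burden = G.sum(axis=1)                     # collapse: count of rare alleles
y = 0.4 * burden + rng.normal(size=n)      # quantitative phenotype

model = sm.OLS(y, sm.add_constant(burden)).fit()
print(model.pvalues[1])                    # H0: no aggregate variant effect
```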
NASA Astrophysics Data System (ADS)
Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.
2011-01-01
Annual maximum (AM) time series are incomplete (i.e., censored) when no events are included above the assumed censoring threshold (i.e., magnitude of completeness). We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte-Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
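A minimal sketch of how Monte-Carlo critical values for a Gumbel PPCC test can be generated, shown for the uncensored case with Gringorten plotting positions; the paper's left-censored variant and its exact plotting positions may differ.

```python
# PPCC test for the Gumbel distribution, uncensored sketch: correlate sorted
# data with theoretical Gumbel quantiles, then simulate the null distribution
# of that correlation to obtain a critical value.
import numpy as np
from scipy.stats import gumbel_r

def ppcc_gumbel(x):
    x = np.sort(x)
    n = len(x)
    p = (np.arange(1, n + 1) - 0.44) / (n + 0.12)  # Gringorten positions
    return np.corrcoef(x, gumbel_r.ppf(p))[0, 1]

def critical_value(n, alpha=0.05, nsim=5000, seed=3):
    rng = np.random.default_rng(seed)
    r = [ppcc_gumbel(gumbel_r.rvs(size=n, random_state=rng))
         for _ in range(nsim)]
    return np.quantile(r, alpha)  # reject Gumbel if observed PPCC is lower

print(critical_value(n=50))
```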
Defense Small Business Innovation Research Program (SBIR) Abstracts of Phase 1 Awards 1983.
1984-04-06
STRATEGY, THE INTERPLAY BETWEEN ELECTROMAGNETIC EMISSION CONTROL AND FLEET OPERATION. THE TECHNICAL APPROACH IS BASED ON AN ANALYSIS OF EMCON...THEM AND A BOTTOM UP APPROACH. THE REQUIREMENTS AND ARCHITECTURAL ASPECTS WILL BE EXPLORED FROM THE MORE ENCOMPASSING PERSPECTIVE OF THE TOTAL...AN AI APPROACH TO INFORMATION FUSION INCLUDING KNOWLEDGE ORGANIZATION, HYPOTHESIS REPRESENTATIVES, DOMAIN KNOWLEDGE REPRESENTATION, HYPOTHESIS
NASA Astrophysics Data System (ADS)
Soulis, K. X.; Valiantzas, J. D.
2011-10-01
The Soil Conservation Service Curve Number (SCS-CN) approach is widely used as a simple method for predicting direct runoff volume for a given rainfall event. CN values can be selected from tables; however, it is more accurate to estimate the CN value from measured rainfall-runoff data (when available) in a watershed. Previous researchers indicated that the CN values calculated from measured rainfall-runoff data vary systematically with the rainfall depth. They suggested the determination of a single asymptotic CN value observed for very high rainfall depths to characterize the watershed's runoff response. In this paper, we test the novel hypothesis that the observed correlation between the calculated CN value and the rainfall depth in a watershed reflects the effect of the inevitable presence of soil-cover complex spatial variability along watersheds. Based on this hypothesis, the simplified concept of a two-CN heterogeneous system is introduced to model the observed CN-rainfall variation by reducing the CN spatial variability into two classes. The behavior of the CN-rainfall function produced by the proposed two-CN system concept is approached theoretically, analyzed systematically, and found to be similar to the variation observed in natural watersheds. Synthetic data tests, natural watershed examples, and a detailed study of two natural experimental watersheds with known spatial heterogeneity characteristics were used to evaluate the method. The results indicate that the determination of CN values from rainfall-runoff data using the proposed two-CN system approach provides reasonable accuracy, and it outperforms the original method based on the determination of a single asymptotic CN value. Although the suggested method increases the number of unknown parameters to three (instead of one), a clear physical reasoning for them is presented.
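The effect described, a back-calculated CN that varies systematically with rainfall depth when the watershed actually mixes two CN classes, can be sketched with the standard SCS-CN equations. The CN values and area weights below are illustrative, and the authors' fitting procedure is not reproduced.

```python
# Standard SCS-CN runoff equation plus a two-CN composite watershed,
# showing how spatial CN heterogeneity makes the single back-calculated CN
# drift with rainfall depth P. CN values and weights are illustrative.
import numpy as np

def runoff(P, CN):
    """Direct runoff depth (mm) for rainfall P (mm), Ia = 0.2*S convention."""
    S = 25400.0 / CN - 254.0
    Ia = 0.2 * S
    return np.where(P > Ia, (P - Ia) ** 2 / (P - Ia + S), 0.0)

def cn_from_runoff(P, Q):
    """Back-calculate the single equivalent CN from observed (P, Q)."""
    S = 5.0 * (P + 2.0 * Q - np.sqrt(4.0 * Q**2 + 5.0 * P * Q))
    return 25400.0 / (S + 254.0)

P = np.linspace(10, 300, 8)
Q = 0.6 * runoff(P, 85) + 0.4 * runoff(P, 55)  # two-CN system, area weights
print(np.round(cn_from_runoff(P, Q), 1))       # equivalent CN varies with P
```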
Staitieh, Bashar S; Saghafi, Ramin; Kempker, Jordan A; Schulman, David A
2016-04-01
Hypothesis-driven physical examination emphasizes the role of bedside examination in the refinement of differential diagnoses and improves diagnostic acumen. This approach has not yet been investigated as a tool to improve the ability of higher-level trainees to teach medical students. To assess the effect of teaching hypothesis-driven physical diagnosis to pulmonary fellows on their ability to improve the pulmonary examination skills of first-year medical students. Fellows and students were assessed on teaching and diagnostic skills by self-rating on a Likert scale. One group of fellows received the hypothesis-driven teaching curriculum (the "intervention" group) and another received instruction on head-to-toe examination. Both groups subsequently taught physical diagnosis to a group of first-year medical students. An oral examination was administered to all students after completion of the course. Fellows were comfortable teaching physical diagnosis to students. Students in both groups reported a lack of comfort with the pulmonary examination at the beginning of the course and improvement in their comfort by the end. Students trained by intervention group fellows outperformed students trained by control group fellows in the interpretation of physical findings (P < 0.05). Teaching hypothesis-driven physical examination to higher-level trainees who teach medical students improves the ability of students to interpret physical findings. This benefit should be confirmed using validated testing tools.
NASA Astrophysics Data System (ADS)
Vistarakula, Krishna; Bergin, Mike; Hu, David
2010-11-01
Nearly every mammalian and avian eye is rimmed with lashes. We investigate experimentally the ability of lashes to reduce airborne particle deposition in the eye. We hypothesize that there is an optimum eyelash length that maximizes both filtration ability and extent of peripheral vision. This hypothesis is tested using a dual approach. Using preserved heads from 36 species of animals at the American Museum of Natural History, we determine the relationship between eye size and eyelash geometry (length and spacing). We test the filtration efficacy of these geometries by deploying outdoor manikins and measuring particle deposition rate as a function of eyelash length.
Gamma Strength Functions and Level Densities from 300 MeV Proton Scattering at 0°
NASA Astrophysics Data System (ADS)
von Neumann-Cosel, Peter; Bassauer, Sergej; Martin, Dirk
The gamma strength function (GSF) as well as total level densities (LDs) in 208Pb and 96Mo were extracted from high-resolution forward angle inelastic proton scattering data taken at RCNP, Osaka, Japan, and compared to experimental results obtained with the Oslo method in order to test the validity of the Brink-Axel (BA) hypothesis in the energy region of the pygmy dipole resonance. The case of 208Pb is inconclusive because of strong fluctuations of the GSF due to the small level density in a doubly closed-shell nucleus. In 96Mo the data are consistent with the BA hypothesis. The good agreement of LDs provides an independent confirmation of the approach underlying the decomposition of GSF and LDs in Oslo-type experiments.
From Shattered Assumptions to Weakened Worldviews: Trauma Symptoms Signal Anxiety Buffer Disruption
Edmondson, Donald; Chaudoir, Stephenie R.; Mills, Mary Alice; Park, Crystal L.; Holub, Julie; Bartkowiak, Jennifer M.
2013-01-01
The fundamental assertion of worldview-based models of posttraumatic stress disorder is that trauma symptoms result when traumatic experiences cannot be readily assimilated into previously held worldviews. In two studies, we test the anxiety buffer disruption hypothesis, which states that trauma symptoms result from the disruption of normal death anxiety-buffering functions of worldview. In Study 1, participants with trauma symptoms greater than the cutoff for PTSD evinced greater death-thought accessibility than those with sub-clinical or negligible symptoms after a reminder of death. In Study 2, participants with clinically significant trauma symptoms showed no evidence of worldview defense though death-thoughts were accessible. These results support the anxiety buffer disruption hypothesis, and suggest an entirely new approach to experimental PTSD research. PMID:24077677
Nuzzo, F; Zei, G; Stefanini, M; Colognola, R; Santachiara, A S; Lagomarsini, P; Marinoni, S; Salvaneschi, L
1990-01-01
The association of two rare hereditary disorders, trichothiodystrophy (TTD) and xeroderma pigmentosum (XP), was found in four patients from three families, apparently unrelated but living in the same geographical area. In order to test the hypothesis of a common ancestor, consanguinity within and among the families was checked using three different approaches: reconstruction of genealogical trees, typing of blood markers, and surname analysis. The results of the three types of analyses strengthen the hypothesis that, in at least two out of the three families, the genetic defect determining the TTD/XP phenotype is identical by descent, as a consequence of remote inbreeding. This implies that if two mutations are responsible for the two diseases they are at linked loci or affect the same gene. PMID:2308151
On the Interpretation and Use of Mediation: Multiple Perspectives on Mediation Analysis.
Agler, Robert; De Boeck, Paul
2017-01-01
Mediation analysis has become a very popular approach in psychology, and it is one that is associated with multiple perspectives that are often at odds, often implicitly. Explicitly discussing these perspectives and their motivations, advantages, and disadvantages can help to provide clarity to conversations and research regarding the use and refinement of mediation models. We discuss five such pairs of perspectives on mediation analysis, their associated advantages and disadvantages, and their implications: with vs. without a mediation hypothesis, specific effects vs. a global model, directness vs. indirectness of causation, effect size vs. null hypothesis testing, and hypothesized vs. alternative explanations. Discussion of the perspectives is facilitated by a small simulation study. Some philosophical and linguistic considerations are briefly discussed, as well as some other perspectives we do not develop here.
Saraf, Sanatan; Mathew, Thomas; Roy, Anindya
2015-01-01
For the statistical validation of surrogate endpoints, an alternative formulation is proposed for testing Prentice's fourth criterion, under a bivariate normal model. In such a setup, the criterion involves inference concerning an appropriate regression parameter, and the criterion holds if the regression parameter is zero. Testing such a null hypothesis has been criticized in the literature since it can only be used to reject a poor surrogate, and not to validate a good surrogate. In order to circumvent this, an equivalence hypothesis is formulated for the regression parameter, namely the hypothesis that the parameter is equivalent to zero. Such an equivalence hypothesis is formulated as an alternative hypothesis, so that the surrogate endpoint is statistically validated when the null hypothesis is rejected. Confidence intervals for the regression parameter and tests for the equivalence hypothesis are proposed using bootstrap methods and small sample asymptotics, and their performances are numerically evaluated and recommendations are made. The choice of the equivalence margin is a regulatory issue that needs to be addressed. The proposed equivalence testing formulation is also adopted for other parameters that have been proposed in the literature on surrogate endpoint validation, namely, the relative effect and proportion explained.
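The general idea can be sketched with a bootstrap equivalence check (in the spirit of two one-sided tests) that the regression slope lies within a margin of zero; the margin delta is hypothetical, and the authors' small-sample asymptotics are not reproduced here.

```python
# Equivalence-to-zero sketch for a regression slope: declare equivalence
# when a bootstrap confidence interval lies entirely inside (-delta, delta).
# Data are simulated; delta is a hypothetical equivalence margin.
import numpy as np

rng = np.random.default_rng(4)
n = 200
x = rng.normal(size=n)
y = 0.02 * x + rng.normal(scale=0.3, size=n)

def slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

boot = np.array([
    slope(x[idx], y[idx])
    for idx in rng.integers(0, n, size=(5000, n))
])
lo, hi = np.quantile(boot, [0.05, 0.95])  # 90% CI for a 5%-level test
delta = 0.1                               # hypothetical margin
print("equivalent to zero" if -delta < lo and hi < delta else "not shown")
```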
Test of association: which one is the most appropriate for my study?
Gonzalez-Chica, David Alejandro; Bastos, João Luiz; Duquia, Rodrigo Pereira; Bonamigo, Renan Rangel; Martínez-Mesa, Jeovany
2015-01-01
Hypothesis tests are statistical tools widely used for assessing whether or not there is an association between two or more variables. These tests provide a p-value, the probability of a type 1 error, which is used to reject, or fail to reject, the null study hypothesis. Our aim is to provide a practical guide to help researchers carefully select the most appropriate procedure to answer their research question. We discuss the logic of hypothesis testing and present the prerequisites of each procedure based on practical examples.
Tollefsen, Knut Erik; Scholz, Stefan; Cronin, Mark T; Edwards, Stephen W; de Knecht, Joop; Crofton, Kevin; Garcia-Reyero, Natalia; Hartung, Thomas; Worth, Andrew; Patlewicz, Grace
2014-12-01
Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource intensive in terms of time, money and animal use to evaluate all chemicals under development or already on the market. The need for timely and robust decision making demands that regulatory toxicity testing becomes more cost-effective and efficient. One way to realize this goal is by being more strategic in directing testing resources; focusing on chemicals of highest concern, limiting testing to the most probable hazards, or targeting the most vulnerable species. Hypothesis-driven Integrated Approaches to Testing and Assessment (IATA) have been proposed as practical solutions to such strategic testing. In parallel, the development of the Adverse Outcome Pathway (AOP) framework, which provides information on the causal links between a molecular initiating event (MIE), intermediate key events (KEs) and an adverse outcome (AO) of regulatory concern, offers the biological context to facilitate development of IATA for regulatory decision making. This manuscript summarizes discussions at the Workshop entitled "Advancing AOPs for Integrated Toxicology and Regulatory Applications" with particular focus on the role AOPs play in informing the development of IATA for different regulatory purposes. Copyright © 2014 Elsevier Inc. All rights reserved.
“Positive” Results Increase Down the Hierarchy of the Sciences
Fanelli, Daniele
2010-01-01
The hypothesis of a Hierarchy of the Sciences with physical sciences at the top, social sciences at the bottom, and biological sciences in-between is nearly 200 years old. This order is intuitive and reflected in many features of academic life, but whether it reflects the “hardness” of scientific research—i.e., the extent to which research questions and results are determined by data and theories as opposed to non-cognitive factors—is controversial. This study analysed 2434 papers published in all disciplines and that declared to have tested a hypothesis. It was determined how many papers reported a “positive” (full or partial) or “negative” support for the tested hypothesis. If the hierarchy hypothesis is correct, then researchers in “softer” sciences should have fewer constraints to their conscious and unconscious biases, and therefore report more positive outcomes. Results confirmed the predictions at all levels considered: discipline, domain and methodology broadly defined. Controlling for observed differences between pure and applied disciplines, and between papers testing one or several hypotheses, the odds of reporting a positive result were around 5 times higher among papers in the disciplines of Psychology and Psychiatry and Economics and Business compared to Space Science, 2.3 times higher in the domain of social sciences compared to the physical sciences, and 3.4 times higher in studies applying behavioural and social methodologies on people compared to physical and chemical studies on non-biological material. In all comparisons, biological studies had intermediate values. These results suggest that the nature of hypotheses tested and the logical and methodological rigour employed to test them vary systematically across disciplines and fields, depending on the complexity of the subject matter and possibly other factors (e.g., a field's level of historical and/or intellectual development). On the other hand, these results support the scientific status of the social sciences against claims that they are completely subjective, by showing that, when they adopt a scientific approach to discovery, they differ from the natural sciences only by a matter of degree. PMID:20383332
"Positive" results increase down the Hierarchy of the Sciences.
Fanelli, Daniele
2010-04-07
The hypothesis of a Hierarchy of the Sciences with physical sciences at the top, social sciences at the bottom, and biological sciences in-between is nearly 200 years old. This order is intuitive and reflected in many features of academic life, but whether it reflects the "hardness" of scientific research--i.e., the extent to which research questions and results are determined by data and theories as opposed to non-cognitive factors--is controversial. This study analysed 2434 papers published in all disciplines and that declared to have tested a hypothesis. It was determined how many papers reported a "positive" (full or partial) or "negative" support for the tested hypothesis. If the hierarchy hypothesis is correct, then researchers in "softer" sciences should have fewer constraints to their conscious and unconscious biases, and therefore report more positive outcomes. Results confirmed the predictions at all levels considered: discipline, domain and methodology broadly defined. Controlling for observed differences between pure and applied disciplines, and between papers testing one or several hypotheses, the odds of reporting a positive result were around 5 times higher among papers in the disciplines of Psychology and Psychiatry and Economics and Business compared to Space Science, 2.3 times higher in the domain of social sciences compared to the physical sciences, and 3.4 times higher in studies applying behavioural and social methodologies on people compared to physical and chemical studies on non-biological material. In all comparisons, biological studies had intermediate values. These results suggest that the nature of hypotheses tested and the logical and methodological rigour employed to test them vary systematically across disciplines and fields, depending on the complexity of the subject matter and possibly other factors (e.g., a field's level of historical and/or intellectual development). On the other hand, these results support the scientific status of the social sciences against claims that they are completely subjective, by showing that, when they adopt a scientific approach to discovery, they differ from the natural sciences only by a matter of degree.
A Galilean Approach to the Galileo Affair, 1609-2009
ERIC Educational Resources Information Center
Finocchiaro, Maurice A.
2011-01-01
Galileo's telescopic discoveries of 1609-1612 provided a crucial, although not conclusive, confirmation of the Copernican hypothesis of the earth's motion. In Galileo's approach, the Copernican Revolution required that the geokinetic hypothesis be supported not only with new theoretical arguments but also with new observational evidence; that it…
Test of a hypothesis of realism in quantum theory using a Bayesian approach
NASA Astrophysics Data System (ADS)
Nikitin, N.; Toms, K.
2017-05-01
In this paper we propose a time-independent equality and a time-dependent inequality, suitable for an experimental test of the hypothesis of realism. The derivation of these relations is based on the concept of conditional probability and on Bayes' theorem in the framework of Kolmogorov's axiomatics of probability theory. The equality obtained is intrinsically different from the well-known Greenberger-Horne-Zeilinger (GHZ) equality and its variants, because violation of the proposed equality might be tested in experiments with only two microsystems in a maximally entangled Bell state |Ψ⁻⟩, while a test of the GHZ equality requires at least three quantum systems in a special state |Ψ_GHZ⟩. The obtained inequality differs from Bell's, Wigner's, and Leggett-Garg inequalities, because it deals with spin s = 1/2 projections onto only two nonparallel directions at two different moments of time, while a test of the Bell and Wigner inequalities requires at least three nonparallel directions, and a test of the Leggett-Garg inequalities requires at least three distinct moments of time. Hence, the proposed inequality seems to open an additional experimental possibility to avoid the "contextuality loophole." Violation of the proposed equality and inequality is illustrated with the behavior of a pair of anticorrelated spins in an external magnetic field and also with the oscillations of flavor-entangled pairs of neutral pseudoscalar mesons.
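As a generic numeric illustration (not the equality or inequality proposed in this paper), the related Wigner inequality can be checked for the singlet state, for which quantum mechanics gives the joint probability P(a+, b+) = (1/2)sin²(θ/2) for spin projections onto directions separated by angle θ:

```python
# Generic check of the Wigner inequality for the singlet (Bell) state;
# not the authors' proposed relations. Local realism implies
#   P(a+, b+) <= P(a+, c+) + P(c+, b+)
# which quantum mechanics violates for suitable coplanar directions.
from math import sin, radians

def p_joint(theta_deg):
    """Quantum joint probability of ++ outcomes at relative angle theta."""
    return 0.5 * sin(radians(theta_deg) / 2) ** 2

lhs = p_joint(120)               # a and b separated by 120 degrees
rhs = p_joint(60) + p_joint(60)  # c midway between a and b
print(lhs, rhs, lhs <= rhs)      # 0.375 > 0.25 -> inequality violated
```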
Hypothesis-driven physical examination curriculum.
Allen, Sharon; Olson, Andrew; Menk, Jeremiah; Nixon, James
2017-12-01
Medical students traditionally learn physical examination skills as a rote list of manoeuvres. Alternatives like hypothesis-driven physical examination (HDPE) may promote students' understanding of the contribution of physical examination to diagnostic reasoning. We sought to determine whether first-year medical students can effectively learn to perform a physical examination using an HDPE approach, and then tailor the examination to specific clinical scenarios. First-year medical students at the University of Minnesota were taught both traditional and HDPE approaches during a required 17-week clinical skills course in their first semester. The end-of-course evaluation assessed HDPE skills: students were assigned one of two cardiopulmonary cases. Each case included two diagnostic hypotheses. During an interaction with a standardised patient, students were asked to select physical examination manoeuvres in order to make a final diagnosis. Items were weighted and selection order was recorded. First-year students with minimal pathophysiology training performed well. All students selected the correct diagnosis. Importantly, students varied the order when selecting examination manoeuvres depending on the diagnoses under consideration, demonstrating early clinical decision-making skills. An early introduction to HDPE may reinforce physical examination skills for hypothesis generation and testing, and can foster early clinical decision-making skills. This has important implications for further research in physical examination instruction. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Trait approach motivation moderates the aftereffects of self-control
Crowell, Adrienne; Kelley, Nicholas J.; Schmeichel, Brandon J.
2014-01-01
Numerous experiments have found that exercising self-control reduces success on subsequent, seemingly unrelated self-control tasks. Such evidence lends support to a strength model that posits a limited and depletable resource underlying all manner of self-control. Recent theory and evidence suggest that exercising self-control may also increase approach-motivated impulse strength. The two studies reported here tested two implications of this increased approach motivation hypothesis. First, aftereffects of self-control should be evident even in responses that require little or no self-control. Second, participants higher in trait approach motivation should be particularly susceptible to such aftereffects. In support, exercising self-control led to increased optimism (Study 1) and broadened attention (Study 2), but only among individuals higher in trait approach motivation. These findings suggest that approach motivation is an important key to understanding the aftereffects of exercising self-control. PMID:25324814
Hozo, Iztok; Schell, Michael J; Djulbegovic, Benjamin
2008-07-01
The absolute truth in research is unobtainable, as no evidence or research hypothesis is ever 100% conclusive. Therefore, all data and inferences can in principle be considered "inconclusive." Scientific inference and decision-making need to take into account errors, which are unavoidable in the research enterprise. The errors can occur at the level of conclusions, which aim to discern the truthfulness of a research hypothesis from the accuracy of the research evidence, and at the level of decisions, the goal of which is to enable optimal decision-making under present and specific circumstances. To optimize the chance of both correct conclusions and correct decisions, a synthesis of all major statistical approaches to clinical research is needed. The integration of these approaches (frequentist, Bayesian, and decision-analytic) can be accomplished through formal risk:benefit (R:B) analysis. This chapter illustrates the rational choice of a research hypothesis using R:B analysis based on the decision-theoretic expected utility framework and the concept of "acceptable regret" to calculate the threshold probability of the "truth" above which the benefit of accepting a research hypothesis outweighs its risks.
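The expected-utility algebra behind such a threshold probability can be sketched in its simplest form. This is a much-reduced illustration of the R:B framework described above, not the chapter's method, and the payoff values are hypothetical.

```python
# Simplest threshold-probability calculation: accepting a true hypothesis
# yields benefit B, accepting a false one incurs risk (regret) R.
# Acceptance is favoured when p*B > (1 - p)*R, i.e. when p > R / (R + B).
# The values of R and B below are hypothetical.
def threshold_probability(risk: float, benefit: float) -> float:
    """Probability of 'truth' above which acceptance beats rejection."""
    return risk / (risk + benefit)

# Example: if wrongly accepting costs 4x what correctly accepting gains,
# the hypothesis should be at least 80% likely to be true before acceptance.
print(threshold_probability(risk=4.0, benefit=1.0))  # 0.8
```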
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
The Relation between Parental Values and Parenting Behavior: A Test of the Kohn Hypothesis.
ERIC Educational Resources Information Center
Luster, Tom; And Others
1989-01-01
Used data on 65 mother-infant dyads to test Kohn's hypothesis concerning the relation between values and parenting behavior. Findings support Kohn's hypothesis that parents who value self-direction would emphasize supportive function of parenting and parents who value conformity would emphasize their obligations to impose restraints. (Author/NB)
Cognitive Biases in the Interpretation of Autonomic Arousal: A Test of the Construal Bias Hypothesis
ERIC Educational Resources Information Center
Ciani, Keith D.; Easter, Matthew A.; Summers, Jessica J.; Posada, Maria L.
2009-01-01
According to Bandura's construal bias hypothesis, derived from social cognitive theory, persons with the same heightened state of autonomic arousal may experience either pleasant or deleterious emotions depending on the strength of perceived self-efficacy. The current study tested this hypothesis by proposing that college students' preexisting…
Overgaard, Morten; Lindeløv, Jonas; Svejstrup, Stinna; Døssing, Marianne; Hvid, Tanja; Kauffmann, Oliver; Mouridsen, Kim
2013-01-01
This paper reports an experiment intended to test a particular hypothesis derived from blindsight research, which we name the “source misidentification hypothesis.” According to this hypothesis, a subject may be correct about a stimulus without being correct about how she had access to this knowledge (whether the stimulus was visual, auditory, or something else). We test this hypothesis in healthy subjects, asking them to report whether a masked stimulus was presented auditorily or visually, what the stimulus was, and how clearly they experienced the stimulus using the Perceptual Awareness Scale (PAS). We suggest that knowledge about perceptual modality may be a necessary precondition in order to issue correct reports of which stimulus was presented. Furthermore, we find that PAS ratings correlate with correctness, and that subjects are at chance level when reporting no conscious experience of the stimulus. To demonstrate that particular levels of reporting accuracy are obtained, we employ a statistical strategy, which operationally tests the hypothesis of non-equality, such that the usual rejection of the null-hypothesis admits the conclusion of equivalence. PMID:23508677
Evolution of Motor Control: From Reflexes and Motor Programs to the Equilibrium-Point Hypothesis.
Latash, Mark L
2008-01-01
This brief review analyzes the evolution of motor control theories along two lines that emphasize active (motor programs) and reactive (reflexes) features of voluntary movements. It suggests that the only contemporary hypothesis that integrates both approaches in a fruitful way is the equilibrium-point (EP) hypothesis. Physical, physiological, and behavioral foundations of the EP-hypothesis are considered, as well as relations between the EP-hypothesis and recent developments of the notion of motor synergies. The paper ends with a brief review of the criticisms of the EP-hypothesis and the challenges that the hypothesis faces at this time.
Koblin, Beryl; Hirshfield, Sabina; Chiasson, Mary Ann; Wilton, Leo; Usher, DaShawn; Nandi, Vijay; Hoover, Donald R; Frye, Victoria
2017-12-19
HIV testing is a critical component of HIV prevention and care. Interventions to increase HIV testing rates among young black men who have sex with men (MSM) and black transgender women (transwomen) are needed. Personalized recommendations for an individual's optimal HIV testing approach may increase testing. This randomized trial tests the hypothesis that a personalized recommendation of an optimal HIV testing approach will increase HIV testing more than standard HIV testing information. A randomized trial among 236 young black men and transwomen who have sex with men or transwomen is being conducted. Participants complete a computerized baseline assessment and are randomized to electronically receive a personalized HIV testing recommendation or standard HIV testing information. Follow-up surveys are conducted online at 3 and 6 months after baseline. The All About Me randomized trial was launched in June 2016. Enrollment is completed and 3-month retention is 92.4% (218/236) and has exceeded study target goals. The All About Me intervention is an innovative approach to increase HIV testing by providing a personalized recommendation of a person's optimal HIV testing approach. If successful, optimizing this intervention for mobile devices will widen access to large numbers of individuals. ClinicalTrials.gov NCT02834572; https://clinicaltrials.gov/ct2/show/NCT02834572 (Archived by WebCite at http://www.webcitation.org/6vLJWOS1B). ©Beryl Koblin, Sabina Hirshfield, Mary Ann Chiasson, Leo Wilton, DaShawn Usher, Vijay Nandi, Donald R Hoover, Victoria Frye. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 19.12.2017.
ERIC Educational Resources Information Center
SAW, J.G.
This paper deals with some tests of hypothesis frequently encountered in the analysis of multivariate data. The type of hypothesis considered is that which the statistician can answer in the negative or affirmative. The Doolittle method makes it possible to evaluate the determinant of a matrix of high order, to solve a matrix equation, or to…
Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria
2016-09-23
The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In the hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. The hypothesis-driven analysis showed low scores for the Whole grains, Total vegetables, Total fruit and Whole fruits components, while in the data-driven analysis fruits and whole grains were not present in any pattern. High intakes of sodium, fats and sugars were observed in the hypothesis-driven analysis, with low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components; in agreement, the data-driven approach showed intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing overall dietary habits, which will be important in order to drive public health programs and improve their efficiency in monitoring and evaluating the dietary patterns of populations.
NASA Technical Reports Server (NTRS)
Eagleson, P. S.
1985-01-01
Research activities conducted from February 1, 1985 to July 31, 1985 and preliminary conclusions regarding research objectives are summarized. The objective is to determine the feasibility of using LANDSAT data to estimate effective hydraulic properties of soils. The general approach is to apply the climatic-climax hypothesis (Eagleson, 1982) to natural water-limited vegetation systems using canopy cover estimated from LANDSAT data. Natural water-limited systems typically consist of inhomogeneous vegetation canopies interspersed with bare soils. The ground resolution associated with one pixel from LANDSAT MSS (or TM) data is generally greater than the scale of the plant canopy or canopy clusters. Thus a method for resolving percent canopy cover at a subpixel level must be established before the Eagleson hypothesis can be tested. Two formulations are proposed which extend existing methods of analyzing mixed pixels to naturally vegetated landscapes. The first method involves use of the normalized vegetation index. The second approach is a physical model based on radiative transfer principles. Both methods are to be analyzed for their feasibility on selected sites.
Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?
Tressoldi, Patrizio E.
2012-01-01
The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses related to four different controversial phenomena (subliminal semantic priming, incubation effect for problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or refute the reality of a phenomenon with a small ES. PMID:22783215
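As a concrete illustration of the power calculations at issue, the following sketch computes the power of a two-sided two-sample t-test from Cohen's d via the noncentral t distribution. This is a generic textbook computation, not the meta-analytic procedure of the paper, and the effect size and sample size used in the example are hypothetical:

```python
import numpy as np
from scipy import stats

def two_sample_t_power(d, n_per_group, alpha=0.05):
    """Power of a two-sided two-sample t-test for effect size d (Cohen's d)."""
    df = 2 * n_per_group - 2
    ncp = d * np.sqrt(n_per_group / 2.0)        # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)     # two-sided critical value
    # power = P(|T'| > t_crit) where T' is noncentral t with parameter ncp
    return stats.nct.sf(t_crit, df, ncp) + stats.nct.cdf(-t_crit, df, ncp)

# A "small" effect (d = 0.2) with 30 subjects per group is badly underpowered:
print(round(two_sample_t_power(0.2, 30), 3))    # roughly 0.12
```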
Boland, Elaine M.; Stange, Jonathan P.; Labelle, Denise R.; Shapero, Benjamin G.; Weiss, Rachel B.; Abramson, Lyn Y.; Alloy, Lauren B.
2015-01-01
The Behavioral Approach System (BAS)/Reward Hypersensitivity Theory and the Social Zeitgeber Theory are two biopsychosocial theories of bipolar spectrum disorders (BSD) that may work together to explain affective dysregulation. The present study examined whether BAS sensitivity is associated with affective symptoms via a) increased social rhythm disruption in response to BAS-relevant life events, or b) greater exposure to BAS events leading to social rhythm disruption and subsequent symptoms. Results indicated that high BAS individuals were more likely to experience social rhythm disruption following BAS-relevant events. Social rhythm disruption mediated the association between BAS-relevant events and symptoms (hypothesis a). High BAS individuals experienced significantly more BAS-relevant events, which predicted greater social rhythm disruption, which predicted greater levels of affective symptoms (hypothesis b). Individuals at risk for BSD may be sensitive to BAS-relevant stimuli, experience more BAS-relevant events, and experience affective dysregulation due to the interplay of the BAS and circadian rhythms. PMID:27429864
Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert
2016-01-01
The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
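A rough sketch of the two-step screen-then-test idea follows. This is not the authors' implementation: it substitutes a plain Bonferroni correction over the screened set for COMBI's dedicated threshold correction (which also accounts for the fact that testing follows screening on the same data), and the genotype encoding and parameters are assumptions for illustration:

```python
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.svm import LinearSVC

def screen_then_test(X, y, k=100, alpha=0.05):
    """Two-step sketch: X is (individuals x SNPs) with 0/1/2 genotypes,
    y a binary phenotype, k the number of SNPs kept after screening."""
    svm = LinearSVC(C=1.0, max_iter=10000).fit(X, y)
    scores = np.abs(svm.coef_).ravel()
    candidates = np.argsort(scores)[-k:]          # step 1: SVM screening
    pvals = {}
    for j in candidates:                          # step 2: per-SNP test
        table = np.zeros((3, 2))
        for g in (0, 1, 2):
            for c in (0, 1):
                table[g, c] = np.sum((X[:, j] == g) & (y == c))
        table = table[table.sum(axis=1) > 0]      # drop unobserved genotypes
        if table.shape[0] < 2:
            continue
        _, p, _, _ = chi2_contingency(table)
        pvals[j] = p
    threshold = alpha / k                         # naive Bonferroni over k
    return {j: p for j, p in pvals.items() if p < threshold}
```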
Royston, Patrick; Parmar, Mahesh K B
2014-08-07
Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
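For intuition only, here is a minimal sketch of a joint test that combines a logrank chi-square with a Grambsch-Therneau chi-square, assuming the two statistics are approximately independent so that their sum is chi-square with summed degrees of freedom. The paper derives the exact construction, so treat this as the general shape of the computation rather than its method; the input statistics are hypothetical:

```python
from scipy.stats import chi2

def joint_chi2(chi2_logrank, chi2_gt, df_gt=1):
    """Combine a logrank statistic (1 df) with a proportional-hazards
    test statistic into a single joint test of the composite null.

    Sketch under an approximate-independence assumption; consult the
    paper for the exact joint distribution and power calculations.
    """
    stat = chi2_logrank + chi2_gt
    return stat, chi2.sf(stat, 1 + df_gt)

stat, p = joint_chi2(3.1, 4.2)   # hypothetical statistics from a trial
print(stat, p)
```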
Estimating False Discovery Proportion Under Arbitrary Covariance Dependence
Fan, Jianqing; Han, Xu; Gu, Weijie
2012-01-01
Multiple hypothesis testing is a fundamental problem in high dimensional inference, with wide applications in many scientific fields. In genome-wide association studies, tens of thousands of tests are performed simultaneously to find if any SNPs are associated with some traits, and those tests are correlated. When test statistics are correlated, false discovery control becomes very challenging under arbitrary dependence. In the current paper, we propose a novel method based on principal factor approximation, which successfully subtracts the common dependence and significantly weakens the correlation structure, to deal with an arbitrary dependence structure. We derive an approximate expression for false discovery proportion (FDP) in large scale multiple testing when a common threshold is used and provide a consistent estimate of realized FDP. This result has important applications in controlling FDR and FDP. Our estimate of realized FDP compares favorably with Efron's (2007) approach, as demonstrated in the simulated examples. Our approach is further illustrated by some real data applications. We also propose a dependence-adjusted procedure, which is more powerful than the fixed threshold procedure. PMID:24729644
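For orientation, the naive realized-FDP estimate at a common p-value threshold looks as follows; the paper's contribution is precisely to replace this independence-based numerator with a principal-factor-adjusted one, so this sketch shows only the baseline being improved upon:

```python
import numpy as np

def estimated_fdp(pvals, t, m0=None):
    """Naive realized-FDP estimate at a common threshold t.

    Under (near-)independence, expected false discoveries are m0 * t
    among the R(t) rejections. m0 defaults to the total number of
    tests, a conservative choice treating all hypotheses as null.
    """
    pvals = np.asarray(pvals)
    m0 = len(pvals) if m0 is None else m0
    R = max(int(np.sum(pvals <= t)), 1)   # number of rejections at t
    return m0 * t / R
```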
Young, Anna M.; Cordier, Breanne; Mundry, Roger; Wright, Timothy F.
2014-01-01
In many social species, group members share acoustically similar calls. Functional hypotheses have been proposed for call sharing, but previous studies have been limited by an inability to distinguish among these hypotheses. We examined the function of vocal sharing in female budgerigars with a two-part experimental design that allowed us to distinguish between two functional hypotheses. The social association hypothesis proposes that shared calls help animals mediate affiliative and aggressive interactions, while the password hypothesis proposes that shared calls allow animals to distinguish group identity and exclude nonmembers. We also tested the labeling hypothesis, a mechanistic explanation which proposes that shared calls are used to address specific individuals within the sender-receiver relationship. We tested the social association hypothesis by creating four-member flocks of unfamiliar female budgerigars (Melopsittacus undulatus) and then monitoring the birds' calls, social behaviors, and stress levels via fecal glucocorticoid metabolites. We tested the password hypothesis by moving immigrants into established social groups. To test the labeling hypothesis, we conducted additional recording sessions in which individuals were paired with different group members. The social association hypothesis was supported by the development of multiple shared call types in each cage and a correlation between the number of shared call types and the number of aggressive interactions between pairs of birds. We also found support for calls serving as a labeling mechanism using discriminant function analysis with a permutation procedure. Our results did not support the password hypothesis, as there was no difference in stress or directed behaviors between immigrant and control birds. PMID:24860236
Ji, Hong; Petro, Nathan M; Chen, Badong; Yuan, Zejian; Wang, Jianji; Zheng, Nanning; Keil, Andreas
2018-02-06
Over the past decade, the simultaneous recording of electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI) data has garnered growing interest because it may provide an avenue towards combining the strengths of both imaging modalities. Given their pronounced differences in temporal and spatial statistics, the combination of EEG and fMRI data is, however, methodologically challenging. Here, we propose a novel screening approach that relies on a Cross Multivariate Correlation Coefficient (xMCC) framework. This approach accomplishes three tasks: (1) it provides a measure for testing multivariate correlation and multivariate uncorrelation of the two modalities; (2) it provides a criterion for the selection of EEG features; (3) it performs a screening of relevant EEG information by grouping the EEG channels into clusters to improve efficiency and to reduce computational load when searching for the best predictors of the BOLD signal. The present report applies this approach to a data set with concurrent recordings of steady-state visual evoked potentials (ssVEPs) and fMRI, recorded while observers viewed phase-reversing Gabor patches. We test the hypothesis that fluctuations in visuo-cortical mass potentials systematically covary with BOLD fluctuations not only in visual cortical, but also in anterior temporal and prefrontal areas. Results supported the hypothesis and showed that the xMCC-based analysis provides straightforward identification of neurophysiologically plausible brain regions with EEG-fMRI covariance. Furthermore, xMCC converged with other extant methods for EEG-fMRI analysis. © 2018 The Authors Journal of Neuroscience Research Published by Wiley Periodicals, Inc.
Rietschel, Marcella; Mattheisen, Manuel; Breuer, René; Schulze, Thomas G.; Nöthen, Markus M.; Levinson, Douglas; Shi, Jianxin; Gejman, Pablo V.; Cichon, Sven; Ophoff, Roel A.
2012-01-01
Recent studies suggest that variation in complex disorders (e.g., schizophrenia) is explained by a large number of genetic variants with small effect size (odds ratio ~1.05-1.1). The statistical power to detect these genetic variants in Genome Wide Association (GWA) studies with large numbers of cases and controls (~15,000) is still low. As it will be difficult to further increase sample size, we decided to explore an alternative method for analyzing GWA data in a study of schizophrenia, dramatically reducing the number of statistical tests. The underlying hypothesis was that at least some of the genetic variants related to a common outcome are collocated in segments of chromosomes at a wider scale than single genes. Our approach was therefore to study the association between relatively large segments of DNA and disease status. An association test was performed for each SNP and the number of nominally significant tests in a segment was counted. We then performed a permutation-based binomial test to determine whether this region contained significantly more nominally significant SNPs than expected under the null hypothesis of no association, taking linkage into account. Genome Wide Association data from three independent schizophrenia case/control cohorts of European ancestry (Dutch, German, and US) were analyzed using segments of DNA of variable length (2 to 32 Mbp). Using this approach we identified a region at chromosome 5q23.3-q31.3 (128-160 Mbp) that was significantly enriched with nominally associated SNPs in three independent case-control samples. We conclude that considering relatively wide segments of chromosomes may reveal reliable relationships between the genome and schizophrenia, suggesting novel methodological possibilities as well as raising theoretical questions. PMID:22723893
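The core enrichment computation can be sketched with a plain binomial test. Note that this simplification ignores the permutation step the authors use to account for linkage between SNPs, and the counts below are hypothetical:

```python
from scipy.stats import binomtest

def segment_enrichment_p(k, m, a=0.05):
    """One-sided binomial test: with m SNPs in a segment and k of them
    nominally significant at level a, is k more than the expected m * a?
    (Assumes independent SNPs; the paper corrects for linkage by permutation.)
    """
    return binomtest(k, m, a, alternative="greater").pvalue

print(segment_enrichment_p(k=90, m=1000, a=0.05))   # enrichment p-value
```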
Using the First-Year English Class to Develop Scientific Thinking Skills
NASA Astrophysics Data System (ADS)
McNamara, B. J.; Burnham, C.; Green, S.; Ball, E.; Schryer, A.
2002-12-01
This poster presents the preliminary results from an experimental approach to teaching first-year writing using the scientific method as an organizing theme. The approach presumes a close connection between the classical scientific method: observing, hypothesis forming, hypothesis testing, and generalizing from the results of the testing, and the writing process: inventing and prewriting, drafting, and revising. The project has four goals: 1. To introduce students to the relations between scientific method, academic inquiry, and the writing process; 2. To help students see that academic inquiry, the work of generating, testing, and validating knowledge and then applying that knowledge in real contexts, is actually a hybrid form of the scientific method; 3. To encourage students to connect the work they are doing in the writing classroom with the work they are doing in other classes so they can transfer the skills learned in one context to the other; and 4. To cause students who have previously been alienated by science and science teaching to reconsider their attitudes, and to see the powerful influence of science and scientific thinking in our world. In short, we are teaching science literacy in a humanities classroom. The materials we use include science-based reading and the kinds of writing typically required in science classes. The poster presents the basic premises of the project, samples of class materials, and preliminary results of a controlled pre- and post-test of student attitudes toward science and writing, analyzed especially according to gender and minority status. We also present insights by participating instructors including a female graduate teaching assistant who had been trained as a scientist and a male who had not.
Bowling, David R.; Schulze, Emily S.; Hall, Steven J.
2016-10-14
We revisit a classic ecohydrological study that showed streamside riparian trees in a semiarid mountain catchment did not use perennial stream water. The original study suggested that mature individuals of Acer negundo, Acer grandidentatum, and other species were dependent on water from “deeper strata,” possibly groundwater. We used a dual stable isotope approach (δ18O and δ2H) to further examine the water sources of these trees. We tested the hypothesis that groundwater was the main tree water source, but found that neither groundwater nor stream water matched the isotope composition of xylem water during two growing seasons. Soil water (0–1 m depth) was closest to and periodically overlapped with xylem water isotope composition, but overall, xylem water was isotopically enriched compared to all measured water sources. The “two water worlds” hypothesis postulates that soil water comprises isotopically distinct mobile and less mobile pools that do not mix, potentially explaining this disparity. We further hypothesized that isotopic effects during snowpack metamorphosis impart a distinct isotope signature to the less mobile soil water that supplies summer transpiration. Depth trends in water isotopes following snowmelt were consistent with the two water worlds hypothesis, but snow metamorphic isotope effects could not explain the highly enriched xylem water. Thus, the dual isotope approach did not unambiguously determine the water source(s) of these riparian trees. Further exploration of physical, geochemical, and biological mechanisms of water isotope fractionation and partitioning is necessary to resolve these data, highlighting critical challenges in the isotopic determination of plant water sources.
Using a personal digital assistant to document clinical pharmacy services in an intensive care unit.
Lau, A; Balen, R M; Lam, R; Malyuk, D L
2001-07-01
Management Case Studies describe approaches to real-life management problems in health systems. Each installment is a brief description of a problem and how it was dealt with. The cases are intended to help readers deal with similar experiences in their own work sites. Problem solving, not hypothesis testing, is emphasized. Successful resolution of the management issue is not a criterion for publication--important lessons can be learned from failures, too.
Belukha whale (Delphinapterus leucas) responses to industrial noise in Nushagak Bay, Alaska: 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, B.S.; Awbrey, F.T.; Evans, W.E.
1983-01-01
Between 15 June and 14 July 1983 the authors conducted playback experiments with belukha whales in the Snake River, Alaska, using sounds recorded near an operating oil-drilling rig. The objectives of these experiments were to quantify behavioral responses of belukha whales to oil-drilling noise in an area where foreign acoustic stimuli were absent, and to test the hypothesis that belukha whales would not approach a source of loud sound.
Strand-seq: a unifying tool for studies of chromosome segregation.
Falconer, Ester; Lansdorp, Peter M
2013-01-01
Non-random segregation of sister chromatids has been implicated in helping to specify daughter cell fate (the Silent Sister Hypothesis [1]) or to protect the genome of long-lived stem cells (the Immortal Strand Hypothesis [2]). The idea that sister chromatids are non-randomly segregated into specific daughter cells is only marginally supported by data from sporadic and often contradictory studies. As a result, the field has moved forward rather slowly. The advent of being able to directly label and differentiate sister chromatids in vivo using fluorescence in situ hybridization [3] was a significant advance for such studies. However, this approach is limited by the need for large tracts of unidirectional repeats on chromosomes and the reliance on quantitative imaging of fluorescent probes and rigorous statistical analysis to discern between the two competing hypotheses. A novel method called Strand-seq, which uses next-generation sequencing to assay sister chromatid inheritance patterns independently for each chromosome [4], offers a comprehensive approach to test for non-random segregation. In addition, Strand-seq enables studies on the deposition of chromatin marks in relation to DNA replication. This method is expected to help unify the field by testing previous claims of non-random segregation in an unbiased way in many model systems in vitro and in vivo. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Phase II design with sequential testing of hypotheses within each stage.
Poulopoulou, Stavroula; Karlis, Dimitris; Yiannoutsos, Constantin T; Dafni, Urania
2014-01-01
The main goal of a Phase II clinical trial is to decide whether a particular therapeutic regimen is effective enough to warrant further study. The hypothesis tested by Fleming's Phase II design (Fleming, 1982) is H0: p ≤ p0 versus H1: p ≥ p1, with level α and with power 1 − β at p = p1, where p0 is chosen to represent the response probability achievable with standard treatment and p1 is chosen such that the difference p1 − p0 represents a targeted improvement with the new treatment. This hypothesis creates a misinterpretation, mainly among clinicians, that rejection of the null hypothesis is tantamount to accepting the alternative, and vice versa. As mentioned by Storer (1992), this introduces ambiguity in the evaluation of type I and II errors and the choice of the appropriate decision at the end of the study. Instead of testing this hypothesis, an alternative class of designs is proposed in which two hypotheses are tested sequentially. The hypothesis H0: p ≤ p0 versus H1: p > p0 is tested first. If this null hypothesis is rejected, the hypothesis H0: p ≤ p1 versus H1: p > p1 is tested next, in order to examine whether the therapy is effective enough to consider further testing in a Phase III study. For the derivation of the proposed design, the exact binomial distribution is used to calculate the decision cut-points. The optimal design parameters are chosen so as to minimize the average sample number (ASN) under specific upper bounds for error levels.
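The exact-binomial cut-point computation for the first of these hypotheses can be sketched as follows. This is a single-stage version for one fixed n; the proposed design optimizes such cut-points across stages to minimize the ASN, and the example numbers are hypothetical:

```python
from scipy.stats import binom

def rejection_cutpoint(n, p0, alpha=0.05):
    """Smallest number of responses r with P(X >= r | p0) <= alpha,
    where X ~ Binomial(n, p0): the exact-binomial cut-point for
    rejecting H0: p <= p0 at level alpha.
    """
    for r in range(n + 1):
        if binom.sf(r - 1, n, p0) <= alpha:   # sf(r-1) = P(X >= r)
            return r
    return None

# e.g. responses required out of 40 patients when p0 = 0.2:
print(rejection_cutpoint(n=40, p0=0.2))
```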
An Extension of RSS-based Model Comparison Tests for Weighted Least Squares
2012-08-22
…use the model comparison test statistic to analyze the null hypothesis. Under the null hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS^H) = 10.3040 × 10^6. Under the alternative hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS) = 8.8394 × 10^6. Thus the model…
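In the RSS/WLS-based model comparison framework these numbers feed into, the test statistic is typically of the form u = n (J(H0) − J(H1)) / J(H1), asymptotically chi-square under the null. A sketch with the sample size n and constraint count r hypothetical, since the excerpt does not give them:

```python
from scipy.stats import chi2

def model_comparison_test(j_null, j_alt, n, r):
    """RSS/WLS model comparison test (sketch under standard asymptotics):
    u = n * (J(H0) - J(H1)) / J(H1) is approximately chi-square with r
    degrees of freedom, r being the number of constraints imposed by H0.
    """
    u = n * (j_null - j_alt) / j_alt
    return u, chi2.sf(u, r)

# Cost functional values from the excerpt; n and r are hypothetical.
u, p = model_comparison_test(10.3040e6, 8.8394e6, n=100, r=1)
print(u, p)
```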
Uno, Hajime; Tian, Lu; Claggett, Brian; Wei, L J
2015-12-10
With censored event time observations, the logrank test is the most popular tool for testing the equality of two underlying survival distributions. Although this test is asymptotically distribution free, it may not be powerful when the proportional hazards assumption is violated. Various other novel testing procedures have been proposed, which generally are derived by assuming a class of specific alternative hypotheses with respect to the hazard functions. The test considered by Pepe and Fleming (1989) is based on a linear combination of weighted differences of the two Kaplan-Meier curves over time and is a natural tool to assess the difference of two survival functions directly. In this article, we take a similar approach but choose weights that are proportional to the observed standardized difference of the estimated survival curves at each time point. The new proposal automatically makes weighting adjustments empirically. The new test statistic is aimed at a one-sided general alternative hypothesis and is distributed with a short right tail under the null hypothesis but with a heavy tail under the alternative. The results from extensive numerical studies demonstrate that the new procedure performs well under various general alternatives, with the caveat of a minor inflation of the type I error rate when the sample size or the number of observed events is small. The survival data from a recent cancer comparative study are used to illustrate the implementation of the procedure. Copyright © 2015 John Wiley & Sons, Ltd.
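A self-contained NumPy sketch of a Pepe-Fleming-type statistic: estimate both Kaplan-Meier curves on a common grid and integrate their difference. For brevity the weights here are constant, whereas the paper's proposal makes them proportional to the observed standardized difference at each time point; the toy data are hypothetical:

```python
import numpy as np

def km_on_grid(time, event, grid):
    """Kaplan-Meier estimate S(t), evaluated at each point of `grid`."""
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    uniq = np.unique(time[event == 1])
    s, surv_at = 1.0, {}
    for t in uniq:
        n_risk = np.sum(time >= t)               # subjects still at risk
        d = np.sum((time == t) & (event == 1))   # events at time t
        s *= 1.0 - d / n_risk
        surv_at[t] = s
    out = np.ones(len(grid))
    for i, g in enumerate(grid):
        past = uniq[uniq <= g]
        out[i] = surv_at[past[-1]] if len(past) else 1.0
    return out

def km_difference(t1, e1, t2, e2, grid):
    """Integrated (unweighted) difference between two KM curves."""
    s1, s2 = km_on_grid(t1, e1, grid), km_on_grid(t2, e2, grid)
    dt = np.diff(np.concatenate(([0.0], grid)))
    return np.sum((s1 - s2) * dt)

grid = np.linspace(0.0, 10.0, 101)
stat = km_difference([2, 4, 5, 7, 9], [1, 1, 0, 1, 1],
                     [1, 3, 3, 6, 8], [1, 1, 1, 0, 1], grid)
print(stat)
```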
Nasser, Helen M; Lafferty, Danielle S; Lesser, Ellen N; Bacharach, Sam Z; Calu, Donna J
2018-01-01
Previously established individual differences in appetitive approach and devaluation sensitivity observed in goal- and sign-trackers may be attributed to differences in the acquisition, modification, or use of associative information in basolateral amygdala (BLA) pathways. Here, we sought to determine the extent to which communication of associative information between BLA and anterior portions of insular cortex (IC) supports ongoing Pavlovian conditioned approach behaviors in sign- and goal-tracking rats, in the absence of manipulations to outcome value. We hypothesized that the BLA mediates goal-tracking, but not sign-tracking, approach through interactions with the IC, a brain region involved in supporting flexible behavior. We first trained rats in Pavlovian lever autoshaping to determine their sign- or goal-tracking tendency. During alternating test sessions, we gave unilateral intracranial injections of vehicle or a cocktail of gamma-aminobutyric acid (GABA) receptor agonists, baclofen and muscimol, unilaterally into the BLA and contralaterally or ipsilaterally into the IC prior to reinforced lever autoshaping sessions. Consistent with our hypothesis, we found that contralateral inactivation of BLA and IC increased the latency to approach the food cup and decreased the number of food cup contacts in goal-trackers. While contralateral inactivation of BLA and IC did not affect the total number of lever contacts in sign-trackers, this manipulation increased the latency to approach the lever. Ipsilateral inactivation of BLA and IC did not impact approach behaviors in Pavlovian lever autoshaping. These findings, contrary to our hypothesis, suggest that communication between BLA and IC maintains a representation of initially learned appetitive associations that commonly support the initiation of Pavlovian conditioned approach behavior regardless of whether it is directed at the cue or the location of reward delivery. Copyright © 2017 Elsevier Inc. All rights reserved.
Hypothesis testing of scientific Monte Carlo calculations.
Wallerberger, Markus; Gull, Emanuel
2017-11-01
The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
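A minimal instance of the recipe: split a stochastic computation into independent batches and test the batch means against a known exact answer. Here the "simulation" is a toy Monte Carlo estimate of pi; a real test suite would apply the same pattern to each observable with an analytic reference, and the batch counts below are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 32 independent batches, each estimating pi by sampling the unit square.
batches = []
for _ in range(32):
    xy = rng.random((100_000, 2))
    batches.append(4.0 * np.mean((xy ** 2).sum(axis=1) <= 1.0))

# One-sample t-test of the batch means against the exact value.
t, p = stats.ttest_1samp(batches, popmean=np.pi)

# By construction this check falsely fails with probability equal to the
# chosen significance level, so pick it to suit the test suite.
assert p > 1e-3, "Monte Carlo estimator statistically inconsistent with pi"
```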
Barker, F Keith; Barrowclough, George F; Groth, Jeff G
2002-01-01
Passerine birds comprise over half of avian diversity, but have proved difficult to classify. Despite a long history of work on this group, no comprehensive hypothesis of passerine family-level relationships was available until recent analyses of DNA-DNA hybridization data. Unfortunately, given the value of such a hypothesis in comparative studies of passerine ecology and behaviour, the DNA-hybridization results have not been well tested using independent data and analytical approaches. Therefore, we analysed nucleotide sequence variation at the nuclear RAG-1 and c-mos genes from 69 passerine taxa, including representatives of most currently recognized families. In contradiction to previous DNA-hybridization studies, our analyses suggest paraphyly of suboscine passerines because the suboscine New Zealand wren Acanthisitta was found to be sister to all other passerines. Additionally, we reconstructed the parvorder Corvida as a basal paraphyletic grade within the oscine passerines. Finally, we found strong evidence that several family-level taxa are misplaced in the hybridization results, including the Alaudidae, Irenidae, and Melanocharitidae. The hypothesis of relationships we present here suggests that the oscine passerines arose on the Australian continental plate while it was isolated by oceanic barriers and that a major northern radiation of oscines (i.e. the parvorder Passerida) originated subsequent to dispersal from the south. PMID:11839199
Environmental Kuznets Curve Hypothesis: A Perspective of Sustainable Development in Indonesia
NASA Astrophysics Data System (ADS)
Nuansa, Citrasmara Galuh; Widodo, Wahyu
2018-02-01
Sustainable development, with its three main pillars of environment, economy, and society, is a concept of national development aimed at achieving inclusive economic growth, good environmental quality, and improved public welfare. However, the dominance of economic factors causes various environmental problems. This phenomenon occurs in most developing countries, including Indonesia. The relationship between economic activity and environmental quality has been widely discussed and empirically tested by scholars. This descriptive study analysed the Environmental Kuznets Curve (EKC) hypothesis from the perspective of sustainable development in Indonesia. The EKC hypothesis describes the relationship between economic growth and environmental degradation as an inverted U-curve: at the beginning of development, environmental quality declines as economic growth increases, and beyond a certain point environmental quality gradually improves. This paper discusses how the relationship between environmental quality and economic growth in Indonesia has been investigated. The preliminary results show that most empirical studies use the conventional approach, in which CO2 emissions serve as the proxy for environmental degradation. The evidence for an inverted U-curve is also inconclusive. Therefore, further research on the relationship between economic growth and environmental quality in Indonesia using the EKC hypothesis is required.
Receiver operating characteristic analysis of age-related changes in lineup performance.
Humphries, Joyce E; Flowe, Heather D
2015-04-01
In the basic face memory literature, support has been found for the late maturation hypothesis, which holds that face recognition ability is not fully developed until at least adolescence. Support for the late maturation hypothesis in the criminal lineup identification literature, however, has been equivocal because of the analytic approach that has been used to examine age-related changes in identification performance. Recently, receiver operating characteristic (ROC) analysis was applied for the first time in the adult eyewitness memory literature to examine whether memory sensitivity differs across different types of lineup tests. ROC analysis allows for the separation of memory sensitivity from response bias in the analysis of recognition data. Here, we have made the first ROC-based comparison of adults' and children's (5- and 6-year-olds and 9- and 10-year-olds) memory performance on lineups by reanalyzing data from Humphries, Holliday, and Flowe (2012). In line with the late maturation hypothesis, memory sensitivity was significantly greater for adults compared with young children. Memory sensitivity for older children was similar to that for adults. The results indicate that the late maturation hypothesis can be generalized to account for age-related performance differences on an eyewitness memory task. The implications for developmental eyewitness memory research are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.
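A simplified sketch of the sensitivity comparison: build each group's empirical ROC curve from (false-alarm, hit) operating points across confidence criteria and compare the areas under them. Lineup ROCs are in practice partial curves analyzed over a restricted false-alarm range, and the operating points below are invented, so this only illustrates the shape of the analysis:

```python
import numpy as np

def auc_from_operating_points(far, hr):
    """Area under an empirical ROC built from (false-alarm, hit) pairs.

    Each pair comes from one confidence criterion; higher AUC means
    greater memory sensitivity, independent of response bias. The
    endpoints (0,0) and (1,1) are appended for a complete curve.
    """
    far = np.concatenate(([0.0], np.sort(far), [1.0]))
    hr = np.concatenate(([0.0], np.sort(hr), [1.0]))
    return np.trapz(hr, far)

adults = auc_from_operating_points(np.array([0.05, 0.12, 0.25]),
                                   np.array([0.40, 0.60, 0.75]))
children = auc_from_operating_points(np.array([0.10, 0.20, 0.35]),
                                     np.array([0.30, 0.45, 0.60]))
print(adults, children)   # sensitivity comparison on hypothetical data
```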
The Read-Across Hypothesis and Environmental Risk Assessment of Pharmaceuticals
2013-01-01
Pharmaceuticals in the environment have received increased attention over the past decade, as they are ubiquitous in rivers and waterways. Concentrations are in the sub-ng/L to low μg/L range, well below acute toxic levels, but there are uncertainties regarding the effects of chronic exposures and there is a need to prioritise which pharmaceuticals may be of concern. The read-across hypothesis stipulates that a drug will have an effect in non-target organisms only if the molecular targets such as receptors and enzymes have been conserved, resulting in a (specific) pharmacological effect only if plasma concentrations are similar to human therapeutic concentrations. If this holds true for different classes of pharmaceuticals, it should be possible to predict the potential environmental impact from information obtained during the drug development process. This paper critically reviews the evidence for read-across, and finds that few studies include plasma concentrations and mode-of-action based effects. Thus, despite a large number of apparently relevant papers and a general acceptance of the hypothesis, there is an absence of documented evidence. There is a need for large-scale studies to generate robust data for testing the read-across hypothesis and developing predictive models, the only feasible approach to protecting the environment. PMID:24006913
Sex ratios in the two Germanies: a test of the economic stress hypothesis.
Catalano, Ralph A
2003-09-01
Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.
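One simple way to formalize "lower in 1991 than expected from its own history and from West Germany" is an interrupted-time-series regression with a pulse indicator for 1991. The paper itself fits Box-Jenkins-style time-series models, which also handle autocorrelation; the data below are synthetic stand-ins for the yearly sex ratios, 1946 to 1999:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
years = np.arange(1946, 2000)
west = 0.515 + rng.normal(0, 0.001, len(years))   # synthetic West series
east = west + rng.normal(0, 0.001, len(years))    # synthetic East series
east[years == 1991] -= 0.004                      # simulated 1991 dip

df = pd.DataFrame({"year": years, "east": east, "west": west})
df["pulse_1991"] = (df["year"] == 1991).astype(int)

# East ratio modeled from the West ratio plus a one-year intervention term.
fit = smf.ols("east ~ west + pulse_1991", data=df).fit()
print(fit.params["pulse_1991"], fit.pvalues["pulse_1991"])
```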
Carryover negligibility and relevance in bioequivalence studies.
Ocaña, Jordi; Sanchez O, Maria P; Carrasco, Josep L
2015-01-01
The carryover effect is a recurring issue in the pharmaceutical field. It may strongly influence the final outcome of an average bioequivalence study. Testing a null hypothesis of zero carryover is useless: not rejecting it does not guarantee the non-existence of carryover, and rejecting it is not informative of the true degree of carryover and its influence on the validity of the final outcome of the bioequivalence study. We propose a more consistent approach: even if some carryover is present, is it enough to seriously distort the study conclusions or is it negligible? This is the central aim of this paper, which focuses on average bioequivalence studies based on 2 × 2 crossover designs and on the main problem associated with carryover: type I error inflation. We propose an equivalence testing approach to these questions and suggest reasonable negligibility or relevance limits for carryover. Finally, we illustrate this approach on some real datasets. Copyright © 2015 John Wiley & Sons, Ltd.
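The negligibility question maps naturally onto a two one-sided tests (TOST) equivalence procedure. Below is a sketch under textbook assumptions, with the carryover contrast estimated from between-sequence subject totals; the paper's exact procedure and its suggested negligibility limits differ in detail:

```python
import numpy as np
from scipy import stats

def carryover_tost(totals_seq1, totals_seq2, delta):
    """Equivalence (negligibility) test for carryover in a 2x2 crossover.

    Inputs are per-subject totals (period 1 + period 2) in each
    sequence; carryover is estimated by their between-sequence
    difference. TOST declares carryover negligible if it lies within
    (-delta, delta), i.e. if both one-sided tests reject.
    """
    x = np.asarray(totals_seq1, float)
    y = np.asarray(totals_seq2, float)
    n1, n2 = len(x), len(y)
    diff = x.mean() - y.mean()
    sp2 = ((n1 - 1) * x.var(ddof=1) + (n2 - 1) * y.var(ddof=1)) / (n1 + n2 - 2)
    se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
    df = n1 + n2 - 2
    p_lower = stats.t.sf((diff + delta) / se, df)   # H0: carryover <= -delta
    p_upper = stats.t.cdf((diff - delta) / se, df)  # H0: carryover >= +delta
    return max(p_lower, p_upper)   # < alpha => carryover negligible
```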
Deterministic versus evidence-based attitude towards clinical diagnosis.
Soltani, Akbar; Moayyeri, Alireza
2007-08-01
Generally, two basic classes have been proposed for the scientific explanation of events. Deductive reasoning emphasizes reaching conclusions about a hypothesis by verification of universal laws pertinent to that hypothesis, while inductive or probabilistic reasoning explains an event by calculating the probability that the event is related to a given hypothesis. Although both types of reasoning are used in clinical practice, evidence-based medicine stresses the advantages of the second approach for most instances of medical decision making. While 'probabilistic or evidence-based' reasoning seems to involve more mathematical formulas at first glance, this attitude is more dynamic and less imprisoned by the rigidity of mathematics compared with the 'deterministic or mathematical attitude'. In the field of medical diagnosis, appreciation of uncertainty in clinical encounters and use of the likelihood ratio as a measure of accuracy seem to be the most important characteristics of evidence-based doctors. Other characteristics include the use of series of tests to refine probability, changing diagnostic thresholds in view of external evidence and the nature of the disease, and attention to confidence intervals to estimate the uncertainty of research-derived parameters.
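The likelihood-ratio update alluded to here is just Bayes' theorem on the odds scale, and refining probability with a series of (conditionally independent) tests means multiplying their LRs; the numbers below are hypothetical:

```python
def post_test_probability(pretest_prob, *likelihood_ratios):
    """Evidence-based diagnostic update on the odds scale:
    post-test odds = pre-test odds x LR1 x LR2 x ...
    (assumes the tests are conditionally independent given disease status).
    """
    odds = pretest_prob / (1.0 - pretest_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# 20% pre-test probability, then a positive test with LR+ = 8:
print(round(post_test_probability(0.20, 8.0), 2))   # 0.67
```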
Rammsayer, Thomas; Ulrich, Rolf
2011-05-01
The distinct timing hypothesis suggests a sensory mechanism for the processing of durations in the range of milliseconds and a cognitively controlled mechanism for the processing of longer durations. To test this hypothesis, we employed a dual-task approach to investigate the effects of maintenance and elaborative rehearsal on temporal processing of brief and long durations. Unlike mere maintenance rehearsal, elaborative rehearsal as a secondary task involved transfer of information from working to long-term memory and elaboration of information to enhance storage in long-term memory. Duration discrimination of brief intervals was not affected by a secondary cognitive task that required either maintenance or elaborative rehearsal. Concurrent elaborative rehearsal, however, impaired discrimination of longer durations as compared to maintenance rehearsal and a control condition with no secondary task. These findings endorse the distinct timing hypothesis and are in line with the notion that executive functions, such as continuous memory updating and active transfer of information into long-term memory, interfere with temporal processing of durations in the seconds range, but not in the millisecond range. © 2011 Elsevier B.V. All rights reserved.
Understanding suicide terrorism: premature dismissal of the religious-belief hypothesis.
Liddle, James R; Machluf, Karin; Shackelford, Todd K
2010-07-06
We comment on work by Ginges, Hansen, and Norenzayan (2009), in which they compare two hypotheses for predicting individual support for suicide terrorism: the religious-belief hypothesis and the coalitional-commitment hypothesis. Although we appreciate the evidence provided in support of the coalitional-commitment hypothesis, we argue that their method of testing the religious-belief hypothesis is conceptually flawed, thus calling into question their conclusion that the religious-belief hypothesis has been disconfirmed. In addition to critiquing the methodology implemented by Ginges et al., we provide suggestions on how the religious-belief hypothesis may be properly tested. It is possible that the premature and unwarranted conclusions reached by Ginges et al. may deter researchers from examining the effect of specific religious beliefs on support for terrorism, and we hope that our comments can mitigate this possibility.
Physiopathological Hypothesis of Cellulite
de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro
2009-01-01
A series of questions is asked concerning this condition, regarding its name, the consensus about the histopathological findings, the physiological hypothesis, and the treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence, this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, the correct diagnosis of cellulite, and the technique employed are fundamental to success. PMID:19756187
Feldman, Anatol G; Latash, Mark L
2005-02-01
Criticisms of the equilibrium point (EP) hypothesis have recently appeared that are based on misunderstandings of some of its central notions. Starting from such interpretations of the hypothesis, incorrect predictions are made and tested. When the incorrect predictions prove false, the hypothesis is claimed to be falsified. In particular, the hypothesis has been rejected based on the wrong assumptions that it conflicts with empirically defined joint stiffness values or that it is incompatible with violations of equifinality under certain velocity-dependent perturbations. Typically, such attempts use notions describing the control of movements of artificial systems in place of physiologically relevant ones. While appreciating constructive criticisms of the EP hypothesis, we feel that incorrect interpretations have to be clarified by reiterating what the EP hypothesis does and does not predict. We conclude that the recent claims of falsifying the EP hypothesis and the calls for its replacement by EMG-force control hypothesis are unsubstantiated. The EP hypothesis goes far beyond the EMG-force control view. In particular, the former offers a resolution for the famous posture-movement paradox while the latter fails to resolve it.
Evolution of Motor Control: From Reflexes and Motor Programs to the Equilibrium-Point Hypothesis
Latash, Mark L.
2009-01-01
This brief review analyzes the evolution of motor control theories along two lines that emphasize active (motor programs) and reactive (reflexes) features of voluntary movements. It suggests that the only contemporary hypothesis that integrates both approaches in a fruitful way is the equilibrium-point (EP) hypothesis. Physical, physiological, and behavioral foundations of the EP-hypothesis are considered, as well as relations between the EP-hypothesis and recent developments of the notion of motor synergies. The paper ends with a brief review of the criticisms of the EP-hypothesis and the challenges that the hypothesis faces at this time. PMID:19823595
Perez, M F; Bonatelli, I A S; Moraes, E M; Carstens, B C
2016-01-01
Pilosocereus machrisii and P. aurisetus are cactus species within the P. aurisetus complex, a group of eight cacti that are restricted to rocky habitats within the Neotropical savannas of eastern South America. Previous studies have suggested that diversification within this complex was driven by distributional fragmentation, isolation leading to allopatric differentiation, and secondary contact among divergent lineages. These events have been associated with Quaternary climatic cycles, leading to the hypothesis that the xerophytic vegetation patches which presently harbor these populations operate as refugia during the current interglacial. However, owing to limitations of the standard phylogeographic approaches used in these studies, this hypothesis was not explicitly tested. Here we use Approximate Bayesian Computation to refine the previous inferences and test the role of different events in the diversification of two species within P. aurisetus group. We used molecular data from chloroplast DNA and simple sequence repeats loci of P. machrisii and P. aurisetus, the two species with broadest distribution in the complex, in order to test if the diversification in each species was driven mostly by vicariance or by long-dispersal events. We found that both species were affected primarily by vicariance, with a refuge model as the most likely scenario for P. aurisetus and a soft vicariance scenario most probable for P. machrisii. These results emphasize the importance of distributional fragmentation in these species, and add support to the hypothesis of long-term isolation in interglacial refugia previously proposed for the P. aurisetus species complex diversification. PMID:27071846
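To make the method concrete, here is a minimal rejection-ABC loop on a deliberately toy model (a normal mean with its sample mean as summary statistic). The actual study compares refugium, vicariance, and dispersal scenarios with coalescent simulators; everything below, including the observed summary and tolerance, is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

observed_summary = 0.8          # hypothetical summary statistic of the data

def simulate(theta):
    """Toy simulator: sample mean of 50 draws from N(theta, 1)."""
    return rng.normal(theta, 1.0, size=50).mean()

# Rejection ABC: draw parameters from the prior, keep those whose
# simulated summary lands within a tolerance of the observed one.
draws = rng.uniform(-5, 5, size=100_000)            # prior on theta
sims = np.array([simulate(t) for t in draws])
posterior = draws[np.abs(sims - observed_summary) < 0.05]
print(posterior.mean(), posterior.std())            # approximate posterior
```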
Caudate nucleus reactivity predicts perceptual learning rate for visual feature conjunctions.
Reavis, Eric A; Frank, Sebastian M; Tse, Peter U
2015-04-15
Useful information in the visual environment is often contained in specific conjunctions of visual features (e.g., color and shape). The ability to quickly and accurately process such conjunctions can be learned. However, the neural mechanisms responsible for such learning remain largely unknown. It has been suggested that some forms of visual learning might involve the dopaminergic neuromodulatory system (Roelfsema et al., 2010; Seitz and Watanabe, 2005), but this hypothesis has not yet been directly tested. Here we test the hypothesis that learning visual feature conjunctions involves the dopaminergic system, using functional neuroimaging, genetic assays, and behavioral testing techniques. We use a correlative approach to evaluate potential associations between individual differences in visual feature conjunction learning rate and individual differences in dopaminergic function as indexed by neuroimaging and genetic markers. We find a significant correlation between activity in the caudate nucleus (a component of the dopaminergic system connected to visual areas of the brain) and visual feature conjunction learning rate. Specifically, individuals who showed a larger difference in activity between positive and negative feedback on an unrelated cognitive task, indicative of a more reactive dopaminergic system, learned visual feature conjunctions more quickly than those who showed a smaller activity difference. This finding supports the hypothesis that the dopaminergic system is involved in visual learning, and suggests that visual feature conjunction learning could be closely related to associative learning. However, no significant, reliable correlations were found between feature conjunction learning and genotype or dopaminergic activity in any other regions of interest. Copyright © 2015 Elsevier Inc. All rights reserved.
A Cognitive Computing Approach for Classification of Complaints in the Insurance Industry
NASA Astrophysics Data System (ADS)
Forster, J.; Entrup, B.
2017-10-01
In this paper we present and evaluate a cognitive computing approach for the classification of dissatisfaction and of four specific complaint classes in correspondence documents between insurance clients and an insurance company. A cognitive computing approach combines classical natural language processing methods, machine learning algorithms, and the evaluation of hypotheses. The approach combines a MaxEnt machine learning algorithm with language modelling, tf-idf and sentiment analytics to create a multi-label text classification model. The model is trained and tested with a set of 2500 original insurance communication documents written in German, which have been manually annotated by the partnering insurance company. With an F1-score of 0.9, a reliable text classification component has been implemented and evaluated. A final outlook towards a cognitive computing insurance assistant is given at the end.
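The tf-idf plus MaxEnt core of such a pipeline can be sketched with scikit-learn, where logistic regression is the usual MaxEnt formulation for text classification. This toy version is single-label rather than multi-label, and the two German documents and labels are invented stand-ins for the annotated corpus:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training examples standing in for the annotated corpus.
docs = ["Ich bin sehr unzufrieden mit der Bearbeitung meines Schadens.",
        "Vielen Dank für die schnelle Rückmeldung zu meinem Vertrag."]
labels = ["complaint", "no_complaint"]

# tf-idf features feeding a MaxEnt (multinomial logistic) classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(docs, labels)
print(model.predict(["Die Regulierung dauert mir viel zu lange."]))
```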
Gray, J R
2001-09-01
Emotional states might selectively modulate components of cognitive control. To test this hypothesis, the author randomly assigned 152 undergraduates (equal numbers of men and women) to watch short videos intended to induce emotional states (approach, neutral, or withdrawal). Each video was followed by a computerized 2-back working memory task (spatial or verbal, equated for difficulty and appearance). Spatial 2-back performance was enhanced by a withdrawal state and impaired by an approach state; the opposite pattern held for verbal performance. The double dissociation held more strongly for participants who made more errors than average across conditions. The results suggest that approach-withdrawal states can have selective influences on components of cognitive control, possibly on a hemispheric basis. They support and extend several frameworks for conceptualizing emotion-cognition interactions.
Consolidation of visual associative long-term memory in the temporal cortex of primates.
Miyashita, Y; Kameyama, M; Hasegawa, I; Fukushima, T
1998-01-01
Neuropsychological theories have proposed a critical role for the interaction between the medial temporal lobe and the neocortex in the formation of long-term memory for facts and events, which has often been tested by learning of a series of paired words or figures in humans. We have examined neural mechanisms underlying the memory "consolidation" process by single-unit recording and molecular biological methods in an animal model of a visual pair-association task in monkeys. In our previous studies, we found that long-term associative representations of visual objects are acquired through learning in the neural network of the anterior inferior temporal (IT) cortex. In this article, we propose the hypothesis that limbic neurons undergo rapid modification of synaptic connectivity and provide backward signals that guide the reorganization of neocortical neural circuits. Two experiments tested this hypothesis: (1) we examined the role of the backward connections from the medial temporal lobe to the IT cortex by injecting ibotenic acid into the entorhinal and perirhinal cortices, which provided massive backward projections ipsilaterally to the IT cortex. We found that the limbic lesion disrupted the associative code of the IT neurons between the paired associates, without impairing the visual response to each stimulus. (2) We then tested the first half of this hypothesis by detecting the expression of immediate-early genes in the monkey temporal cortex. We found specific expression of zif268 during the learning of a new set of paired associates in the pair-association task, most intensively in area 36 of the perirhinal cortex. All these results with the visual pair-association task support our hypothesis and demonstrate that the consolidation process, which was first proposed on the basis of clinico-psychological evidence, can now be examined in primates using neurophysiological and molecular biological approaches. Copyright 1998 Academic Press.
On the Interpretation and Use of Mediation: Multiple Perspectives on Mediation Analysis
Agler, Robert; De Boeck, Paul
2017-01-01
Mediation analysis has become a very popular approach in psychology, and it is one that is associated with multiple perspectives that are often at odds, often implicitly. Explicitly discussing these perspectives and their motivations, advantages, and disadvantages can help to provide clarity to conversations and research regarding the use and refinement of mediation models. We discuss five such pairs of perspectives on mediation analysis, their associated advantages and disadvantages, and their implications: with vs. without a mediation hypothesis, specific effects vs. a global model, directness vs. indirectness of causation, effect size vs. null hypothesis testing, and hypothesized vs. alternative explanations. Discussion of the perspectives is facilitated by a small simulation study. Some philosophical and linguistic considerations are briefly discussed, as well as some other perspectives we do not develop here. PMID:29187828
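The flavor of such a simulation can be sketched in a few lines (the paper's actual design is not given here; the path coefficients and sample size below are arbitrary choices): data are generated with a true indirect path X -> M -> Y, and the indirect effect is estimated as the product of the two regression slopes.

```python
# A minimal mediation simulation: estimate a (X -> M), b (M -> Y | X),
# and the indirect effect a*b from ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # a-path
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # b-path and direct effect c'

def ols(predictors, outcome):
    X = np.column_stack([np.ones(len(outcome))] + list(predictors))
    return np.linalg.lstsq(X, outcome, rcond=None)[0]

a = ols([x], m)[1]               # slope of M on X
b, c_prime = ols([m, x], y)[1:]  # slopes of Y on M and on X
print(f"indirect effect a*b = {a*b:.3f}, direct effect c' = {c_prime:.3f}")
```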
Extended target recognition in cognitive radar networks.
Wei, Yimin; Meng, Huadong; Liu, Yimin; Wang, Xiqin
2010-01-01
We address the problem of adaptive waveform design for extended target recognition in cognitive radar networks. A closed-loop active target recognition radar system is extended to the case of a centralized cognitive radar network, in which a generalized likelihood ratio (GLR) based sequential hypothesis testing (SHT) framework is employed. Using Doppler velocities measured by multiple radars, the target aspect angle for each radar is calculated. The joint probability of each target hypothesis is then updated using observations from different radar line of sights (LOS). Based on these probabilities, a minimum correlation algorithm is proposed to adaptively design the transmit waveform for each radar in an amplitude fluctuation situation. Simulation results demonstrate performance improvements due to the cognitive radar network and adaptive waveform design. Our minimum correlation algorithm outperforms the eigen-waveform solution and other non-cognitive waveform design approaches.
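The sequential core of such a framework can be sketched independently of the radar specifics. The toy version below assumes a Gaussian measurement model and illustrative error thresholds (not the paper's GLR over target hypotheses): each new observation updates a cumulative log-likelihood ratio until a decision boundary is crossed.

```python
# Toy sequential hypothesis test between two target hypotheses.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu0, mu1, sigma = 0.0, 1.0, 1.0      # hypothesis-specific mean returns
A, B = np.log(99), np.log(1 / 99)    # thresholds for ~1% error rates

llr, decision = 0.0, None
while decision is None:
    z = rng.normal(mu1, sigma)       # new measurement (truth: hypothesis 1)
    llr += norm.logpdf(z, mu1, sigma) - norm.logpdf(z, mu0, sigma)
    if llr >= A:
        decision = "H1"
    elif llr <= B:
        decision = "H0"
print("decided", decision, "with cumulative log-LR", round(llr, 2))
```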
Action perception as hypothesis testing.
Donnarumma, Francesco; Costantini, Marcello; Ambrosini, Ettore; Friston, Karl; Pezzulo, Giovanni
2017-04-01
We present a novel computational model that describes action perception as an active inferential process that combines motor prediction (the reuse of our own motor system to predict perceived movements) and hypothesis testing (the use of eye movements to disambiguate amongst hypotheses). The system uses a generative model of how (arm and hand) actions are performed to generate hypothesis-specific visual predictions, and directs saccades to the most informative places of the visual scene to test these predictions - and underlying hypotheses. We test the model using eye movement data from a human action observation study. In both the human study and our model, saccades are proactive whenever context affords accurate action prediction; but uncertainty induces a more reactive gaze strategy, via tracking the observed movements. Our model offers a novel perspective on action observation that highlights its active nature based on prediction dynamics and hypothesis testing. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Confidence intervals for single-case effect size measures based on randomization test inversion.
Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick
2017-02-01
In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100 (1 - α) % two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
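The inversion logic is easy to demonstrate for the unstandardized mean difference in a completely randomized two-treatment design. The sketch below uses invented scores (the authors supply their own R code for the full method): it collects all shifted null values theta0 that a two-sided permutation test fails to reject at alpha = .05.

```python
# Randomization test inversion: the 95% CI is the set of theta0 values
# not rejected by a two-sided permutation test of H0: mean(A)-mean(B)=theta0.
import numpy as np
from itertools import combinations

a = np.array([7., 9., 6., 8., 9.])   # scores under treatment A (invented)
b = np.array([4., 5., 6., 3., 5.])   # scores under treatment B (invented)

def perm_pvalue(a, b, theta0):
    a0 = a - theta0                   # shift A so H0 becomes "no difference"
    pooled = np.concatenate([a0, b])
    obs = a0.mean() - b.mean()
    stats = []
    for idx in combinations(range(len(pooled)), len(a)):
        mask = np.zeros(len(pooled), bool)
        mask[list(idx)] = True
        stats.append(pooled[mask].mean() - pooled[~mask].mean())
    return np.mean(np.abs(stats) >= abs(obs) - 1e-12)

grid = np.arange(-1.0, 8.01, 0.05)
ci = [t for t in grid if perm_pvalue(a, b, t) > 0.05]
print(f"95% randomization CI: [{min(ci):.2f}, {max(ci):.2f}]")
```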
Abad-Grau, Mara M; Medina-Medina, Nuria; Montes-Soldado, Rosana; Matesanz, Fuencisla; Bafna, Vineet
2012-01-01
Multimarker Transmission/Disequilibrium Tests (TDTs) are association tests that are very robust to population admixture and structure and may be used to identify susceptibility loci in genome-wide association studies. Multimarker TDTs using several markers may increase power by capturing high-degree associations. However, there is also a risk of spurious associations and power reduction due to the increase in degrees of freedom. In this study we show that associations found by tests built on simple null hypotheses are highly reproducible in a second independent data set regardless of the number of markers. As a test exhibiting this feature to its maximum, we introduce the multimarker 2-Groups TDT (mTDT(2G)), a test which, under the hypothesis of no linkage, asymptotically follows a χ2 distribution with 1 degree of freedom regardless of the number of markers. The statistic requires the division of parental haplotypes into two groups: disease-susceptibility and disease-protective haplotype groups. We assessed the test's behavior by performing an extensive simulation study as well as a real-data study using several data sets of two complex diseases. We show that the mTDT(2G) test is highly efficient and achieves the highest power among all the tests used, even when the null hypothesis is tested in a second independent data set. Therefore, mTDT(2G) turns out to be a very promising multimarker TDT for genome-wide searches for disease susceptibility loci that may be used as a preprocessing step in the construction of more accurate genetic models to predict individual susceptibility to complex diseases.
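The 1-degree-of-freedom character of the statistic can be illustrated schematically: once parental haplotypes are divided into susceptibility (S) and protective (P) groups, transmissions from informative parents yield a McNemar-style chi-square statistic, whatever the number of markers defining the haplotypes. The counts below are invented, and the actual mTDT(2G) grouping procedure is more involved than this sketch.

```python
# Schematic 2-groups TDT-style statistic: compare transmissions of
# susceptibility-group vs protective-group haplotypes (1 df chi-square).
from scipy.stats import chi2

n_S = 74   # transmissions of a susceptibility-group haplotype (invented)
n_P = 46   # transmissions of a protective-group haplotype (invented)
stat = (n_S - n_P) ** 2 / (n_S + n_P)
p = chi2.sf(stat, df=1)
print(f"mTDT-style statistic = {stat:.2f}, p = {p:.4f}")
```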
Murray, Kris A.; Skerratt, Lee F.; Garland, Stephen; Kriticos, Darren; McCallum, Hamish
2013-01-01
The pandemic amphibian disease chytridiomycosis often exhibits strong seasonality in both prevalence and disease-associated mortality once it becomes endemic. One hypothesis that could explain this temporal pattern is that simple weather-driven pathogen proliferation (population growth) is a major driver of chytridiomycosis disease dynamics. Despite various elaborations of this hypothesis in the literature for explaining amphibian declines (e.g., the chytrid thermal-optimum hypothesis), it has not been formally tested on infection patterns in the wild. In this study we developed a simple process-based model to simulate the growth of the pathogen Batrachochytrium dendrobatidis (Bd) under varying weather conditions to provide an a priori test of a weather-linked pathogen proliferation hypothesis for endemic chytridiomycosis. We found strong support for several predictions of the proliferation hypothesis when applied to our model species, Litoria pearsoniana, sampled across multiple sites and years: the weather-driven simulations of pathogen growth potential (represented as a growth index in the 30 days prior to sampling; GI30) were positively related to both the prevalence and intensity of Bd infections, which were themselves strongly and positively correlated. In addition, a machine-learning classifier achieved ∼72% success in classifying positive qPCR results when utilising just three informative predictors: 1) GI30, 2) frog body size, and 3) rain on the day of sampling. Hence, while intrinsic traits of the individuals sampled (species, size, sex) and nuisance sampling variables (rainfall when sampling) influenced infection patterns obtained when sampling via qPCR, our results also strongly suggest that weather-linked pathogen proliferation plays a key role in the infection dynamics of endemic chytridiomycosis in our study system. Predictive applications of the model include surveillance design, outbreak preparedness and response, climate change scenario modelling and the interpretation of historical patterns of amphibian decline. PMID:23613783
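A stylized version of a weather-driven growth index like GI30 can be written down directly: daily growth potential is scored from temperature with a cardinal-temperature response curve and summed over the 30 days before sampling. The Rosso-type curve and the cardinal temperatures below are generic illustrations, not the parameters of the authors' Bd model.

```python
# Stylized weather-driven growth index: sum of daily thermal growth scores.
import numpy as np

def daily_growth(t, t_min=4.0, t_opt=21.0, t_max=28.0):
    """Rosso-type cardinal temperature response; 0 outside [t_min, t_max]."""
    if t <= t_min or t >= t_max:
        return 0.0
    num = (t - t_max) * (t - t_min) ** 2
    den = (t_opt - t_min) * ((t_opt - t_min) * (t - t_opt)
                             - (t_opt - t_max) * (t_opt + t_min - 2 * t))
    return num / den

rng = np.random.default_rng(2)
daily_temps = rng.normal(19.0, 4.0, size=30)   # 30 days of mean temperatures
gi30 = sum(daily_growth(t) for t in daily_temps)
print(f"GI30 = {gi30:.1f} (higher values -> more pathogen proliferation)")
```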
ERIC Educational Resources Information Center
Besken, Miri
2016-01-01
The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…
Adolescents' Body Image Trajectories: A Further Test of the Self-Equilibrium Hypothesis
ERIC Educational Resources Information Center
Morin, Alexandre J. S.; Maïano, Christophe; Scalas, L. Francesca; Janosz, Michel; Litalien, David
2017-01-01
The self-equilibrium hypothesis underlines the importance of having a strong core self, which is defined as a high and developmentally stable self-concept. This study tested this hypothesis in relation to body image (BI) trajectories in a sample of 1,006 adolescents (M_age = 12.6; 541 males and 465 females) across a 4-year…
ERIC Educational Resources Information Center
Trafimow, David
2017-01-01
There has been much controversy over the null hypothesis significance testing procedure, with much of the criticism centered on the problem of inverse inference. Specifically, p gives the probability of the finding (or one more extreme) given the null hypothesis, whereas the null hypothesis significance testing procedure involves drawing a…
ERIC Educational Resources Information Center
Lee, Jungmin
2016-01-01
This study tested the Bennett hypothesis by examining whether four-year colleges changed listed tuition and fees, the amount of institutional grants per student, and room and board charges after their states implemented statewide merit-based aid programs. According to the Bennett hypothesis, increases in government financial aid make it easier for…
Human female orgasm as evolved signal: a test of two hypotheses.
Ellsworth, Ryan M; Bailey, Drew H
2013-11-01
We present the results of a study designed to empirically test predictions derived from two hypotheses regarding human female orgasm behavior as an evolved communicative trait or signal. One hypothesis tested was the female fidelity hypothesis, which posits that human female orgasm signals a woman's sexual satisfaction and therefore her likelihood of future fidelity to a partner. The other was the sire choice hypothesis, which posits that women's orgasm behavior signals increased chances of fertilization. To test the two hypotheses of human female orgasm, we administered a questionnaire to 138 females and 121 males who reported that they were currently in a romantic relationship. Key predictions of the female fidelity hypothesis were not supported. In particular, orgasm was not associated with female sexual fidelity, nor was orgasm associated with male perceptions of partner sexual fidelity. However, faked orgasm was associated with female sexual infidelity and lower male relationship satisfaction. Overall, results were in greater support of the sire choice signaling hypothesis than the female fidelity hypothesis. Results also suggest that male satisfaction with, investment in, and sexual fidelity to a mate are benefits that favored the selection of orgasmic signaling in ancestral females.
Luo, Liqun; Zhao, Wei; Weng, Tangmei
2016-01-01
The Trivers-Willard hypothesis predicts that high-status parents will bias their investment to sons, whereas low-status parents will bias their investment to daughters. Among humans, tests of this hypothesis have yielded mixed results. This study tests the hypothesis using data collected among contemporary peasants in Central South China. We use current family status (rated by our informants) and father's former class identity (assigned by the Chinese Communist Party in the early 1950s) as measures of parental status, and proportion of sons in offspring and offspring's years of education as measures of parental investment. Results show that (i) those families with a higher former class identity such as landlord and rich peasant tend to have a higher socioeconomic status currently, (ii) high-status parents are more likely to have sons than daughters among their biological offspring, and (iii) in higher-status families, the years of education obtained by sons exceed that obtained by daughters to a larger extent than in lower-status families. Thus, the first assumption and the two predictions of the hypothesis are supported by this study. This article contributes a contemporary Chinese case to the testing of the Trivers-Willard hypothesis.
Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.
Ji, Ming; Xiong, Chengjie; Grundman, Michael
2003-10-01
In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative, implementing our hypothesis testing method to analyze Mini-Mental State Examination (MMSE) scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our results show that, despite a large amount of missing data, accelerated decline did occur for MMSE among AD patients. Our finding supports the clinical belief that a change point exists during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
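The testing logic can be sketched compactly: fit a constant-slope model and a bilinear (broken-stick) model with the change point profiled over a grid, then calibrate the resulting statistic by parametric bootstrap under the null. The sketch below ignores the paper's random effects and missing-data structure and uses simulated scores.

```python
# Constant-slope vs bilinear change point model, calibrated by
# parametric bootstrap under the no-change-point null.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 8, 0.5)                          # years of follow-up
y = 28 - 0.5 * t - 1.5 * np.maximum(t - 4, 0) + rng.normal(0, 1, t.size)

def rss_linear(t, y):
    X = np.column_stack([np.ones_like(t), t])
    res = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return res @ res

def rss_bilinear(t, y):
    best = np.inf                                 # profile over change points
    for tau in t[2:-2]:
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - tau, 0)])
        res = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        best = min(best, res @ res)
    return best

stat = rss_linear(t, y) / rss_bilinear(t, y)

# simulate the null: constant decline fitted to the observed data
X0 = np.column_stack([np.ones_like(t), t])
beta0 = np.linalg.lstsq(X0, y, rcond=None)[0]
sigma0 = np.sqrt(rss_linear(t, y) / (t.size - 2))
null = [rss_linear(t, yb) / rss_bilinear(t, yb)
        for yb in (X0 @ beta0 + rng.normal(0, sigma0, t.size)
                   for _ in range(500))]
print("bootstrap p-value:", np.mean(np.array(null) >= stat))
```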
Saghafi, Ramin; Kempker, Jordan A.; Schulman, David A.
2016-01-01
Rationale: Hypothesis-driven physical examination emphasizes the role of bedside examination in the refinement of differential diagnoses and improves diagnostic acumen. This approach has not yet been investigated as a tool to improve the ability of higher-level trainees to teach medical students. Objectives: To assess the effect of teaching hypothesis-driven physical diagnosis to pulmonary fellows on their ability to improve the pulmonary examination skills of first-year medical students. Methods: Fellows and students were assessed on teaching and diagnostic skills by self-rating on a Likert scale. One group of fellows received the hypothesis-driven teaching curriculum (the “intervention” group) and another received instruction on head-to-toe examination. Both groups subsequently taught physical diagnosis to a group of first-year medical students. An oral examination was administered to all students after completion of the course. Measurements and Main Results: Fellows were comfortable teaching physical diagnosis to students. Students in both groups reported a lack of comfort with the pulmonary examination at the beginning of the course and improvement in their comfort by the end. Students trained by intervention group fellows outperformed students trained by control group fellows in the interpretation of physical findings (P < 0.05). Conclusions: Teaching hypothesis-driven physical examination to higher-level trainees who teach medical students improves the ability of students to interpret physical findings. This benefit should be confirmed using validated testing tools. PMID:26730644
NASA Astrophysics Data System (ADS)
Přibil, Jiří; Přibilová, Anna; Frollo, Ivan
2017-12-01
The paper focuses on two methods for evaluating the success of enhancement of speech signals recorded in an open-air magnetic resonance imager during phonation for 3D human vocal tract modeling. The first approach provides a comparison based on statistical analysis with ANOVA and hypothesis tests. The second method is based on classification by Gaussian mixture models (GMM). The experiments confirmed that the proposed ANOVA and GMM classifiers for automatic evaluation of speech quality are functional and produce results fully comparable with standard evaluation based on listening tests.
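An illustrative stand-in for the GMM-based evaluation (with random vectors in place of real spectral features of the recordings) fits one mixture per condition and classifies held-out frames by log-likelihood:

```python
# GMM-based condition classification on placeholder feature vectors.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
feats_a = rng.normal(0.0, 1.0, size=(300, 12))   # condition A features
feats_b = rng.normal(0.7, 1.2, size=(300, 12))   # condition B features

gmm_a = GaussianMixture(n_components=4, random_state=0).fit(feats_a)
gmm_b = GaussianMixture(n_components=4, random_state=0).fit(feats_b)

test = rng.normal(0.7, 1.2, size=(50, 12))       # unseen frames from B
pred_b = gmm_b.score_samples(test) > gmm_a.score_samples(test)
print(f"classified as condition B: {pred_b.mean():.0%}")
```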
Integration of QSAR and in vitro toxicology.
Barratt, M D
1998-01-01
The principles of quantitative structure-activity relationships (QSAR) are based on the premise that the properties of a chemical are implicit in its molecular structure. Therefore, if a mechanistic hypothesis can be proposed linking a group of related chemicals with a particular toxic end point, the hypothesis can be used to define relevant parameters to establish a QSAR. Ways in which QSAR and in vitro toxicology can complement each other in development of alternatives to live animal experiments are described and illustrated by examples from acute toxicological end points. Integration of QSAR and in vitro methods is examined in the context of assessing mechanistic competence and improving the design of in vitro assays and the development of prediction models. The nature of biological variability is explored together with its implications for the selection of sets of chemicals for test development, optimization, and validation. Methods are described to support the use of data from in vivo tests that do not meet today's stringent requirements of acceptability. Integration of QSAR and in vitro methods into strategic approaches for the replacement, reduction, and refinement of the use of animals is described with examples. PMID:9599692
An experimental approach in revisiting the magnetic orientation of cattle
Weijers, Debby; Hemerik, Lia; Heitkönig, Ignas M. A.
2018-01-01
In response to the increasing number of observational studies on an apparent south-north orientation in non-homing, non-migrating terrestrial mammals, we experimentally tested the alignment hypothesis using strong neodymium magnets on the resting orientation of individual cattle in Portugal. Contrary to the hypothesis, the 34 cows in the experiment showed no directional preference, neither with nor without a strong neodymium magnet fixed to their collar. The concurrently performed 2,428 daytime observations (excluding the hottest part of the day) of 659 resting individual cattle did not show a south-north alignment at rest either. The preferred compass orientation of these cows was on average 130 degrees from magnetic north (i.e., south-east). Cow compass orientation correlated significantly with sun direction, but not with wind direction. As far as we can determine, this is the first experimental test of magnetic orientation in larger, non-homing, non-migrating mammals. These experimental and observational findings do not support previously published suggestions of magnetic south-north alignment in these mammals. PMID:29641517
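The standard tool for testing "no directional preference" in data of this kind is the Rayleigh test from circular statistics; whether the authors used this exact test is not stated here. The sketch below applies it to invented headings.

```python
# Rayleigh test for a preferred compass direction.
import numpy as np

rng = np.random.default_rng(5)
angles = np.deg2rad(rng.uniform(0, 360, size=100))   # placeholder headings

n = angles.size
R = np.hypot(np.cos(angles).sum(), np.sin(angles).sum()) / n  # mean resultant
Z = n * R ** 2
p = np.exp(-Z) * (1 + (2 * Z - Z ** 2) / (4 * n))    # standard approximation
mean_dir = np.rad2deg(np.arctan2(np.sin(angles).sum(), np.cos(angles).sum()))
print(f"mean direction = {mean_dir:.0f} deg, R = {R:.2f}, p ~ {p:.3f}")
```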
Robustness of survival estimates for radio-marked animals
Bunck, C.M.; Chen, C.-L.
1992-01-01
Telemetry techniques are often used to study the survival of birds and mammals, particularly when mark-recapture approaches are unsuitable. Both parametric and nonparametric methods to estimate survival have been developed or modified from other applications. An implicit assumption in these approaches is that the probability of re-locating an animal with a functioning transmitter is one. A Monte Carlo study was conducted to determine the bias and variance of the Kaplan-Meier estimator and of an estimator also based on the assumption of constant hazard, and to evaluate the performance of the two-sample tests associated with each. Modifications of each estimator that allow a re-location probability of less than one are described and evaluated. Generally, the unmodified estimators were biased but had lower variance. At low sample sizes, all estimators performed poorly. Under the null hypothesis, the distribution of all test statistics reasonably approximated the null distribution when survival was low but not when it was high. The power of the two-sample tests was similar.
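A bare-bones Kaplan-Meier estimator of the kind evaluated in the study is shown below for invented radio-tracking data with right censoring; the modified estimators discussed above additionally model a re-location probability below one.

```python
# Kaplan-Meier survival estimate with right-censored tracking data.
import numpy as np

days   = np.array([ 3,  5,  5,  8, 12, 12, 15, 20, 20, 25])
events = np.array([ 1,  1,  0,  1,  1,  0,  1,  1,  0,  0])  # 1=death, 0=censored

surv = 1.0
for t in np.unique(days[events == 1]):
    at_risk = np.sum(days >= t)
    deaths = np.sum((days == t) & (events == 1))
    surv *= 1 - deaths / at_risk
    print(f"day {t:2d}: at risk {at_risk:2d}, deaths {deaths}, S(t) = {surv:.3f}")
```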
Bayesian Methods for Determining the Importance of Effects
USDA-ARS?s Scientific Manuscript database
Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from the Fisher Significance Test and Hypothesis Test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...
Huang, Tao; Zhong, Linda L D; Lin, Chen-Yuan; Zhao, Ling; Ning, Zi-Wan; Hu, Dong-Dong; Zhang, Man; Tian, Ke; Cheng, Chung-Wah; Bian, Zhao-Xiang
2018-01-01
Investigating the pharmacology is key to the modernization of Chinese Medicine (CM) formulas. However, identifying the active compound(s) of a CM formula, the biological entities they target, and the signaling pathway(s) through which they act to modify disease symptoms remains a difficult task for researchers, even when equipped with an arsenal of advanced modern technologies. Multiple approaches, including network pharmacology, pharmaco-genomics, -proteomics, and -metabolomics, have been developed to study the pharmacology of CM formulas. They fall into two general categories in terms of how they tackle a problem: bottom-up and top-down. In this article, we compare these two approaches in several dimensions using the case of MaZiRenWan (MZRW, also known as Hemp Seed Pill), a CM herbal formula for functional constipation. Multiple hypotheses are easily proposed in the bottom-up approach (e.g., network pharmacology), but these hypotheses are usually false positives and hard to test. In contrast, it is hard to suggest hypotheses in the top-down approach (e.g., pharmacometabolomics); however, once a hypothesis is proposed, it is much easier to test. Merging these two approaches could result in a powerful approach, which could be the new paradigm for the pharmacological study of CM formulas.
Testing for purchasing power parity in the long-run for ASEAN-5
NASA Astrophysics Data System (ADS)
Choji, Niri Martha; Sek, Siok Kun
2017-04-01
For more than a decade, there has been substantial interest in empirically testing the validity of the purchasing power parity (PPP) hypothesis. This paper tests for long-run relative purchasing power parity for a group of ASEAN-5 countries over the period 1996-2016 using monthly data. For this purpose, we used the Pedroni co-integration method to test the long-run purchasing power parity hypothesis. We first tested the variables for stationarity and found that they are non-stationary in levels but stationary in first differences. Results of the Pedroni test rejected the null hypothesis of no co-integration, meaning that we have enough evidence to support PPP in the long run for the ASEAN-5 countries over 1996-2016. In other words, the rejection of the null hypothesis implies a long-run relation between nominal exchange rates and relative prices.
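The stationarity pre-test can be sketched with statsmodels (the Pedroni panel co-integration step itself is not available there, so only the augmented Dickey-Fuller part is shown, on a simulated random-walk series standing in for an exchange-rate or price series):

```python
# ADF unit-root test in levels and in first differences.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(6)
series = np.cumsum(rng.normal(size=250))   # I(1): non-stationary in levels

p_level = adfuller(series)[1]
p_diff = adfuller(np.diff(series))[1]
print(f"ADF p-value in levels: {p_level:.3f} (cannot reject unit root)")
print(f"ADF p-value in first differences: {p_diff:.3f} (stationary)")
```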
UNIFORMLY MOST POWERFUL BAYESIAN TESTS
Johnson, Valen E.
2014-01-01
Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829
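For the one-sided normal-mean test with known variance, the calibration can be made concrete. Assuming the UMPBT alternative mu1 = sigma*sqrt(2*log(gamma)/n), the Bayes factor reduces to exp(z*sqrt(2*log(gamma)) - log(gamma)) with z = sqrt(n)*xbar/sigma, so BF10 exceeds gamma exactly when z > sqrt(2*log(gamma)). The sketch below tabulates the p-values matching a few evidence thresholds gamma; this is a derivation under those assumptions, not a general result.

```python
# p-value <-> Bayes factor calibration for the one-sided normal-mean UMPBT.
import numpy as np
from scipy.stats import norm

for gamma in (3.0, 10.0, 25.0, 50.0):
    z_crit = np.sqrt(2 * np.log(gamma))   # BF10 > gamma  iff  z > z_crit
    p_crit = norm.sf(z_crit)              # matching one-sided p-value
    print(f"gamma = {gamma:5.1f}  ->  z = {z_crit:.2f}, p = {p_crit:.4f}")
```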
[Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].
Simmer, H H
1980-07-01
Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis, the nerve reflex starts in the ovary with an increase in intraovarian pressure caused by the growing follicles. In 1884 Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women by bimanual compression from the outside and the vagina. His results were not convincing. Six years later, Strassmann injected fluids into the ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prediction derived from Pflüger's hypothesis. Thus, after a long interval, that hypothesis had become a paradigm. Though reasons can be given for the delay, it remains poorly understood why experimental testing started so late.
Effect of salt depletion on sodium ion transport from human erythrocytes.
Krzesinski, J M; Rorive, G L
1985-01-01
This study was performed to test De Wardener's hypothesis of a circulating plasma natriuretic factor in essential hypertension. Using an indirect approach, modifying plasma volume and the sodium pool in chronic renal failure and primary hypertension while simultaneously measuring variations in ionic flux in human erythrocytes, we provide some evidence for the presence of such a factor in hydrosaline-overloaded uraemic patients and in "salt-sensitive" essential hypertensive subjects.
Testing fundamental ecological concepts with a Pythium-Prunus pathosystem
USDA-ARS?s Scientific Manuscript database
The study of plant-pathogen interactions has enabled tests of basic ecological concepts on plant community assembly (Janzen-Connell Hypothesis) and plant invasion (Enemy Release Hypothesis). We used a field experiment to (#1) test whether Pythium effects depended on host (seedling) density and/or d...
A checklist to facilitate objective hypothesis testing in social psychology research.
Washburn, Anthony N; Morgan, G Scott; Skitka, Linda J
2015-01-01
Social psychology is not a very politically diverse area of inquiry, something that could negatively affect the objectivity of social psychological theory and research, as Duarte et al. argue in the target article. This commentary offers a number of checks to help researchers uncover possible biases and identify when they are engaging in hypothesis confirmation and advocacy instead of hypothesis testing.
Nan Liu; Hai Ren; Sufen Yuan; Qinfeng Guo; Long Yang
2013-01-01
The relative importance of facilitation and competition between pairwise plants across abiotic stress gradients as predicted by the stress-gradient hypothesis has been confirmed in arid and temperate ecosystems, but the hypothesis has rarely been tested in tropical systems, particularly across nutrient gradients. The current research examines the interactions between a...
Phase II Clinical Trials: D-methionine to Reduce Noise-Induced Hearing Loss
2012-03-01
…noise-induced hearing loss (NIHL) and tinnitus in our troops. Hypotheses: Primary Hypothesis: Administration of oral D-methionine prior to and during weapons… reduce or prevent noise-induced tinnitus. Primary outcome to test the primary hypothesis: pure-tone air-conduction thresholds. Primary outcome to… test the secondary hypothesis: tinnitus questionnaires. Specific Aims: 1. To determine whether administering oral D-methionine (D-met) can…
ERIC Educational Resources Information Center
Ng, Chi-hung Clarence
2014-01-01
Academic self-schemas are important cognitive frames capable of guiding students' learning engagement. Using a cohort of Year 10 Australian students, this longitudinal study examined the self-congruence engagement hypothesis which maintains that there is a close relationship among academic self-schemas, achievement goals, learning approaches,…
An omnibus test for the global null hypothesis.
Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja
2018-01-01
Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses but in testing whether none of the hypotheses is false. There are several possibilities for testing the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g., the Bonferroni or Simes test). However, there is usually no a priori knowledge of the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R package called omnibus.
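One schematic reading of the proposal (not the exact statistic of the omnibus package) forms cumulative sums of the -log-transformed, sorted p-values, takes the most extreme scaled partial sum, and calibrates it by Monte Carlo under the global null:

```python
# Omnibus-style global null test via cumulative sums of transformed p-values.
import numpy as np

rng = np.random.default_rng(7)

def omnibus_stat(p):
    s = np.cumsum(-np.log(np.sort(p)))   # partial Fisher-type sums
    k = np.arange(1, p.size + 1)
    # crude scaling only; the actual calibration is the Monte Carlo step below
    return np.max((s - k) / np.sqrt(k))

m = 20
obs = np.concatenate([[1e-4], rng.uniform(size=m - 1)])  # one false null
stat = omnibus_stat(obs)
null = np.array([omnibus_stat(rng.uniform(size=m)) for _ in range(5000)])
print("global-null p-value:", np.mean(null >= stat))
```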
Explorations in Statistics: Hypothesis Tests and P Values
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…
Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment
ERIC Educational Resources Information Center
Frane, Andrew V.
2015-01-01
Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…
ERIC Educational Resources Information Center
Malda, Maike; van de Vijver, Fons J. R.; Temane, Q. Michael
2010-01-01
In this study, cross-cultural differences in cognitive test scores are hypothesized to depend on a test's cultural complexity (Cultural Complexity Hypothesis: CCH), here conceptualized as its content familiarity, rather than on its cognitive complexity (Spearman's Hypothesis: SH). The content familiarity of tests assessing short-term memory,…
Prenatal nutrition, epigenetics and schizophrenia risk: can we test causal effects?
Kirkbride, James B; Susser, Ezra; Kundakovic, Marija; Kresovich, Jacob K; Davey Smith, George; Relton, Caroline L
2012-06-01
We posit that maternal prenatal nutrition can influence offspring schizophrenia risk via epigenetic effects. In this article, we consider evidence that prenatal nutrition is linked to epigenetic outcomes in offspring and schizophrenia in offspring, and that schizophrenia is associated with epigenetic changes. We focus upon one-carbon metabolism as a mediator of the pathway between perturbed prenatal nutrition and the subsequent risk of schizophrenia. Although post-mortem human studies demonstrate DNA methylation changes in brains of people with schizophrenia, such studies cannot establish causality. We suggest a testable hypothesis that utilizes a novel two-step Mendelian randomization approach, to test the component parts of the proposed causal pathway leading from prenatal nutritional exposure to schizophrenia. Applied here to a specific example, such an approach is applicable for wider use to strengthen causal inference of the mediating role of epigenetic factors linking exposures to health outcomes in population-based studies.
Rosenlund, Signe; Broeng, Leif; Overgaard, Søren; Jensen, Carsten; Holsgaard-Larsen, Anders
2016-11-01
The lateral and the posterior approach are the most commonly used procedures for total hip arthroplasty. Due to the detachment of the hip abductors, the lateral approach is claimed to cause reduced hip muscle strength and an altered gait pattern. However, this has not been investigated in a randomised controlled trial. The aim was to compare the efficacy of total hip arthroplasty performed by the lateral or the posterior approach on gait function and hip muscle strength up to 12 months post-operatively. We hypothesised that the posterior approach would be superior to the lateral approach. Forty-seven patients with primary hip osteoarthritis were randomised to total hip arthroplasty with either the posterior or the lateral approach and evaluated pre-operatively and 3 and 12 months post-operatively using 3-dimensional gait analyses as objective measures of gait function, including the Gait Deviation Index, temporo-spatial parameters and range of motion. Isometric maximal hip muscle strength in abduction, flexion and extension was also tested. Post-operatively, no between-group difference in gait function was observed. However, both hip abductor and flexor muscle strength improved more in the posterior approach group: -0.20 Nm/kg [95% CI: -0.4 to 0.0] and -0.20 Nm/kg [95% CI: -0.4 to 0.0], respectively. Contrary to our first hypothesis, overall gait function in the posterior approach group did not improve more than in the lateral approach group. However, in agreement with our second hypothesis, patients in the posterior approach group improved more in hip abductor and flexor muscle strength at 12 months. Further investigation of the effect of reduced maximal hip muscle strength on functional capacity is needed. ClinicalTrials.gov No.: NCT01616667. Copyright © 2016 Elsevier Ltd. All rights reserved.
Is it better to select or to receive? Learning via active and passive hypothesis testing.
Markant, Douglas B; Gureckis, Todd M
2014-02-01
People can test hypotheses through either selection or reception. In a selection task, the learner actively chooses observations to test his or her beliefs, whereas in reception tasks data are passively encountered. People routinely use both forms of testing in everyday life, but the critical psychological differences between selection and reception learning remain poorly understood. One hypothesis is that selection learning improves learning performance by enhancing generic cognitive processes related to motivation, attention, and engagement. Alternatively, we suggest that differences between these 2 learning modes derive from a hypothesis-dependent sampling bias that is introduced when a person collects data to test his or her own individual hypothesis. Drawing on influential models of sequential hypothesis-testing behavior, we show that such a bias (a) can lead to the collection of data that facilitates learning compared with reception learning and (b) can be more effective than observing the selections of another person. We then report a novel experiment based on a popular category learning paradigm that compares reception and selection learning. We additionally compare selection learners to a set of "yoked" participants who viewed the exact same sequence of observations under reception conditions. The results revealed systematic differences in performance that depended on the learner's role in collecting information and the abstract structure of the problem.
Direct multitrait selection realizes the highest genetic response for ratio traits.
Zetouni, L; Henryon, M; Kargo, M; Lassen, J
2017-05-01
For a number of traits the phenotype considered to be the goal trait is a combination of 2 or more traits, such as methane (CH4) emission per unit of product (CH4/kg of milk). Direct selection on CH4 emission defined as a ratio is problematic, because it is uncertain whether the improvement comes from an improvement in milk yield, a decrease in CH4 emission, or both. The goal was to test different strategies for selecting on 2 antagonistic traits: improving milk yield while decreasing methane emission. The hypothesis was that, to maximize genetic gain for a ratio trait, the best approach is to select directly on the component traits rather than using the ratio trait, or one trait corrected for the other, as the selection criterion. Stochastic simulation was used to mimic a dairy cattle population. Three scenarios were tested, which differed in selection criteria but all selected for increased milk yield: 1) selection based on a multitrait approach using the correlation structure between the 2 traits, 2) selection on the ratio of methane to milk, and 3) selection on gross methane phenotypically corrected for milk. Four correlation sets were tested in all scenarios to assess the robustness of the results. An average genetic gain of 66 kg of milk per yr was obtained in all scenarios, but scenario 1 had the best response for decreased methane emission, with a genetic gain of 24.8 l/yr, while scenarios 2 and 3 had genetic gains of 27.1 and 27.3 kg/yr. The results were persistent across correlation sets. These results confirm the hypothesis that, to obtain the highest genetic gain, multitrait selection is a better approach than selecting on the ratio directly. The results are exemplified for a methane and milk scenario but can be generalized to other situations where combined traits need to be improved.
Does Testing Increase Spontaneous Mediation in Learning Semantically Related Paired Associates?
ERIC Educational Resources Information Center
Cho, Kit W.; Neely, James H.; Brennan, Michael K.; Vitrano, Deana; Crocco, Stephanie
2017-01-01
Carpenter (2011) argued that the testing effect she observed for semantically related but associatively unrelated paired associates supports the mediator effectiveness hypothesis. This hypothesis asserts that after the cue-target pair "mother-child" is learned, relative to restudying mother-child, a review test in which…
Potter, Timothy; Corneille, Olivier; Ruys, Kirsten I; Rhodes, Ginwan
2007-04-01
Findings on both attractiveness and memory for faces suggest that people should perceive more similarity among attractive than among unattractive faces. A multidimensional scaling approach was used to test this hypothesis in two studies. In Study 1, we derived a psychological face space from similarity ratings of attractive and unattractive Caucasian female faces. In Study 2, we derived a face space for attractive and unattractive male faces of Caucasians and non-Caucasians. Both studies confirm that attractive faces are indeed more tightly clustered than unattractive faces in people's psychological face spaces. These studies provide direct and original support for theoretical assumptions previously made in the face space and face memory literatures.
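The analysis pattern is easy to reproduce in outline: embed the faces from a dissimilarity matrix with metric MDS and compare how tightly each group clusters around its centroid. The sketch below uses random placeholder dissimilarities in place of participants' similarity ratings.

```python
# MDS face-space sketch: compare group spread around centroids.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(8)
n = 20                                   # e.g., 10 attractive + 10 unattractive
d = rng.uniform(1, 7, size=(n, n))       # placeholder dissimilarity ratings
d = (d + d.T) / 2
np.fill_diagonal(d, 0.0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(d)

def spread(pts):
    return np.linalg.norm(pts - pts.mean(axis=0), axis=1).mean()

print("mean distance to centroid, group 1 vs group 2:",
      round(spread(coords[:10]), 2), round(spread(coords[10:]), 2))
```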
Hatori, Tsuyoshi; Takemura, Kazuhisa; Fujii, Satoshi; Ideno, Takashi
2011-06-01
This paper presents a new model of category judgment. The model hypothesizes that, when more attention is focused on a category, the psychological range of the category gets narrower (category-focusing hypothesis). We explain this hypothesis by using the metaphor of a "mental-box" model: the more attention that is focused on a mental box (i.e., a category set), the smaller the size of the box becomes (i.e., a cardinal number of the category set). The hypothesis was tested in an experiment (N = 40), where the focus of attention on prescribed verbal categories was manipulated. The obtained data gave support to the hypothesis: category-focusing effects were found in three experimental tasks (regarding the category of "food", "height", and "income"). The validity of the hypothesis was discussed based on the results.
Biesta-Peters, Elisabeth G.; Reij, Martine W.; Zwietering, Marcel H.; Gorris, Leon G. M.
2011-01-01
This research aims to test the absence (gamma hypothesis) or occurrence of synergy between two growth-limiting factors, i.e., pH and water activity (aw), using a systematic approach for model selection. In this approach, preset criteria were used to evaluate the performance of models. Such a systematic approach is required to be confident in the correctness of the individual components of the combined (synergy) models. With Bacillus cereus F4810/72 as the test organism, estimated growth boundaries for the aw-lowering solutes NaCl, KCl, and glucose were 1.13 M, 1.13 M, and 1.68 M, respectively. The accompanying aw values were 0.954, 0.956, and 0.961, respectively, indicating that equal aw values result in similar effects on growth. Out of the 12 models evaluated using the preset criteria, the model of J. H. T. Luong (Biotechnol. Bioeng. 27:280–285, 1985) was the best model to describe the effect of aw on growth. This aw model and the previously selected pH model were combined into a gamma model and into two synergy models. None of the three models was able to describe the combined pH and aw conditions sufficiently well to satisfy the preset criteria. The best matches between predicted and experimental data were obtained with the gamma model, followed by the synergy model of Y. Le Marc et al. (Int. J. Food Microbiol. 73:219–237, 2002). No combination of models that was able to predict the impact of both individual and combined hurdles correctly could be found. Consequently, in this case we could not prove the existence of synergy nor falsify the gamma hypothesis. PMID:21705525
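The gamma hypothesis itself is simple to express: the relative growth rate is a pure product of one term per hurdle, with no interaction term. The sub-models and parameter values below are generic illustrations, not the fitted models of the study.

```python
# Gamma-model sketch: multiplicative, interaction-free hurdle terms.
def gamma_pH(pH, pH_min=4.9, pH_opt=7.0):
    g = (pH - pH_min) / (pH_opt - pH_min)
    return max(0.0, min(1.0, g)) ** 2          # simple quadratic sub-model

def gamma_aw(aw, aw_min=0.95):
    return max(0.0, (aw - aw_min) / (1.0 - aw_min))

def relative_growth_rate(pH, aw):
    return gamma_pH(pH) * gamma_aw(aw)         # gamma hypothesis: pure product

print(relative_growth_rate(6.0, 0.97))   # both hurdles partly limiting
print(relative_growth_rate(5.0, 0.955))  # near both boundaries -> ~0
```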
Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty
NASA Astrophysics Data System (ADS)
Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang
2016-12-01
Centrifugal compressors often suffer various defects such as impeller cracking, resulting in forced outages of the total plant. Damage diagnostics and condition monitoring of such turbomachinery systems have become increasingly important and powerful tools to prevent potential failures in components and reduce unplanned forced outages and further maintenance costs, while improving the reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients. These coefficients represent multiple time-frequency resolutions of a signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficients to remove possible imperfections. The ratio-of-posterior-odds Bayesian approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density estimated by the Welch method is utilized to evaluate the effectiveness of the Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce the dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework are demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration signals and principal components.
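A simplified pipeline in this spirit (PyWavelets and scikit-learn as stand-ins, with soft thresholding in place of the Bayesian hypothesis test on each level and plain PCA in place of the probabilistic variant) might look like this on synthetic vibration channels:

```python
# Wavelet-packet denoising followed by PCA over multiple channels.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
t = np.linspace(0, 1, 1024, endpoint=False)
signals = np.stack([np.sin(2 * np.pi * f * t) + 0.5 * rng.normal(size=t.size)
                    for f in (50, 55, 120)])    # three synthetic channels

def denoise(x, wavelet="db4", level=3, thr=0.4):
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, maxlevel=level)
    for node in wp.get_level(level, order="natural"):
        node.data = pywt.threshold(node.data, thr, mode="soft")
    return wp.reconstruct(update=False)[: x.size]

cleaned = np.stack([denoise(x) for x in signals])
scores = PCA(n_components=2).fit_transform(cleaned.T)   # time x components
print("principal-component score matrix:", scores.shape)
```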
Su, Zhong; Zhang, Lisha; Ramakrishnan, V; Hagan, Michael; Anscher, Mitchell
2011-05-01
To evaluate both the Calypso System's (Calypso Medical Technologies, Inc., Seattle, WA) localization accuracy in the presence of wireless metal-oxide-semiconductor field-effect transistor (MOSFET) dosimeters of the dose verification system (DVS, Sicel Technologies, Inc., Morrisville, NC) and the dosimeters' reading accuracy in the presence of wireless electromagnetic transponders inside a phantom. A custom-made, solid-water phantom was fabricated with space for transponders and dosimeters. Two inserts were machined with positioning grooves precisely matching the dimensions of the transponders and dosimeters and were arranged in orthogonal and parallel orientations, respectively. To test the transponder localization accuracy with/without the presence of dosimeters (hypothesis 1), multivariate analyses were performed on transponder-derived localization data with and without dosimeters at each preset distance to detect statistically significant localization differences between the control and test sets. To test dosimeter dose-reading accuracy with/without the presence of transponders (hypothesis 2), an approach of alternating the transponder presence in seven identical fraction dose (100 cGy) deliveries and measurements was implemented. Two-way analysis of variance was performed to examine statistically significant dose-reading differences between the two groups and the different fractions. A relative-dose analysis method was also used to evaluate the transponder impact on dose-reading accuracy after the dose-fading effect was removed by a second-order polynomial fit. Multivariate analysis indicated that hypothesis 1 was false; there was a statistically significant difference between the localization data from the control and test sets. However, the upper and lower bounds of the 95% confidence intervals of the localized positional differences between the control and test sets were less than 0.1 mm, which is significantly smaller than the minimum clinical localization resolution of 0.5 mm. For hypothesis 2, analysis of variance indicated that there was no statistically significant difference between the dosimeter readings with and without the presence of transponders. Both orthogonal and parallel configurations had differences of polynomial-fit dose to measured dose values within 1.75%. The phantom study indicated that the Calypso System's localization accuracy was not clinically affected by the presence of DVS wireless MOSFET dosimeters, and the dosimeter-measured doses were not affected by the presence of transponders. Thus, the same patients could be implanted with both transponders and dosimeters to benefit from the improved accuracy of radiotherapy treatments offered by conjunctional use of the two systems.
Meta-analysis of laparoscopic versus open repair of perforated peptic ulcer.
Antoniou, Stavros A; Antoniou, George A; Koch, Oliver O; Pointner, Rudolph; Granderath, Frank A
2013-01-01
Laparoscopic treatment of perforated peptic ulcer (PPU) has been introduced as an alternative procedure to open surgery. It has been postulated that the minimally invasive approach involves less operative stress and results in decreased morbidity and mortality. We conducted a meta-analysis of randomized trials to test this hypothesis. Medline, EMBASE, and the Cochrane Central Register of Randomized Trials databases were searched, with no date or language restrictions. Our literature search identified 4 randomized trials, with a cumulative number of 289 patients, that compared the laparoscopic approach with open sutured repair of perforated ulcer. Analysis of outcomes did not favor either approach in terms of morbidity, mortality, and reoperation rate, although odds ratios seemed to consistently support the laparoscopic approach. Results did not determine the comparative efficiency and safety of laparoscopic or open approach for PPU. In view of an increased interest in the laparoscopic approach, further randomized trials are considered essential to determine the relative effectiveness of laparoscopic and open repair of PPU.
Approach to cerebrospinal fluid (CSF) biomarker discovery and evaluation in HIV infection.
Price, Richard W; Peterson, Julia; Fuchs, Dietmar; Angel, Thomas E; Zetterberg, Henrik; Hagberg, Lars; Spudich, Serena; Smith, Richard D; Jacobs, Jon M; Brown, Joseph N; Gisslen, Magnus
2013-12-01
Central nervous system (CNS) infection is a nearly universal facet of systemic HIV infection that varies in character and neurological consequences. While clinical staging and neuropsychological test performance have been helpful in evaluating patients, cerebrospinal fluid (CSF) biomarkers present a valuable and objective approach to more accurate diagnosis, assessment of treatment effects and understanding of evolving pathobiology. We review some lessons from our recent experience with CSF biomarker studies. We have used two approaches to biomarker analysis: targeted, hypothesis-driven and non-targeted exploratory discovery methods. We illustrate the first with data from a cross-sectional study of defined subject groups across the spectrum of systemic and CNS disease progression and the second with a longitudinal study of the CSF proteome in subjects initiating antiretroviral treatment. Both approaches can be useful and, indeed, complementary. The first is helpful in assessing known or hypothesized biomarkers while the second can identify novel biomarkers and point to broad interactions in pathogenesis. Common to both is the need for well-defined samples and subjects that span a spectrum of biological activity and biomarker concentrations. Previously-defined guide biomarkers of CNS infection, inflammation and neural injury are useful in categorizing samples for analysis and providing critical biological context for biomarker discovery studies. CSF biomarkers represent an underutilized but valuable approach to understanding the interactions of HIV and the CNS and to more objective diagnosis and assessment of disease activity. Both hypothesis-based and discovery methods can be useful in advancing the definition and use of these biomarkers.
Corti, Daniele; Galbiati, Valentina; Gatti, Nicolò; Marinovich, Marina; Galli, Corrado L; Corsini, Emanuela
2015-10-01
Despite the important impact of systemic hypersensitivity induced by pharmaceuticals, no reliable preclinical approaches are available for this endpoint. We previously established an in vitro test to identify contact and respiratory allergens based on interleukin-8 (IL-8) production in THP-1 cells. Here, we challenged it with the identification of pharmaceuticals associated with systemic hypersensitivity reactions, on the idea that drug sensitizers share common mechanisms of cell activation. Cells were exposed to drugs associated with systemic hypersensitivity reactions (streptozotocin, sulfamethoxazole, neomycin, probenecid, clonidine, procainamide, ofloxacin, methyl salicylate), while metformin was used as the negative control drug. Unlike chemical allergens, the drugs tested were well tolerated, with no signs of cytotoxicity up to 1-2 mg/ml, except for clonidine and probenecid. The THP-1 activation assay was adjusted, and conditions that allow identification of all sensitizing drugs tested were established. Next, using streptozotocin and selective inhibitors of PKC-β and p38 MAPK, two pathways involved in chemical allergen-induced cell activation, we tested the hypothesis that similar pathways were also involved in drug-induced IL-8 production and CD86 upregulation. Results indicated that drugs and chemical allergens share similar activation pathways. Finally, we formulated a structure-activity hypothesis related to hypersensitivity reactions, attempting to identify the structural features that may be involved in immune-mediated adverse reactions. Copyright © 2015 Elsevier Ltd. All rights reserved.
A probabilistic method for testing and estimating selection differences between populations
He, Yungang; Wang, Minxian; Huang, Xin; Li, Ran; Xu, Hongyang; Xu, Shuhua; Jin, Li
2015-01-01
Human populations around the world encounter various environmental challenges and, consequently, develop genetic adaptations to different selection forces. Identifying the differences in natural selection between populations is critical for understanding the roles of specific genetic variants in evolutionary adaptation. Although numerous methods have been developed to detect genetic loci under recent directional selection, a probabilistic solution for testing and quantifying selection differences between populations is lacking. Here we report the development of a probabilistic method for testing and estimating selection differences between populations. By use of a probabilistic model of genetic drift and selection, we showed that logarithm odds ratios of allele frequencies provide estimates of the differences in selection coefficients between populations. The estimates approximate a normal distribution, and variance can be estimated using genome-wide variants. This allows us to quantify differences in selection coefficients and to determine the confidence intervals of the estimate. Our work also revealed the link between genetic association testing and hypothesis testing of selection differences. It therefore supplies a solution for hypothesis testing of selection differences. This method was applied to a genome-wide data analysis of Han and Tibetan populations. The results confirmed that both the EPAS1 and EGLN1 genes are under statistically different selection in Han and Tibetan populations. We further estimated differences in the selection coefficients for genetic variants involved in melanin formation and determined their confidence intervals between continental population groups. Application of the method to empirical data demonstrated the outstanding capability of this novel approach for testing and quantifying differences in natural selection. PMID:26463656
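The core computation described here is compact enough to sketch. The snippet below is a minimal illustration, not the authors' implementation: the neutral-variant frequencies are simulated stand-ins for the genome-wide variants used to estimate the drift variance, and the scaling of the log odds ratio to a per-generation selection-coefficient difference is omitted.

```python
# Minimal sketch of the log-odds-ratio test for selection differences.
# Frequencies are invented; a real analysis would use observed allele
# frequencies and the paper's model-based variance, not this plug-in estimate.
import numpy as np
from scipy import stats

def log_odds_ratio(p1, p2):
    """Log odds ratio of allele frequencies between two populations."""
    return np.log(p1 / (1 - p1)) - np.log(p2 / (1 - p2))

rng = np.random.default_rng(0)
# Genome-wide (mostly neutral) variants: estimate the null variance from drift.
neutral_p1 = rng.uniform(0.05, 0.95, 10_000)
neutral_p2 = np.clip(neutral_p1 + rng.normal(0, 0.05, 10_000), 0.01, 0.99)
null_var = np.var(log_odds_ratio(neutral_p1, neutral_p2))

# Candidate variant: frequencies in the two populations being compared.
lor = log_odds_ratio(0.85, 0.40)
z = lor / np.sqrt(null_var)
p_value = 2 * stats.norm.sf(abs(z))
half = 1.96 * np.sqrt(null_var)
print(f"log OR = {lor:.2f}, z = {z:.2f}, p = {p_value:.2e}, "
      f"95% CI = ({lor - half:.2f}, {lor + half:.2f})")
```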
Fan, Jung-Wei; Lussier, Yves A
2017-01-01
Dietary supplements remain a relatively underexplored source for drug repurposing. A systematic approach to soliciting responses from a large consumer population is desirable to speed up innovation. We tested a workflow that mines unexpected benefits of dietary supplements from massive consumer reviews. A (non-exhaustive) list of regular expressions was used to screen over 2 million reviews on health and personal care products. The matched reviews were manually analyzed, and one supplement-disease pair was linked to biological databases for enriching the hypothesized association. The regular expressions found 169 candidate reviews, of which 45.6% described unexpected benefits of certain dietary supplements. The manual analysis showed some of the supplement-disease associations to be novel or in agreement with evidence published later in the literature. The hypothesis enrichment was able to identify meaningful function similarity between the supplement and the disease. The results demonstrated the value of the workflow in identifying candidates for supplement repurposing.
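As a rough illustration of the screening step, the sketch below applies a few invented regular expressions to toy review text; the paper's actual (non-exhaustive) pattern list is not given in the abstract, so these patterns and example reviews are hypothetical.

```python
# Minimal sketch of regex-based screening for "unexpected benefit" language.
import re

patterns = [
    re.compile(r"\bunexpected(ly)?\b.*\b(help|improv|benefit)", re.I),
    re.compile(r"\b(surprising|to my surprise)\b.*\b(relie|better)\b", re.I),
    re.compile(r"\bbought (it|this) for\b.*\bbut\b.*\balso\b", re.I),
]

reviews = [
    "Bought this fish oil for my heart, but it also unexpectedly improved my dry eyes.",
    "Great price, fast shipping.",
]

# Flag reviews matching any pattern for subsequent manual analysis.
candidates = [r for r in reviews if any(p.search(r) for p in patterns)]
print(candidates)
```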
Emotional facilitation of sensory processing in the visual cortex.
Schupp, Harald T; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O
2003-01-01
A key function of emotion is the preparation for action. However, organization of successful behavioral strategies depends on efficient stimulus encoding. The present study tested the hypothesis that perceptual encoding in the visual cortex is modulated by the emotional significance of visual stimuli. Event-related brain potentials were measured while subjects viewed pleasant, neutral, and unpleasant pictures. Early selective encoding of pleasant and unpleasant images was associated with a posterior negativity, indicating primary sources of activation in the visual cortex. The study also replicated previous findings in that affective cues also elicited enlarged late positive potentials, indexing increased stimulus relevance at higher-order stages of stimulus processing. These results support the hypothesis that sensory encoding of affective stimuli is facilitated implicitly by natural selective attention. Thus, the affect system not only modulates motor output (i.e., favoring approach or avoidance dispositions), but already operates at an early level of sensory encoding.
Brady, B A; Tucker, C M; Alfino, P A; Tarrant, D G; Finlayson, G C
1997-01-01
This research tested the hypothesis that fluid adherence (i.e. mean weekend interdialysis fluid weight gain) among adult chronic hemodialysis patients would have significant associations with fluid adherence efficacy expectation, fluid adherence outcome expectation, and fluid adherence motivation. The association of these variables with patients' medical characteristics was also examined. Results provide partial support for the hypothesis. Fluid adherence efficacy expectation was found to be a significant predictor of mean weekend interdialysis fluid weight gain (fluid adherence). Patients with higher fluid adherence efficacy expectations had lower mean weekend interdialysis fluid weight gains. However, fluid adherence outcome expectation and fluid adherence motivation were not found to be significant predictors of fluid adherence. Results also revealed that certain of the investigated medical characteristics were significantly associated with mean weekend interdialysis fluid weight gain and fluid adherence efficacy expectation. Implications for studying and modifying fluid adherence among hemodialysis patients are discussed.
Reward Motivation Enhances Task Coding in Frontoparietal Cortex
Etzel, Joset A.; Cole, Michael W.; Zacks, Jeffrey M.; Kay, Kendrick N.; Braver, Todd S.
2016-01-01
Reward motivation often enhances task performance, but the neural mechanisms underlying such cognitive enhancement remain unclear. Here, we used a multivariate pattern analysis (MVPA) approach to test the hypothesis that motivation-related enhancement of cognitive control results from improved encoding and representation of task set information. Participants underwent two fMRI sessions of cued task switching, the first under baseline conditions, and the second with randomly intermixed reward incentive and no-incentive trials. Information about the upcoming task could be successfully decoded from cue-related activation patterns in a set of frontoparietal regions typically associated with task control. More critically, MVPA classifiers trained on the baseline session had significantly higher decoding accuracy on incentive than non-incentive trials, with decoding improvement mediating reward-related enhancement of behavioral performance. These results strongly support the hypothesis that reward motivation enhances cognitive control, by improving the discriminability of task-relevant information coded and maintained in frontoparietal brain regions. PMID:25601237
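The cross-session decoding logic can be sketched in a few lines. The code below uses synthetic voxel patterns and a linear SVM as a stand-in for the study's MVPA pipeline; the data shapes, noise levels, and classifier choice are all illustrative assumptions.

```python
# Minimal sketch: train a task decoder on baseline-session patterns, then
# compare decoding accuracy on incentive vs. no-incentive trials. The
# "sharper coding" under incentive is built into the synthetic noise levels.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels = 200, 50
task_labels = rng.integers(0, 2, n_trials)                 # which task was cued
signal = np.outer(task_labels - 0.5, rng.normal(0, 1, n_voxels))

baseline = signal + rng.normal(0, 2.0, (n_trials, n_voxels))     # session 1
incentive = signal + rng.normal(0, 1.2, (n_trials, n_voxels))    # session 2
no_incentive = signal + rng.normal(0, 2.0, (n_trials, n_voxels)) # session 2

clf = LinearSVC().fit(baseline, task_labels)   # train on baseline session only
print("incentive accuracy:   ", clf.score(incentive, task_labels))
print("no-incentive accuracy:", clf.score(no_incentive, task_labels))
```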
Deciphering the crowd: modeling and identification of pedestrian group motion.
Yücel, Zeynep; Zanlungo, Francesco; Ikeda, Tetsushi; Miyashita, Takahiro; Hagita, Norihiro
2013-01-14
Associating attributes to pedestrians in a crowd is relevant for various areas like surveillance, customer profiling and service providing. The attributes of interest greatly depend on the application domain and might involve such social relations as friends or family as well as the hierarchy of the group including the leader or subordinates. Nevertheless, the complex social setting inherently complicates this task. We attack this problem by exploiting the small group structures in the crowd. The relations among individuals and their peers within a social group are reliable indicators of social attributes. To that end, this paper identifies social groups based on explicit motion models integrated through a hypothesis testing scheme. We develop two models relating positional and directional relations. A pair of pedestrians is identified as belonging to the same group or not by utilizing the two models in parallel, which defines a compound hypothesis testing scheme. By testing the proposed approach on three datasets with different environmental properties and group characteristics, it is demonstrated that we achieve an identification accuracy of 87% to 99%. The contribution of this study lies in its definition of positional and directional relation models, its description of compound evaluations, and the resolution of ambiguities with our proposed uncertainty measure based on the local and global indicators of group relation.
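A minimal sketch of the compound test might look as follows, assuming simple Gaussian positional and directional models evaluated in parallel; the distributional forms, parameter values, and the both-must-agree fusion rule are illustrative guesses, not the paper's calibrated models.

```python
# Minimal sketch of a compound hypothesis test for pair-wise group membership.
import numpy as np
from scipy import stats

def same_group_llr(d, dtheta,
                   d_group=(0.75, 0.25), d_free=(2.5, 1.5),
                   th_group=(0.0, 0.2), th_free=(0.0, 1.2)):
    """Log-likelihood ratios (group vs. not) for inter-pedestrian distance d
    (m) and heading difference dtheta (rad), one per model."""
    llr_pos = stats.norm.logpdf(d, *d_group) - stats.norm.logpdf(d, *d_free)
    llr_dir = stats.norm.logpdf(dtheta, *th_group) - stats.norm.logpdf(dtheta, *th_free)
    return llr_pos, llr_dir

llr_pos, llr_dir = same_group_llr(d=0.8, dtheta=0.1)
# Compound rule: declare "same group" only if both models agree.
print(llr_pos > 0 and llr_dir > 0)
```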
Neuropathic pain screening questionnaires have limited measurement properties. A systematic review.
Mathieson, Stephanie; Maher, Christopher G; Terwee, Caroline B; Folly de Campos, Tarcisio; Lin, Chung-Wei Christine
2015-08-01
The Douleur Neuropathique 4 (DN4), ID Pain, Leeds Assessment of Neuropathic Symptoms and Signs (LANSS), PainDETECT, and Neuropathic Pain Questionnaire have been recommended as screening questionnaires for neuropathic pain. This systematic review aimed to evaluate the measurement properties (e.g., criterion validity and reliability) of these questionnaires. Online database searches were conducted and two independent reviewers screened studies and extracted data. Methodological quality of included studies and the measurement properties were assessed against established criteria. A modified Grading of Recommendations Assessment, Development and Evaluation approach was used to summarize the level of evidence. Thirty-seven studies were included. Most studies recruited participants from pain clinics. The original version of the DN4 (French) and Neuropathic Pain Questionnaire (English) had the greatest number of satisfactory measurement properties. The ID Pain (English) demonstrated satisfactory hypothesis testing and reliability, but all other properties tested were unsatisfactory. The LANSS (English) was unsatisfactory for all properties, except specificity. The PainDETECT (English) demonstrated satisfactory hypothesis testing and criterion validity. In general, the cross-cultural adaptations had less evidence than the original versions. Overall, the DN4 and Neuropathic Pain Questionnaire were most suitable for clinical use. These screening questionnaires should not replace a thorough clinical assessment. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
Gottscho, Andrew D; Marks, Sharyn B; Jennings, W Bryan
2014-01-01
The North American deserts were impacted by both Neogene plate tectonics and Quaternary climatic fluctuations, yet it remains unclear how these events influenced speciation in this region. We tested published hypotheses regarding the timing and mode of speciation, population structure, and demographic history of the Mojave Fringe-toed Lizard (Uma scoparia), a sand dune specialist endemic to the Mojave Desert of California and Arizona. We sampled 109 individual lizards representing 22 insular dune localities, obtained DNA sequences for 14 nuclear loci, and found that U. scoparia has low genetic diversity relative to the U. notata species complex, comparable to that of chimpanzees and southern elephant seals. Analyses of genotypes using Bayesian clustering algorithms did not identify discrete populations within U. scoparia. Using isolation-with-migration (IM) models and a novel coalescent-based hypothesis testing approach, we estimated that U. scoparia diverged from U. notata in the Pleistocene epoch. The likelihood ratio test and the Akaike Information Criterion consistently rejected nested speciation models that included parameters for migration and population growth of U. scoparia. We reject the Neogene vicariance hypothesis for the speciation of U. scoparia and define this species as a single evolutionarily significant unit for conservation purposes. PMID:25360285
NASA Astrophysics Data System (ADS)
Zakiya, Hanifah; Sinaga, Parlindungan; Hamidah, Ida
2017-05-01
The results of field studies showed that students' science literacy was still low. One root of the problem lies in the textbooks used in learning, which are not oriented toward the components of science literacy. This study focused on the effectiveness of textbooks designed to build science literacy using multimodal representation. The textbooks were developed using the Design Representational Approach Learning to Write (DRALW) method. A textbook on the topic of "Kinetic Theory of Gases" was implemented with grade XI high school students. Effectiveness was determined from the effect size and the normalized percentage gain value, while the hypothesis was tested using an independent t-test. The results showed that textbooks developed with multimodal representation can improve students' science literacy skills. Based on the effect size, the developed textbooks were found to be effective in improving students' science literacy skills; the improvement occurred across all competences and knowledge domains of scientific literacy. Hypothesis testing showed a significant difference in science literacy between the class using the multimodal-representation textbook and the class using the regular textbook used in schools.
Leimu, Roosa; Koricheva, Julia
2004-01-01
Temporal changes in the magnitude of research findings have recently been recognized as a general phenomenon in ecology, and have been attributed to the delayed publication of non-significant results and disconfirming evidence. Here we introduce a method of cumulative meta-analysis which allows detection of both temporal trends and publication bias in the ecological literature. To illustrate the application of the method, we used two datasets from recently conducted meta-analyses of studies testing two plant defence theories. Our results revealed three phases in the evolution of the treatment effects. Early studies strongly supported the hypothesis tested, but the magnitude of the effect decreased considerably in later studies. In the latest studies, a trend towards an increase in effect size was observed. In one of the datasets, a cumulative meta-analysis revealed publication bias against studies reporting disconfirming evidence; such studies were published in journals with a lower impact factor compared to studies with results supporting the hypothesis tested. Correlation analysis revealed neither temporal trends nor evidence of publication bias in the datasets analysed. We thus suggest that cumulative meta-analysis should be used as a visual aid to detect temporal trends and publication bias in research findings in ecology in addition to the correlative approach. PMID:15347521
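The mechanics of a cumulative meta-analysis are straightforward to sketch: order studies chronologically and recompute the pooled effect as each study is added, so temporal drift in the estimate becomes visible. The snippet below uses a fixed-effect inverse-variance pool on invented effect sizes; it is illustrative only.

```python
# Minimal sketch of cumulative (fixed-effect) meta-analysis over time.
import numpy as np

years   = np.array([1985, 1988, 1992, 1996, 2000, 2003])
effects = np.array([0.9, 0.8, 0.5, 0.3, 0.2, 0.35])    # e.g., Hedges' g
ses     = np.array([0.30, 0.25, 0.20, 0.15, 0.15, 0.10])

order = np.argsort(years)
w = 1 / ses[order] ** 2                       # inverse-variance weights
cum_effect = np.cumsum(w * effects[order]) / np.cumsum(w)
for y, e in zip(years[order], cum_effect):
    print(f"up to {y}: pooled effect = {e:.2f}")
```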
Multiple-hypothesis multiple-model line tracking
NASA Astrophysics Data System (ADS)
Pace, Donald W.; Owen, Mark W.; Cox, Henry
2000-07-01
Passive sonar signal processing generally includes tracking of narrowband and/or broadband signature components observed on a Lofargram or on a Bearing-Time-Record (BTR) display. Fielded line tracking approaches to date have been recursive, single-hypothesis-oriented Kalman or alpha-beta filters, with no mechanism for considering tracking alternatives beyond the most recent scan of measurements. While adaptivity is often built into the filter to handle changing track dynamics, these approaches are still extensions of single-target tracking solutions to the multiple-target tracking environment. This paper describes an application of multiple-hypothesis, multiple target tracking technology to the sonar line tracking problem. A Multiple Hypothesis Line Tracker (MHLT) is developed which retains the recursive minimum-mean-square-error tracking behavior of a Kalman Filter in a maximum-a-posteriori delayed-decision multiple hypothesis context. Multiple line track filter states are developed and maintained using the interacting multiple model (IMM) state representation. Further, the data association and assignment problem is enhanced by considering line attribute information (line bandwidth and SNR) in addition to beam/bearing and frequency fit. MHLT results on real sonar data are presented to demonstrate the benefits of the multiple hypothesis approach. The utility of the system in cluttered environments and particularly in crossing line situations is shown.
The Hypothesis-Driven Physical Examination.
Garibaldi, Brian T; Olson, Andrew P J
2018-05-01
The physical examination remains a vital part of the clinical encounter. However, physical examination skills have declined in recent years, in part because of decreased time at the bedside. Many clinicians question the relevance of physical examinations in the age of technology. A hypothesis-driven approach to teaching and practicing the physical examination emphasizes the performance of maneuvers that can alter the likelihood of disease. Likelihood ratios are diagnostic weights that allow clinicians to estimate the post-test probability of disease. This hypothesis-driven approach to the physical examination increases its value and efficiency, while preserving its cultural role in the patient-physician relationship. Copyright © 2017 Elsevier Inc. All rights reserved.
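The underlying arithmetic is Bayes' rule on the odds scale. The sketch below works through a hypothetical example; the pre-test probability and likelihood ratio values are invented for illustration.

```python
# Minimal worked example of likelihood-ratio updating.
def post_test_probability(pretest_p, likelihood_ratio):
    """Bayes' rule on the odds scale: posterior odds = prior odds * LR."""
    prior_odds = pretest_p / (1 - pretest_p)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Suppose a disease is suspected with pre-test probability 0.20, and a
# positive examination finding carries a (hypothetical) LR+ of 4.
print(post_test_probability(0.20, 4.0))   # 0.50: the finding raises it to 50%

# A negative finding with LR- of 0.5 lowers the probability instead.
print(post_test_probability(0.20, 0.5))   # ~0.11
```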
Debates—Hypothesis testing in hydrology: Introduction
NASA Astrophysics Data System (ADS)
Blöschl, Günter
2017-03-01
This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.
ERIC Educational Resources Information Center
White, Brian
2004-01-01
This paper presents a generally applicable method for characterizing subjects' hypothesis-testing behaviour based on a synthesis that extends on previous work. Beginning with a transcript of subjects' speech and videotape of their actions, a Reasoning Map is created that depicts the flow of their hypotheses, tests, predictions, results, and…
Why Is Test-Restudy Practice Beneficial for Memory? An Evaluation of the Mediator Shift Hypothesis
ERIC Educational Resources Information Center
Pyc, Mary A.; Rawson, Katherine A.
2012-01-01
Although the memorial benefits of testing are well established empirically, the mechanisms underlying this benefit are not well understood. The authors evaluated the mediator shift hypothesis, which states that test-restudy practice is beneficial for memory because retrieval failures during practice allow individuals to evaluate the effectiveness…
Growth factor transgenes interactively regulate articular chondrocytes.
Shi, Shuiliang; Mercer, Scott; Eckert, George J; Trippel, Stephen B
2013-04-01
Adult articular chondrocytes lack an effective repair response to correct damage from injury or osteoarthritis. Polypeptide growth factors that stimulate articular chondrocyte proliferation and cartilage matrix synthesis may augment this response. Gene transfer is a promising approach to delivering such factors. Multiple growth factor genes regulate these cell functions, but multiple growth factor gene transfer remains unexplored. We tested the hypothesis that multiple growth factor gene transfer selectively modulates articular chondrocyte proliferation and matrix synthesis. We tested the hypothesis by delivering combinations of the transgenes encoding insulin-like growth factor I (IGF-I), fibroblast growth factor-2 (FGF-2), transforming growth factor beta1 (TGF-β1), bone morphogenetic protein-2 (BMP-2), and bone morphogenetic protein-7 (BMP-7) to articular chondrocytes and measured changes in the production of DNA, glycosaminoglycan, and collagen. The transgenes differentially regulated all these chondrocyte activities. In concert, the transgenes interacted to generate widely divergent responses from the cells. These interactions ranged from inhibitory to synergistic. The transgene pair encoding IGF-I and FGF-2 maximized cell proliferation. The three-transgene group encoding IGF-I, BMP-2, and BMP-7 maximized matrix production and also optimized the balance between cell proliferation and matrix production. These data demonstrate an approach to articular chondrocyte regulation that may be tailored to stimulate specific cell functions, and suggest that certain growth factor gene combinations have potential value for cell-based articular cartilage repair. Copyright © 2012 Wiley Periodicals, Inc.
Map LineUps: Effects of spatial structure on graphical inference.
Beecham, Roger; Dykes, Jason; Meulemans, Wouter; Slingsby, Aidan; Turkay, Cagatay; Wood, Jo
2017-01-01
Fundamental to the effective use of visualization as an analytic and descriptive tool is the assurance that presenting data visually provides the capability of making inferences from what we see. This paper explores two related approaches to quantifying the confidence we may have in making visual inferences from mapped geospatial data. We adapt Wickham et al.'s 'Visual Line-up' method as a direct analogy with Null Hypothesis Significance Testing (NHST) and propose a new approach for generating more credible spatial null hypotheses. Rather than using as a spatial null hypothesis the unrealistic assumption of complete spatial randomness, we propose spatially autocorrelated simulations as alternative nulls. We conduct a set of crowdsourced experiments (n=361) to determine the just noticeable difference (JND) between pairs of choropleth maps of geographic units controlling for spatial autocorrelation (Moran's I statistic) and geometric configuration (variance in spatial unit area). Results indicate that people's abilities to perceive differences in spatial autocorrelation vary with baseline autocorrelation structure and the geometric configuration of geographic units. These results allow us, for the first time, to construct a visual equivalent of statistical power for geospatial data. Our JND results add to those provided in recent years by Klippel et al. (2011), Harrison et al. (2014) and Kay & Heer (2015) for correlation visualization. Importantly, they provide an empirical basis for an improved construction of visual line-ups for maps and the development of theory to inform geospatial tests of graphical inference.
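Moran's I, the statistic used here to control autocorrelation, is easy to compute directly. The sketch below evaluates it on a toy lattice with rook (edge-sharing) adjacency; the grid and weighting scheme are our assumptions, not the paper's geographies.

```python
# Minimal sketch of Moran's I on a small lattice.
import numpy as np

def morans_i(values, weights):
    """Moran's I for an array of unit values and an n x n weight matrix."""
    z = values - values.mean()
    n, s0 = len(values), weights.sum()
    return (n / s0) * (z @ weights @ z) / (z @ z)

side = 5
n = side * side
W = np.zeros((n, n))
for i in range(n):                      # rook adjacency on a 5x5 grid
    r, c = divmod(i, side)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= r + dr < side and 0 <= c + dc < side:
            W[i, (r + dr) * side + (c + dc)] = 1

smooth = np.repeat(np.linspace(0, 1, side), side)       # autocorrelated map
rng = np.random.default_rng(2)
print(morans_i(smooth, W), morans_i(rng.random(n), W))  # high I vs. near zero
```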
Evolution of limited seed dispersal ability on gypsum islands.
Schenk, John J
2013-09-01
Dispersal is a major feature of plant evolution that has many advantages but is not always favored. Wide dispersal, for example, leads to greater seed loss in oceanic-island endemics, and evolution has favored morphologies that limit dispersal. I tested the hypothesis that selection favored limited dispersal on gypsum islands in western North America, where edaphic communities are sparsely vegetated except for a specialized flora that competes poorly with the surrounding flora. • I applied a series of comparative phylogenetic approaches to gypsophilic species of Mentzelia section Bartonia (Loasaceae) to investigate the evolution of limited dispersal function in seed wings, which increase primary dispersal by wind. Through these tests, I determined whether narrowed wings were selected for in gypsophilic species. • Gypsophily was derived four to seven times. Seed area was not significantly correlated with gypsophily or wing area. Wing area was significantly smaller in the derived gypsum endemics, supporting the hypothesis in favor of limited dispersal function. A model-fitting approach identified two trait optima in wing area, with gypsum endemics having a lower optimum. • Evolution into novel ecologies influences morphological evolution. Morphological characters have been selected for limited dispersal following evolution onto gypsum islands. Selection for limited dispersal ability has occurred across animals and plants, both in oceanic and terrestrial systems, which suggests that reduced dispersal ability may be a general process: selection favors limited dispersal if the difference in survival between the habitat of the parent and the surrounding area is great enough.
Mayo, Ruth; Alfasi, Dana; Schwarz, Norbert
2014-06-01
Feelings of distrust alert people not to take information at face value, which may influence their reasoning strategy. Using the Wason (1960) rule identification task, we tested whether chronic and temporary distrust increase the use of negative hypothesis testing strategies suited to falsify one's own initial hunch. In Study 1, participants who were low in dispositional trust were more likely to engage in negative hypothesis testing than participants high in dispositional trust. In Study 2, trust and distrust were induced through an alleged person-memory task. Paralleling the effects of chronic distrust, participants exposed to a single distrust-eliciting face were 3 times as likely to engage in negative hypothesis testing as participants exposed to a trust-eliciting face. In both studies, distrust increased negative hypothesis testing, which was associated with better performance on the Wason task. In contrast, participants' initial rule generation was not consistently affected by distrust. These findings provide first evidence that distrust can influence which reasoning strategy people adopt. PsycINFO Database Record (c) 2014 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Fovet, O.; Hrachowitz, M.; RUIZ, L.; Gascuel-odoux, C.; Savenije, H.
2013-12-01
While most hydrological models reproduce the general flow dynamics of a system, they frequently fail to adequately mimic system-internal processes. This is likely to make them inadequate for simulating solute transport. For example, the hysteresis between storage and discharge, which is often observed in shallow hard-rock aquifers, is rarely well reproduced by models. One main reason is that this hysteresis has little weight in the calibration, because objective functions are based on time series of individual variables. This reduces the ability of classical calibration/validation procedures to assess the relevance of the conceptual hypotheses associated with hydrological models. Calibrating models on variables derived from the combination of different individual variables (like stream discharge and groundwater levels) is a way to ensure that models are accepted based on their consistency. Here we therefore test the value of this more systems-like approach by using it to evaluate different hypotheses about the behaviour of a small experimental lowland catchment in French Brittany (ORE AgrHys), where strong hysteresis is observed in the stream flow vs. shallow groundwater level relationship. Several conceptual models were applied to this site and calibrated using objective functions based on metrics of this hysteresis. The tested model structures differed with respect to the storage function in each reservoir, the storage-discharge function in each reservoir, the deep-loss expressions (as constant or variable fraction), the number of reservoirs (from 1 to 4), and their organization (parallel, series). The observed hysteretic groundwater level-discharge relationship was not satisfactorily reproduced by most of the tested models, except for the most complex ones. These were thus more consistent, and their underlying hypotheses are probably more realistic, even though their performance in simulating observed stream flow decreased. Selecting models based on such a systems-like approach is likely to improve their efficiency for environmental applications, e.g. on solute transport issues. The next step would be to apply the same approach with variables combining hydrological and biogeochemical variables.
Dediu, Dan
2011-02-07
Language is a hallmark of our species and understanding linguistic diversity is an area of major interest. Genetic factors influencing the cultural transmission of language provide a powerful and elegant explanation for aspects of the present day linguistic diversity and a window into the emergence and evolution of language. In particular, it has recently been proposed that linguistic tone-the usage of voice pitch to convey lexical and grammatical meaning-is biased by two genes involved in brain growth and development, ASPM and Microcephalin. This hypothesis predicts that tone is a stable characteristic of language because of its 'genetic anchoring'. The present paper tests this prediction using a Bayesian phylogenetic framework applied to a large set of linguistic features and language families, using multiple software implementations, data codings, stability estimations, linguistic classifications and outgroup choices. The results of these different methods and datasets show a large agreement, suggesting that this approach produces reliable estimates of the stability of linguistic data. Moreover, linguistic tone is found to be stable across methods and datasets, providing suggestive support for the hypothesis of genetic influences on its distribution.
Oxytocin tempers calculated greed but not impulsive defense in predator–prey contests
Scholte, H. Steven; van Winden, Frans A. A. M.; Ridderinkhof, K. Richard
2015-01-01
Human cooperation and competition is modulated by oxytocin, a hypothalamic neuropeptide that functions as both hormone and neurotransmitter. Oxytocin’s functions can be captured in two explanatory yet largely contradictory frameworks: the fear-dampening (FD) hypothesis that oxytocin has anxiolytic effects and reduces fear-motivated action; and the social approach/avoidance (SAA) hypothesis that oxytocin increases cooperative approach and facilitates protection against aversive stimuli and threat. We tested derivations from both frameworks in a novel predator–prey contest game. Healthy males given oxytocin or placebo invested as predator to win their prey’s endowment, or as prey to protect their endowment against predation. Neural activity was registered using 3T-MRI. In prey, (fear-motivated) investments were fast and conditioned on the amygdala. Inconsistent with FD, oxytocin did not modulate neural and behavioral responding in prey. In predators, (greed-motivated) investments were slower, and conditioned on the superior frontal gyrus (SFG). Consistent with SAA, oxytocin reduced predator investment, time to decide and activation in SFG. Thus, whereas oxytocin does not incapacitate the impulsive ability to protect and defend oneself, it lowers the greedy and more calculated appetite for coming out ahead. PMID:25140047
Area, length and thickness conservation: Dogma or reality?
NASA Astrophysics Data System (ADS)
Moretti, Isabelle; Callot, Jean Paul
2012-08-01
The basic assumption of quantitative structural geology is the preservation of material during deformation. However the hypothesis of volume conservation alone does not help to predict past or future geometries and so this assumption is usually translated into bed length in 2D (or area in 3D) and thickness conservation. When subsurface data are missing, geologists may extrapolate surface data to depth using the kink-band approach. These extrapolations, preserving both thicknesses and dips, lead to geometries which are restorable but often erroneous, due to both disharmonic deformation and internal deformation of layers. First, the Bolivian Sub-Andean Zone case is presented to highlight the evolution of the concepts on which balancing is based, and the important role played by a decoupling level in enhancing disharmony. Second, analogue models are analyzed to test the validity of the balancing techniques. Chamberlin's excess area approach is shown to be on average valid. However, neither the length nor the thicknesses are preserved. We propose that in real cases, the length preservation hypothesis during shortening could also be a wrong assumption. If the data are good enough to image the decollement level, the Chamberlin excess area method could be used to compute the bed length changes.
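For reference, Chamberlin's excess-area construction mentioned above is usually written as follows (notation ours): the cross-sectional area uplifted above the regional datum equals the bed shortening times the depth to the detachment.

```latex
% A_excess : cross-sectional area uplifted above the regional datum
% L_0 - L  : bed shortening, h : depth to the detachment
\[
  A_{\mathrm{excess}} = h\,(L_0 - L)
  \qquad\Longrightarrow\qquad
  h = \frac{A_{\mathrm{excess}}}{L_0 - L}
\]
```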
Biased Competition in Visual Processing Hierarchies: A Learning Approach Using Multiple Cues.
Gepperth, Alexander R T; Rebhan, Sven; Hasler, Stephan; Fritsch, Jannik
2011-03-01
In this contribution, we present a large-scale hierarchical system for object detection fusing bottom-up (signal-driven) processing results with top-down (model or task-driven) attentional modulation. Specifically, we focus on the question of how the autonomous learning of invariant models can be embedded into a performing system and how such models can be used to define object-specific attentional modulation signals. Our system implements bi-directional data flow in a processing hierarchy. The bottom-up data flow proceeds from a preprocessing level to the hypothesis level where object hypotheses created by exhaustive object detection algorithms are represented in a roughly retinotopic way. A competitive selection mechanism is used to determine the most confident hypotheses, which are used on the system level to train multimodal models that link object identity to invariant hypothesis properties. The top-down data flow originates at the system level, where the trained multimodal models are used to obtain space- and feature-based attentional modulation signals, providing biases for the competitive selection process at the hypothesis level. This results in object-specific hypothesis facilitation/suppression in certain image regions which we show to be applicable to different object detection mechanisms. In order to demonstrate the benefits of this approach, we apply the system to the detection of cars in a variety of challenging traffic videos. Evaluating our approach on a publicly available dataset containing approximately 3,500 annotated video images from more than 1 h of driving, we can show strong increases in performance and generalization when compared to object detection in isolation. Furthermore, we compare our results to a late hypothesis rejection approach, showing that early coupling of top-down and bottom-up information is a favorable approach especially when processing resources are constrained.
Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge
2014-01-01
Background Combining different sources of knowledge to build improved structure activity relationship models is not easy owing to the variety of knowledge formats and the absence of a common framework to interoperate between learning techniques. Most of the current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility to directly combine these sources at the knowledge level, with the aim to harvest potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into a hierarchical network. This unification permits combining different sources of knowledge into a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems along with an illustrative application to the prediction of mutagenicity. Conclusion It is possible to represent knowledge in the unified form of a hypothesis network allowing interpretable predictions with performance comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high level reasoning and meta-learning can be applied; these latter perspectives will be explored in future work. PMID:24959206
Synergies in the space of control variables within the equilibrium-point hypothesis.
Ambike, S; Mattos, D; Zatsiorsky, V M; Latash, M L
2016-02-19
We use an approach rooted in the recent theory of synergies to analyze possible co-variation between two hypothetical control variables involved in finger force production based on the equilibrium-point (EP) hypothesis. These control variables are the referent coordinate (R) and apparent stiffness (C) of the finger. We tested a hypothesis that inter-trial co-variation in the {R; C} space during repeated, accurate force production trials stabilizes the fingertip force. This was expected to correspond to a relatively low amount of inter-trial variability affecting force and a high amount of variability keeping the force unchanged. We used the "inverse piano" apparatus to apply small and smooth positional perturbations to fingers during force production tasks. Across trials, R and C showed strong co-variation with the data points lying close to a hyperbolic curve. Hyperbolic regressions accounted for over 99% of the variance in the {R; C} space. Another analysis was conducted by randomizing the original {R; C} data sets and creating surrogate data sets that were then used to compute predicted force values. The surrogate sets always showed much higher force variance compared to the actual data, thus reinforcing the conclusion that finger force control was organized in the {R; C} space, as predicted by the EP hypothesis, and involved co-variation in that space stabilizing total force. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.
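The surrogate-data argument can be sketched compactly. Below, {R; C} pairs are generated to lie near the hyperbola C(X - R) = F, with the actual fingertip coordinate X placed at the origin for simplicity; all values are invented, and the point is only that shuffling the pairing destroys the force-stabilizing co-variation.

```python
# Minimal sketch of the {R; C} surrogate analysis under the EP hypothesis.
import numpy as np

rng = np.random.default_rng(3)
target_force = 10.0                               # N
C = rng.uniform(2.0, 8.0, 40)                     # apparent stiffness, N/cm
R = -target_force / C + rng.normal(0, 0.05, 40)   # referent coordinate, cm

force_actual = C * (0.0 - R)                      # F = C (X - R), with X = 0
force_surrogate = rng.permutation(C) * (0.0 - R)  # break the co-variation

print("actual    var:", force_actual.var())      # small: co-variation stabilizes F
print("surrogate var:", force_surrogate.var())   # much larger once pairing is lost
```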
In Defense of the Play-Creativity Hypothesis
ERIC Educational Resources Information Center
Silverman, Irwin W.
2016-01-01
The hypothesis that pretend play facilitates the creative thought process in children has received a great deal of attention. In a literature review, Lillard et al. (2013, p. 8) concluded that the evidence for this hypothesis was "not convincing." This article focuses on experimental and training studies that have tested this hypothesis.…
A two-hypothesis approach to establishing a life detection/biohazard protocol for planetary samples
NASA Astrophysics Data System (ADS)
Conley, Catharine; Steele, Andrew
2016-07-01
The COSPAR policy on performing a biohazard assessment on samples brought from Mars to Earth is framed in the context of a concern for false-positive results. However, as noted during the 2012 Workshop for Life Detection in Samples from Mars (ref. Kminek et al., 2014), a more significant concern for planetary samples brought to Earth is false-negative results, because an undetected biohazard could increase risk to the Earth. This is the reason that stringent contamination control must be a high priority for all Category V Restricted Earth Return missions. A useful conceptual framework for addressing these concerns involves two complementary 'null' hypotheses: testing both of them, together, would allow statistical and community confidence to be developed regarding one or the other conclusion. As noted above, false negatives are of primary concern for safety of the Earth, so the 'Earth Safety null hypothesis' -- that must be disproved to assure low risk to the Earth from samples introduced by Category V Restricted Earth Return missions -- is 'There is native life in these samples.' False positives are of primary concern for Astrobiology, so the 'Astrobiology null hypothesis' -- that must be disproved in order to demonstrate the existence of extraterrestrial life -- is 'There is no life in these samples.' The presence of Earth contamination would render both of these hypotheses more difficult to disprove. Both of these hypotheses can be tested following a strict science protocol: analyse, interpret, test the hypotheses, and repeat. The science measurements are then undertaken in an iterative fashion that responds to discovery, with both hypotheses testable from interpretation of the scientific data. This is a robust, community-involved activity that ensures maximum science return with minimal sample use.
Keers, Robert; Coleman, Jonathan R.I.; Lester, Kathryn J.; Roberts, Susanna; Breen, Gerome; Thastum, Mikael; Bögels, Susan; Schneider, Silvia; Heiervang, Einar; Meiser-Stedman, Richard; Nauta, Maaike; Creswell, Cathy; Thirlwall, Kerstin; Rapee, Ronald M.; Hudson, Jennifer L.; Lewis, Cathryn; Plomin, Robert; Eley, Thalia C.
2016-01-01
Background The differential susceptibility hypothesis suggests that certain genetic variants moderate the effects of both negative and positive environments on mental health and may therefore be important predictors of response to psychological treatments. Nevertheless, the identification of such variants has so far been limited to preselected candidate genes. In this study we extended the differential susceptibility hypothesis from a candidate gene to a genome-wide approach to test whether a polygenic score of environmental sensitivity predicted response to cognitive behavioural therapy (CBT) in children with anxiety disorders. Methods We identified variants associated with environmental sensitivity using a novel method in which within-pair variability in emotional problems in 1,026 monozygotic twin pairs was examined as a function of the pairs' genotype. We created a polygenic score of environmental sensitivity based on the whole-genome findings and tested the score as a moderator of parenting on emotional problems in 1,406 children and response to individual, group and brief parent-led CBT in 973 children with anxiety disorders. Results The polygenic score significantly moderated the effects of parenting on emotional problems and the effects of treatment. Individuals with a high score responded significantly better to individual CBT than group CBT or brief parent-led CBT (remission rates: 70.9, 55.5 and 41.6%, respectively). Conclusions Pending successful replication, our results should be considered exploratory. Nevertheless, if replicated, they suggest that individuals with the greatest environmental sensitivity may be more likely to develop emotional problems in adverse environments but also benefit more from the most intensive types of treatment. PMID:27043157
Hanrahan, Lawrence P.; Anderson, Henry A.; Busby, Brian; Bekkedal, Marni; Sieger, Thomas; Stephenson, Laura; Knobeloch, Lynda; Werner, Mark; Imm, Pamela; Olson, Joseph
2004-01-01
In this article we describe the development of an information system for environmental childhood cancer surveillance. The Wisconsin Cancer Registry annually receives more than 25,000 incident case reports. Approximately 269 cases per year involve children. Over time, there has been considerable community interest in understanding the role the environment plays as a cause of these cancer cases. Wisconsin’s Public Health Information Network (WI-PHIN) is a robust web portal integrating both Health Alert Network and National Electronic Disease Surveillance System components. WI-PHIN is the information technology platform for all public health surveillance programs. Functions include the secure, automated exchange of cancer case data between public health–based and hospital-based cancer registrars; web-based supplemental data entry for environmental exposure confirmation and hypothesis testing; automated data analysis, visualization, and exposure–outcome record linkage; directories of public health and clinical personnel for role-based access control of sensitive surveillance information; public health information dissemination and alerting; and information technology security and critical infrastructure protection. For hypothesis generation, cancer case data are sent electronically to WI-PHIN and populate the integrated data repository. Environmental data are linked and the exposure–disease relationships are explored using statistical tools for ecologic exposure risk assessment. For hypothesis testing, case–control interviews collect exposure histories, including parental employment and residential histories. This information technology approach can thus serve as the basis for building a comprehensive system to assess environmental cancer etiology. PMID:15471739
The frequentist implications of optional stopping on Bayesian hypothesis tests.
Sanborn, Adam N; Hills, Thomas T
2014-04-01
Null hypothesis significance testing (NHST) is the most commonly used statistical methodology in psychology. The probability of achieving a value as extreme or more extreme than the statistic obtained from the data is evaluated, and if it is low enough, the null hypothesis is rejected. However, because common experimental practice often clashes with the assumptions underlying NHST, these calculated probabilities are often incorrect. Most commonly, experimenters use tests that assume that sample sizes are fixed in advance of data collection but then use the data to determine when to stop; in the limit, experimenters can use data monitoring to guarantee that the null hypothesis will be rejected. Bayesian hypothesis testing (BHT) provides a solution to these ills because the stopping rule used is irrelevant to the calculation of a Bayes factor. In addition, there are strong mathematical guarantees on the frequentist properties of BHT that are comforting for researchers concerned that stopping rules could influence the Bayes factors produced. Here, we show that these guaranteed bounds have limited scope and often do not apply in psychological research. Specifically, we quantitatively demonstrate the impact of optional stopping on the resulting Bayes factors in two common situations: (1) when the truth is a combination of the hypotheses, such as in a heterogeneous population, and (2) when a hypothesis is composite-taking multiple parameter values-such as the alternative hypothesis in a t-test. We found that, for these situations, while the Bayesian interpretation remains correct regardless of the stopping rule used, the choice of stopping rule can, in some situations, greatly increase the chance of experimenters finding evidence in the direction they desire. We suggest ways to control these frequentist implications of stopping rules on BHT.
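A small simulation makes the effect concrete. The sketch below substitutes a simple normal-prior Bayes factor (known unit variance) for the t-test Bayes factors discussed in the paper; the peeking interval, stopping threshold of 3, and sample cap are illustrative choices.

```python
# Minimal sketch of optional stopping with a Bayes factor under H0.
# Model: x_i ~ N(mu, 1); H0: mu = 0 vs. H1: mu ~ N(0, tau2).
import numpy as np
from scipy import stats

def bf10(xbar, n, tau2=1.0):
    """BF10 from the sample mean of n observations with known unit variance."""
    m1 = stats.norm.pdf(xbar, 0, np.sqrt(tau2 + 1 / n))  # marginal under H1
    m0 = stats.norm.pdf(xbar, 0, np.sqrt(1 / n))         # sampling dist under H0
    return m1 / m0

rng = np.random.default_rng(4)
hits = 0
for _ in range(2_000):                      # many experimenters, H0 true
    x = []
    while len(x) < 200:
        x.extend(rng.normal(0.0, 1.0, 10))  # peek after every 10 observations
        if bf10(np.mean(x), len(x)) > 3:    # stop at "moderate evidence" for H1
            hits += 1
            break
print("proportion reaching BF10 > 3 under H0:", hits / 2_000)
```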
Comparison of futility monitoring guidelines using completed phase III oncology trials.
Zhang, Qiang; Freidlin, Boris; Korn, Edward L; Halabi, Susan; Mandrekar, Sumithra; Dignam, James J
2017-02-01
Futility (inefficacy) interim monitoring is an important component in the conduct of phase III clinical trials, especially in life-threatening diseases. Desirable futility monitoring guidelines allow timely stopping if the new therapy is harmful or if it is unlikely to demonstrate to be sufficiently effective if the trial were to continue to its final analysis. There are a number of analytical approaches that are used to construct futility monitoring boundaries. The most common approaches are based on conditional power, sequential testing of the alternative hypothesis, or sequential confidence intervals. The resulting futility boundaries vary considerably with respect to the level of evidence required for recommending stopping the study. We evaluate the performance of commonly used methods using event histories from completed phase III clinical trials of the Radiation Therapy Oncology Group, Cancer and Leukemia Group B, and North Central Cancer Treatment Group. We considered published superiority phase III trials with survival endpoints initiated after 1990. There are 52 studies available for this analysis from different disease sites. Total sample size and maximum number of events (statistical information) for each study were calculated using protocol-specified effect size, type I and type II error rates. In addition to the common futility approaches, we considered a recently proposed linear inefficacy boundary approach with an early harm look followed by several lack-of-efficacy analyses. For each futility approach, interim test statistics were generated for three schedules with different analysis frequency, and early stopping was recommended if the interim result crossed a futility stopping boundary. For trials not demonstrating superiority, the impact of each rule is summarized as savings on sample size, study duration, and information time scales. For negative studies, our results show that the futility approaches based on testing the alternative hypothesis and repeated confidence interval rules yielded less savings (compared to the other two rules). These boundaries are too conservative, especially during the first half of the study (<50% of information). The conditional power rules are too aggressive during the second half of the study (>50% of information) and may stop a trial even when there is a clinically meaningful treatment effect. The linear inefficacy boundary with three or more interim analyses provided the best results. For positive studies, we demonstrated that none of the futility rules would have stopped the trials. The linear inefficacy boundary futility approach is attractive from statistical, clinical, and logistical standpoints in clinical trials evaluating new anti-cancer agents.
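The conditional-power calculation behind several of these rules can be sketched under the standard Brownian-motion approximation, in which the interim z-statistic drifts linearly in information time. The thresholds and interim values below are illustrative, not taken from the trials analysed.

```python
# Minimal sketch of conditional power at an interim analysis.
# Z(1) | Z(t) = z  ~  N(z*sqrt(t) + theta*(1 - t), 1 - t), where theta is the
# drift (expected final z) and t is the information fraction.
from math import sqrt
from scipy.stats import norm

def conditional_power(z_interim, t, theta, alpha=0.025):
    """P(Z(1) > z_{1-alpha} | Z(t) = z_interim) for final-analysis drift theta."""
    z_final = norm.ppf(1 - alpha)
    mean = z_interim * sqrt(t) + theta * (1 - t)
    return 1 - norm.cdf((z_final - mean) / sqrt(1 - t))

z, t = 0.4, 0.5                   # weak interim signal at 50% information
theta_trend = z / sqrt(t)         # drift if the current trend continues
theta_design = norm.ppf(0.975) + norm.ppf(0.9)  # drift under the design effect
print(conditional_power(z, t, theta_trend))     # ~0.02 under current trend
print(conditional_power(z, t, theta_design))    # ~0.47 under the design effect
# A common futility rule recommends stopping if CP under the design effect
# falls below some threshold, e.g. 10-20%.
```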
Sex and Adolescent Ethanol Exposure Influence Pavlovian Conditioned Approach
Madayag, Aric C.; Stringfield, Sierra J.; Reissner, Kathryn J.; Boettiger, Charlotte A.; Robinson, Donita L.
2017-01-01
BACKGROUND Alcohol use among adolescents is widespread and a growing concern due to long-term behavioral deficits, including altered Pavlovian behavior, that potentially contribute to addiction vulnerability. We tested the hypothesis that adolescent intermittent ethanol (AIE) exposure alters Pavlovian behavior in males and females as measured by a shift from goal-tracking to sign-tracking. Additionally, we investigated GLT-1, an astrocytic glutamate transporter, as a potential contributor to a sign-tracking phenotype. METHODS Male and female Sprague-Dawley rats were exposed to AIE (5 g/kg, intragastric) or water intermittently 2 days on, 2 days off from postnatal day (P) 25 to 54. Around P70, animals began 20 daily sessions of Pavlovian conditioned approach, where they learned that a cue predicted non-contingent reward delivery. Lever pressing indicated interaction with the cue, or sign-tracking, and receptacle entries indicated approach to the reward delivery location, or goal-tracking. To test for effects of AIE on nucleus accumbens excitatory signaling, we isolated membrane subfractions and measured protein levels of the glutamate transporter GLT-1 after animals completed behavior as a measure of glutamate homeostasis. RESULTS Females exhibited elevated sign-tracking compared to males with significantly more lever presses, faster latency to first lever press, and greater probability to lever press in a trial. AIE significantly increased lever pressing while blunting goal tracking, as indicated by fewer cue-evoked receptacle entries, slower latency to receptacle entry, and lower probability to enter the receptacle in a trial. No significant Sex-by-Exposure interactions were observed in sign- or goal-tracking metrics. Moreover, we found no significant effects of Sex or Exposure on membrane GLT-1 expression in the nucleus accumbens. CONCLUSIONS Females exhibited enhanced sign-tracking compared to males, while AIE decreased goal-tracking compared to control exposure. Our findings support the hypothesis that adolescent binge ethanol can shift conditioned behavior from goal- to cue-directed in Pavlovian conditioned approach, especially in females. PMID:28196273
NASA Astrophysics Data System (ADS)
Soulis, K. X.; Valiantzas, J. D.
2012-03-01
The Soil Conservation Service Curve Number (SCS-CN) approach is widely used as a simple method for predicting direct runoff volume for a given rainfall event. CN parameter values corresponding to various soil, land cover, and land management conditions can be selected from tables, but it is preferable to estimate the CN value from measured rainfall-runoff data where available. However, previous researchers have shown that CN values calculated from measured rainfall-runoff data vary systematically with rainfall depth, and have therefore suggested determining a single asymptotic CN value, observed at very high rainfall depths, to characterize a watershed's runoff response. In this paper, we test the hypothesis that the observed correlation between the calculated CN value and the rainfall depth in a watershed reflects the effect of spatial variability in soils and land cover on its hydrologic response. Based on this hypothesis, the simplified concept of a two-CN heterogeneous system is introduced to model the observed CN-rainfall variation by reducing the CN spatial variability to two classes. The behaviour of the CN-rainfall function produced by the simplified two-CN system is derived theoretically and analysed systematically, and it is found to be similar to the variation observed in natural watersheds. Synthetic data tests, natural watershed examples, and a detailed study of two natural experimental watersheds with known spatial heterogeneity characteristics were used to evaluate the method. The results indicate that determining CN values from rainfall-runoff data using the proposed two-CN system approach provides reasonable accuracy and outperforms previous methods based on a single asymptotic CN value. Although the suggested method increases the number of unknown parameters to three (instead of one), a clear physical reasoning for them is presented.
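The mechanism is easy to reproduce numerically: mixing two CN classes makes the apparent (back-calculated) CN drift with rainfall depth even though both underlying CNs are fixed. The sketch below uses the standard SCS-CN equations with Ia = 0.2S; the CN values and area fraction are illustrative, not taken from the paper.

```python
import math

def scs_runoff(p_mm, cn):
    """Direct runoff depth (mm) for rainfall p_mm via the SCS-CN method."""
    s = 25400.0 / cn - 254.0            # potential retention S (mm)
    ia = 0.2 * s                        # initial abstraction
    return (p_mm - ia) ** 2 / (p_mm - ia + s) if p_mm > ia else 0.0

def two_cn_runoff(p_mm, cn1, cn2, frac1):
    """Composite runoff for a watershed split into two CN classes."""
    return frac1 * scs_runoff(p_mm, cn1) + (1 - frac1) * scs_runoff(p_mm, cn2)

def apparent_cn(p_mm, q_mm):
    """CN back-calculated from an observed rainfall-runoff pair (Ia = 0.2S)."""
    s = 5.0 * (p_mm + 2.0 * q_mm - math.sqrt(4.0 * q_mm ** 2 + 5.0 * p_mm * q_mm))
    return 25400.0 / (s + 254.0)

# The apparent CN declines with rainfall depth for the heterogeneous watershed:
for p in (20.0, 50.0, 100.0, 200.0):
    q = two_cn_runoff(p, cn1=85.0, cn2=55.0, frac1=0.3)
    print(f"P = {p:5.0f} mm  Q = {q:6.1f} mm  apparent CN = {apparent_cn(p, q):.1f}")
```

Fitting the three parameters (two CNs and an area fraction) to observed rainfall-runoff pairs is then an ordinary least-squares problem over this composite function.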
Gana, Kamel; Saada, Yaël; Broc, Guillaume; Quintard, Bruno; Amieva, Hélène; Dartigues, Jean-François
2016-02-01
Reciprocal relationships between positive affect (PA) and health are now the subject of an active debate in psychology and behavioral medicine. Two radically opposed approaches address the link between subjective well-being (SWB) and physical health: the top-down (i.e., psychosomatic hypothesis) and bottom-up (i.e., disability/ability hypothesis) approaches. The aim of the present study was to test these two approaches by investigating thirteen-year longitudinal relationships between PA, as an affective dimension of SWB, and functional health in older people. The study included 3754 participants aged 62-101 years, assessed 6 times over a thirteen-year period. PA was measured with the positive affect subscale of the CES-D scale. Functional health was assessed by four composite items: a single-item self-rating of hearing impairment, a single-item self-rating of vision impairment, the number of medically prescribed drugs, and a single-item self-rating of dyspnoea. We used cross-lagged modeling with latent variables, which is appropriate for testing specific theories. Mean arterial pressure, diabetes mellitus and hypercholesterolemia status, sequelae of stroke, gender, level of education, and age at baseline were used as control variables in the models. Results indicated that good health significantly predicted subsequent levels of PA (average β = -0.58, p < 0.001), but PA did not predict subsequent levels of good health (β = 0.01, ns). This finding, obtained from a sample of older people, is in keeping with the bottom-up approach and supports the popular adage "As long as you've got your health". Limitations of this finding are reviewed and discussed. Models including longitudinal mediators, such as biomarkers and lifestyle patterns, are needed to clarify the nature of the link between these constructs. Copyright © 2015 Elsevier Ltd. All rights reserved.
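The logic of a cross-lagged comparison can be illustrated with two observed waves and plain regressions (the study itself used latent variables across six waves). In this toy simulation the data are generated under a bottom-up world, so the health-to-PA cross path should dominate; all coefficients and variable names are assumptions of the sketch.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 1000
health_t1 = rng.normal(size=n)
pa_t1 = 0.3 * health_t1 + rng.normal(size=n)            # baseline association
# Bottom-up data-generating process: health drives later PA, not vice versa.
health_t2 = 0.6 * health_t1 + 0.0 * pa_t1 + rng.normal(size=n)
pa_t2 = 0.4 * pa_t1 + 0.5 * health_t1 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([pa_t1, health_t1]))
print("PA_t2     ~ const, PA_t1, health_t1:", sm.OLS(pa_t2, X).fit().params.round(2))
print("health_t2 ~ const, PA_t1, health_t1:", sm.OLS(health_t2, X).fit().params.round(2))
# Comparing the two cross-lagged coefficients (health_t1 -> PA_t2 vs.
# PA_t1 -> health_t2) is what separates bottom-up from top-down accounts.
```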
Robustly Aligning a Shape Model and Its Application to Car Alignment of Unknown Pose.
Li, Yan; Gu, Leon; Kanade, Takeo
2011-09-01
Precisely localizing in an image a set of feature points that form the shape of an object, such as a car or a face, is called alignment. Previous shape alignment methods attempted to fit a whole shape model to the observed data, based on the assumption of Gaussian observation noise and the associated regularization process. However, such an approach, though able to deal with Gaussian noise in feature detection, turns out not to be robust or precise, because it is vulnerable to gross feature detection errors or outliers resulting from partial occlusions or from spurious features from the background or neighboring objects. We address this problem by adopting a randomized hypothesis-and-test approach. First, a Bayesian inference algorithm is developed to generate a shape-and-pose hypothesis of the object from a partial shape, i.e., a subset of the feature points. For alignment, a large number of hypotheses are generated by randomly sampling subsets of feature points and are then evaluated to find the one that minimizes the shape prediction error. This method of randomized subset-based matching can effectively handle outliers and recover the correct object shape. We apply this approach to a challenging data set of over 5,000 differently posed car images, spanning a wide variety of car types, lighting, background scenes, and partial occlusions. Experimental results demonstrate favorable improvements over previous methods in both accuracy and robustness.
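The hypothesize-and-test loop has the same skeleton as RANSAC. The sketch below substitutes a closed-form similarity transform (Umeyama) for the paper's Bayesian shape-and-pose inference, so it should be read as the general pattern rather than the authors' algorithm; the subset size and iteration count are arbitrary.

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares similarity transform (scale, rotation, translation)
    mapping src -> dst; both (k, 2) arrays. Umeyama closed form."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    s, d = src - mu_s, dst - mu_d
    u, sig, vt = np.linalg.svd(d.T @ s / len(src))
    fix = np.ones(len(sig))
    if np.linalg.det(u @ vt) < 0:       # avoid reflections
        fix[-1] = -1
    r = u @ np.diag(fix) @ vt
    scale = (sig * fix).sum() * len(src) / (s ** 2).sum()
    return scale, r, mu_d - scale * (r @ mu_s)

def ransac_align(mean_shape, detections, n_iter=500, subset=3, seed=0):
    """Generate pose hypotheses from random feature subsets; keep the one
    with the smallest (robust) shape prediction error."""
    rng = np.random.default_rng(seed)
    best_err, best_pose = np.inf, None
    for _ in range(n_iter):
        idx = rng.choice(len(mean_shape), size=subset, replace=False)
        scale, r, t = fit_similarity(mean_shape[idx], detections[idx])
        pred = scale * (mean_shape @ r.T) + t
        err = np.median(np.linalg.norm(pred - detections, axis=1))
        if err < best_err:
            best_err, best_pose = err, (scale, r, t)
    return best_pose, best_err
```

The median prediction error makes the score insensitive to a minority of grossly wrong detections, which is exactly the failure mode that defeats whole-shape Gaussian fitting.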
TRANSGENIC MOUSE MODELS AND PARTICULATE MATTER (PM)
The hypothesis to be tested is that metal-catalyzed oxidative stress can contribute to the biological effects of particulate matter. We acquired several transgenic mouse strains to test this hypothesis. Breeding of the mice was accomplished by Duke University. Particles employed ...
Hypothesis Testing Using the Films of the Three Stooges
ERIC Educational Resources Information Center
Gardner, Robert; Davidson, Robert
2010-01-01
The use of The Three Stooges' films as a source of data in an introductory statistics class is described. The Stooges' films are separated into three populations. Using these populations, students may conduct hypothesis tests with data they collect.
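As a flavor of the exercise, the sketch below runs one such classroom test on made-up numbers (a two-sample comparison between two of the three Stooges film populations); the abstract does not specify which statistics students compute, so both the variable and its values are hypothetical.

```python
from scipy import stats

# Hypothetical running times (minutes) sampled from two of the three
# film populations, e.g., Curly-era vs. Shemp-era shorts.
curly = [16.5, 17.0, 16.0, 18.0, 17.5, 16.8, 17.2]
shemp = [15.5, 16.0, 15.8, 16.2, 15.0, 16.4, 15.9]

t_stat, p_value = stats.ttest_ind(curly, shemp, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
# Reject H0 (equal mean running times) at alpha = 0.05 if p_value < 0.05.
```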
Hovick, Stephen M; Whitney, Kenneth D
2014-01-01
The hypothesis that interspecific hybridisation promotes invasiveness has received much recent attention, but tests of the hypothesis can suffer from important limitations. Here, we provide the first systematic review of studies experimentally testing the hybridisation-invasion (H-I) hypothesis in plants, animals and fungi. We identified 72 hybrid systems for which hybridisation has been putatively associated with invasiveness, weediness or range expansion. Within this group, 15 systems (comprising 34 studies) experimentally tested performance of hybrids vs. their parental species and met our other criteria. Both phylogenetic and non-phylogenetic meta-analyses demonstrated that wild hybrids were significantly more fecund and larger than their parental taxa, but did not differ in survival. Resynthesised hybrids (which typically represent earlier generations than do wild hybrids) did not consistently differ from parental species in fecundity, survival or size. Using meta-regression, we found that fecundity increased (but survival decreased) with generation in resynthesised hybrids, suggesting that natural selection can play an important role in shaping hybrid performance – and thus invasiveness – over time. We conclude that the available evidence supports the H-I hypothesis, with the caveat that our results are clearly driven by tests in plants, which are more numerous than tests in animals and fungi. PMID:25234578
The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.
Lash, Timothy L
2017-09-15
In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Luo, Jing; Wang, An-Lu; Xu, Hao; Shi, Da-Zhuo; Chen, Ke-Ji
2016-11-01
Stenosis of the coronary artery has been considered an essential component of ischemic heart disease (IHD). Consequently, revascularization [e.g., percutaneous coronary intervention (PCI) and coronary artery bypass] has been the primary therapeutic approach to IHD, and this strategy has indeed revolutionized the management of IHD patients. However, not all patients with myocardial ischemia have visible coronary stenosis. Moreover, cardiovascular events occur in nearly 20% of patients with stable coronary artery disease who have undergone PCI. The recently proposed "solar system" hypothesis of IHD postulates that coronary stenosis is only one (albeit important) of its features. Mechanistic contributions and clinical implications of multiple pathophysiological processes beyond coronary stenosis are highlighted in this hypothesis. On the basis of holistic regulation and individualized medicine, Chinese medicine (CM) has been used in real-world settings to manage a variety of diseases, including IHD, for more than two thousand years. In this article, we summarize the evidence from CM that supports the "solar system" IHD hypothesis and argue for a comprehensive approach to IHD. At the theoretical level, the central features of this approach include a holistic view of disease and human subjects, as well as individualized medicine. At the practical level, this approach emphasizes anoxia tolerance and self-healing.
Entropy generation in biophysical systems
NASA Astrophysics Data System (ADS)
Lucia, U.; Maino, G.
2013-03-01
Recently, in theoretical biology and in biophysical engineering, the entropy production has been shown to approach its maximum rate asymptotically, using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. The latter quantity is obtained from the entropy balance for open systems, considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is proved analytically, and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.
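For reference, the entropy balance the abstract builds on can be written in a standard form (the notation below is conventional, not necessarily the authors'): the generation term is the non-negative remainder of the balance, accumulated over the lifetime τ of the process.

```latex
\frac{dS}{dt}
  = \sum_{i}\frac{\dot{Q}_i}{T_i}
  + \sum_{\mathrm{in}} \dot{m}\, s
  - \sum_{\mathrm{out}} \dot{m}\, s
  + \dot{S}_{\mathrm{gen}},
\qquad
\dot{S}_{\mathrm{gen}} \ge 0,
\qquad
S_{\mathrm{gen}} = \int_{0}^{\tau} \dot{S}_{\mathrm{gen}}\, dt .
```

The maximum-rate hypothesis then concerns the behavior of the generation rate as the open system approaches its stationary state.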
Gray, Wayne D; Sims, Chris R; Fu, Wai-Tat; Schoelles, Michael J
2006-07-01
The soft constraints hypothesis (SCH) is a rational analysis approach holding that the mixture of perceptual-motor and cognitive resources allocated for interactive behavior is adjusted based on temporal cost-benefit tradeoffs. Alternative approaches maintain that cognitive resources are in some sense protected or conserved, in that greater amounts of perceptual-motor effort will be expended to conserve lesser amounts of cognitive effort. One alternative, the minimum memory hypothesis (MMH), holds that people favor strategies that minimize the use of memory. SCH is compared with MMH across 3 experiments and with the predictions of an Ideal Performer Model that uses ACT-R's memory system in a reinforcement learning approach that maximizes expected utility by minimizing time. Model and data support the SCH view of resource allocation; at levels of analysis under 1000 ms, mixtures of cognitive and perceptual-motor resources are adjusted based on their cost-benefit tradeoffs for interactive behavior. ((c) 2006 APA, all rights reserved).
Tachinardi, Patricia; Valentinuzzi, Verónica S; Oda, Gisele A; Buck, C Loren
The tuco-tuco (Ctenomys aff. knighti) is among the rodent species known to be nocturnal under standard laboratory conditions and diurnal under natural conditions. The circadian thermoenergetics (CTE) hypothesis postulates that switches in activity timing are a response to energetic challenges; daytime activity reduces thermoregulatory costs by consolidating activity in the warmest part of the day. Studying wild animals under both captive and natural conditions can increase understanding of how temporal activity patterns are shaped by the environment and can serve as a test of the CTE hypothesis. We estimated the effects of activity timing on energy expenditure for the tuco-tuco by combining laboratory measurements of metabolic rate with environmental temperature records in both winter and summer. We showed that, in winter, there would be considerable energy savings if activity were allocated at least partially to daylight hours, lending support to the CTE hypothesis. In summer, the impact of activity timing on energy expenditure is small, suggesting that during this season other factors, such as predation risk, water balance, and social interaction, may play more important roles than energetics in determining activity time.
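The core calculation, combining a temperature-dependent metabolic rate with an hourly temperature trace under alternative activity schedules, can be sketched in a few lines. Every number below (lower critical temperature, slope, activity multiplier, temperature curve) is an illustrative placeholder, not a measured value from the study.

```python
import numpy as np

def metabolic_rate(t_air, rmr=1.0, t_lower_critical=28.0, slope=0.08):
    """Resting metabolic rate (relative units): thermoregulatory cost rises
    linearly below the lower critical temperature. Parameters are assumed."""
    return rmr + slope * np.maximum(t_lower_critical - t_air, 0.0)

hours = np.arange(24)
# Toy winter day: coldest pre-dawn (~2 C at 03:00), warmest mid-afternoon (~18 C).
t_air = 10.0 + 8.0 * np.sin((hours - 9) / 24.0 * 2.0 * np.pi)
activity_multiplier = 2.0          # activity scales metabolic rate (assumed)

def daily_energy(active_hours):
    mult = np.where(np.isin(hours, list(active_hours)), activity_multiplier, 1.0)
    return float((metabolic_rate(t_air) * mult).sum())

diurnal = daily_energy(range(10, 18))
nocturnal = daily_energy(list(range(0, 4)) + list(range(20, 24)))
print(f"diurnal: {diurnal:.1f}  nocturnal: {nocturnal:.1f}  (relative units)")
# Under the CTE hypothesis, the diurnal schedule should be cheaper in winter.
```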
Watts, Joseph; Greenhill, Simon J.; Atkinson, Quentin D.; Currie, Thomas E.; Bulbulia, Joseph; Gray, Russell D.
2015-01-01
Supernatural belief presents an explanatory challenge to evolutionary theorists: it is both costly and prevalent. One influential functional explanation claims that the imagined threat of supernatural punishment can suppress selfishness and enhance cooperation. Specifically, morally concerned supreme deities, or 'moralizing high gods' (MHGs), have been argued to reduce free-riding in large social groups, enabling believers to build the kind of complex societies that define modern humanity. Previous cross-cultural studies claiming to support the MHG hypothesis rely on correlational analyses only and do not correct for the statistical non-independence of sampled cultures. Here we use a Bayesian phylogenetic approach with a sample of 96 Austronesian cultures to test the MHG hypothesis as well as an alternative supernatural punishment hypothesis that allows punishment by a broad range of moralizing agents. We find evidence that broad supernatural punishment drives political complexity, whereas MHGs follow political complexity. We suggest that the concept of MHGs diffused as part of a suite of traits arising from cultural exchange between complex societies. Our results show the power of phylogenetic methods to address long-standing debates about the origins and functions of religion in human society. PMID:25740888
2010-01-01
Background Despite constant progress, cancer remains the second leading cause of death in the United States. The ability of tumors to metastasize is central to this dilemma, as many studies show that successful treatment correlates with diagnosis prior to cancer spread. Hence, a better understanding of cancer invasiveness and metastasis could provide critical insight. Presentation of the hypothesis We hypothesize that a systems biology-based comparison of cancer invasiveness and suburban sprawl will reveal instructive similarities. Testing the hypothesis We compare the structure and behavior of invasive cancer to suburban sprawl development. While these two systems differ vastly in dimension, they appear to adhere to scale-invariant laws consistent with invasive behavior in general. We demonstrate that cancer and sprawl have striking similarities in their natural history, initiating factors, patterns of invasion, vessel distribution, and even methods of causing death. Implications of the hypothesis We propose that metastatic cancer and suburban sprawl are striking analogs in invasive behavior, to the extent that conclusions from one system could be predictive of behavior in the other. We suggest ways in which this model could be used to advance our understanding of cancer biology and treatment. PMID:20181145
The Impact of Economic Factors and Acquisition Reforms on the Cost of Defense Weapon Systems
2006-03-01
To test for homoskedasticity, the Breusch-Pagan test is employed. The null hypothesis of the Breusch-Pagan test is that the variance is constant... Using the Breusch-Pagan test shown in Table 19 below, the Prob > chi2 is greater than α = .05; therefore, we fail to reject the null hypothesis... Table 19. Breusch-Pagan test (H0: constant variance) for the overrunpercentfp100 model: estimated variance and standard deviation of overrunpercent100.
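For readers who want the mechanics of this diagnostic: the test regresses squared OLS residuals on the explanatory variables and rejects constant variance when they explain too much. A minimal sketch with synthetic, homoskedastic data (the thesis's own variables are replaced by stand-ins here):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
x = rng.normal(size=(200, 2))
y = 1.0 + x @ np.array([0.5, -0.3]) + rng.normal(size=200)  # constant-variance errors

exog = sm.add_constant(x)
resid = sm.OLS(y, exog).fit().resid
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(resid, exog)
print(f"LM p-value = {lm_pvalue:.3f}")  # > .05 -> fail to reject constant variance
```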
ERIC Educational Resources Information Center
Tryon, Warren W.; Lewis, Charles
2008-01-01
Evidence of group matching frequently takes the form of a nonsignificant test of statistical difference. Theoretical hypotheses of no difference are also tested in this way. These practices are flawed in that null hypothesis statistical testing provides evidence against the null hypothesis and failing to reject H0 is not evidence…
Effects of Item Exposure for Conventional Examinations in a Continuous Testing Environment.
ERIC Educational Resources Information Center
Hertz, Norman R.; Chinn, Roberta N.
This study explored the effect of item exposure on two conventional examinations administered as computer-based tests. A principal hypothesis was that item exposure would have little or no effect on average difficulty of the items over the course of an administrative cycle. This hypothesis was tested by exploring conventional item statistics and…
ERIC Educational Resources Information Center
McNeil, Keith
The use of directional and nondirectional hypothesis testing was examined from the perspectives of textbooks, journal articles, and members of editorial boards. Three widely used statistical texts were reviewed in terms of how directional and nondirectional tests of significance were presented. Texts reviewed were written by: (1) D. E. Hinkle, W.…
Ikehara, Kenji
2016-01-01
It is no doubt quite difficult to solve the riddle of the origin of life. So, firstly, I would like to point out the kinds of obstacles there are in solving this riddle and how we should tackle these difficult problems, reviewing the studies that have been conducted so far. After that, I will propose that the consecutive evolutionary steps in a timeline can be rationally deduced by using a common event as a juncture, which is obtained by two counter-directional approaches: one is the bottom-up approach through which many researchers have studied the origin of life, and the other is the top-down approach, through which I established the [GADV]-protein world hypothesis or GADV hypothesis on the origin of life starting from a study on the formation of entirely new genes in extant microorganisms. Last, I will describe the probable evolutionary process from the formation of Earth to the emergence of life, which was deduced by using a common event—the establishment of the first genetic code encoding [GADV]-amino acids—as a juncture for the results obtained from the two approaches. PMID:26821048
The Feminization of School Hypothesis Called into Question among Junior and High School Students
ERIC Educational Resources Information Center
Verniers, Catherine; Martinot, Delphine; Dompnier, Benoît
2016-01-01
Background: The feminization of school hypothesis suggests that boys underachieve in school compared to girls because school rewards feminine characteristics that are at odds with boys' masculine features. Aims: The feminization of school hypothesis lacks empirical evidence. The aim of this study was to test this hypothesis by examining the extent…
Supporting shared hypothesis testing in the biomedical domain.
Agibetov, Asan; Jiménez-Ruiz, Ernesto; Ondrésik, Marta; Solimando, Alessandro; Banerjee, Imon; Guerrini, Giovanna; Catalano, Chiara E; Oliveira, Joaquim M; Patanè, Giuseppe; Reis, Rui L; Spagnuolo, Michela
2018-02-08
Pathogenesis of inflammatory diseases can be tracked by studying the causality relationships among the factors contributing to disease development. We could, for instance, hypothesize about the connections between the pathogenesis outcomes and the observed conditions. To prove such causal hypotheses, we would need a full understanding of the causal relationships and would have to provide all the necessary evidence to support our claims. In practice, however, we might not possess all the background knowledge on the causality relationships, and we might be unable to collect all the evidence to prove our hypotheses. In this work we propose a methodology for translating biological knowledge on causality relationships of biological processes, and their effects on conditions, into a computational framework for hypothesis testing. The methodology consists of two main points: hypothesis graph construction from the formalization of the background knowledge on causality relationships, and confidence measurement in a causality hypothesis as a normalized weighted path computation in the hypothesis graph. In this framework, we can simulate the collection of evidence and assess confidence in a causality hypothesis by measuring it in proportion to the amount of available knowledge and collected evidence. We evaluate our methodology on a hypothesis graph that represents both the contributing factors which may cause cartilage degradation and the factors which might be caused by cartilage degradation during osteoarthritis. Hypothesis graph construction has proven robust to the addition of potentially contradictory information on simultaneous positive and negative effects. The obtained confidence measures for the specific causality hypotheses have been validated by our domain experts and correspond closely to their subjective assessments of confidence in the investigated hypotheses. Overall, our methodology for a shared hypothesis-testing framework exhibits important properties that researchers will find useful in literature review for their experimental studies, in planning and prioritizing evidence collection procedures, and in testing their hypotheses with different depths of knowledge on the causal dependencies of biological processes and their effects on the observed conditions.
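A toy version of the graph computation makes the idea concrete. The sketch below scores a causal hypothesis by the best product of edge weights along any directed path, one simple instance of a normalized weighted path score; the node names, weights, and scoring rule are illustrative assumptions, not the paper's exact formulation.

```python
import networkx as nx

g = nx.DiGraph()
# Edge weights in [0, 1]: strength of knowledge/evidence for each causal link.
g.add_weighted_edges_from([
    ("inflammation", "enzyme_activity", 0.9),
    ("enzyme_activity", "cartilage_degradation", 0.7),
    ("inflammation", "cartilage_degradation", 0.4),
    ("cartilage_degradation", "joint_pain", 0.8),
])

def path_confidence(graph, source, target):
    """Best product-of-weights path: treat each edge weight as an
    independent evidence strength and multiply along the path."""
    best = 0.0
    for path in nx.all_simple_paths(graph, source, target):
        score = 1.0
        for u, v in zip(path, path[1:]):
            score *= graph[u][v]["weight"]
        best = max(best, score)
    return best

print(path_confidence(g, "inflammation", "joint_pain"))  # 0.9*0.7*0.8 = 0.504
```

Collecting new evidence corresponds to raising edge weights, which monotonically raises the confidence of every hypothesis whose best path uses those edges.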
The limits to pride: A test of the pro-anorexia hypothesis.
Cornelius, Talea; Blanton, Hart
2016-01-01
Many social psychological models propose that positive self-conceptions promote self-esteem. An extreme version of this hypothesis is advanced in "pro-anorexia" communities: identifying with anorexia, in conjunction with disordered eating, can lead to higher self-esteem. The current study empirically tested this hypothesis. Results challenge the pro-anorexia hypothesis. Although those with higher levels of pro-anorexia identification trended towards higher self-esteem with increased disordered eating, this did not overcome the strong negative main effect of pro-anorexia identification. These data suggest a more effective strategy for promoting self-esteem is to encourage rejection of disordered eating and an anorexic identity.
Does the Slow-Growth, High-Mortality Hypothesis Apply Below Ground?
Hourston, James E; Bennett, Alison E; Johnson, Scott N; Gange, Alan C
2016-01-01
Belowground tri-trophic study systems present a challenging environment in which to study plant-herbivore-natural enemy interactions. For this reason, belowground examples are rarely available for testing general ecological theories. To redress this imbalance, we present, for the first time, data on a belowground tri-trophic system to test the slow-growth, high-mortality hypothesis. We investigated whether the differing performance of entomopathogenic nematodes (EPNs) in controlling the common pest black vine weevil Otiorhynchus sulcatus could be linked to differently resistant cultivars of the red raspberry Rubus idaeus. The O. sulcatus larvae recovered from R. idaeus plants showed significantly slower growth and higher mortality on the Glen Rosa cultivar, relative to the more commercially favored Glen Ample cultivar, creating a convenient system for testing this hypothesis. Heterorhabditis megidis was found to be less effective at controlling O. sulcatus than Steinernema kraussei, but conformed to the hypothesis. However, S. kraussei maintained high levels of O. sulcatus mortality regardless of how larval growth was influenced by R. idaeus cultivar. We link this to direct effects of S. kraussei in reducing O. sulcatus larval mass, indicating potential sub-lethal effects of S. kraussei, which the slow-growth, high-mortality hypothesis does not account for. Possible origins of these sub-lethal effects of EPN infection, and how they may bear on a hypothesis designed and tested with aboveground predator and parasitoid systems, are discussed.
Age-Related Impairment on a Forced-Choice Version of the Mnemonic Similarity Task
Huffman, Derek J.; Stark, Craig E. L.
2018-01-01
Previous studies from our lab have indicated that healthy older adults are impaired in their ability to mnemonically discriminate between previously viewed objects and similar lure objects in the Mnemonic Similarity Task (MST). These studies have used either old/similar/new or old/new test formats. The forced-choice test format (e.g., “Did you see object A or object A’ during the encoding phase?”) relies on different assumptions than the old/new test format (e.g., “Did you see this object during the encoding phase?”); hence, converging evidence from these approaches would bolster the conclusion that healthy aging is accompanied by impaired performance on the MST. Consistent with our hypothesis, healthy older adults exhibited impaired performance on a forced-choice test format that required discriminating between a target and a similar lure. We also tested the hypothesis that age-related impairments on the MST could be modeled within a global matching computational framework. We found that decreasing the probability of successful feature encoding in the models caused changes that were similar to the empirical data in healthy older adults. Collectively, our behavioral results extend to the forced-choice test format the finding that healthy aging is accompanied by an impaired ability to discriminate between targets and similar lures, and our modeling results suggest that a diminished probability of encoding stimulus features is a candidate mechanism for memory changes in healthy aging. We also discuss the ability of global matching models to account for findings in other studies that have used variants on mnemonic similarity tasks. PMID:28004951
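A bare-bones global matching simulation shows how lowering the feature-encoding probability alone degrades forced-choice discrimination of targets from similar lures. The model below is MINERVA 2-flavored, and every parameter (feature count, lure overlap, encoding probabilities) is an assumption of the sketch, not a value from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
N_FEATURES = 40

def encode(item, p_encode):
    """Store a trace, copying each feature with probability p_encode."""
    return item * (rng.random(N_FEATURES) < p_encode)

def echo_intensity(probe, traces):
    """Global match: sum of cubed trace-probe similarities (MINERVA 2 style)."""
    sims = traces @ probe / N_FEATURES
    return np.sum(sims ** 3)

def forced_choice_accuracy(p_encode, n_trials=2000, lure_overlap=0.8):
    correct = 0
    for _ in range(n_trials):
        target = rng.choice([-1, 1], N_FEATURES)
        lure = target.copy()
        lure[rng.random(N_FEATURES) > lure_overlap] *= -1  # similar lure
        traces = np.array([encode(target, p_encode)])
        if echo_intensity(target, traces) > echo_intensity(lure, traces):
            correct += 1
    return correct / n_trials

for p in (0.9, 0.6, 0.3):      # lower p_encode stands in for healthy aging
    print(f"p_encode = {p}: accuracy = {forced_choice_accuracy(p):.3f}")
```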
Energetics, kinetics, and pathway of SNARE folding and assembly revealed by optical tweezers.
Zhang, Yongli
2017-07-01
Soluble N-ethylmaleimide-sensitive factor attachment protein receptors (SNAREs) are universal molecular engines that drive membrane fusion. In particular, synaptic SNAREs mediate fast calcium-triggered fusion of neurotransmitter-containing vesicles with plasma membranes for synaptic transmission, the basis of all thought and action. During membrane fusion, complementary SNAREs located on two apposed membranes (often called t- and v-SNAREs) join together to assemble into a parallel four-helix bundle, releasing the energy to overcome the energy barrier for fusion. A long-standing hypothesis suggests that SNAREs act like a zipper to draw the two membranes into proximity and thereby force them to fuse. However, a quantitative test of this SNARE zippering hypothesis has been hindered by difficulties in determining the energetics and kinetics of SNARE assembly and in identifying the relevant folding intermediates. Here, we first review the different approaches that have been applied to study SNARE assembly and then focus on high-resolution optical tweezers. We summarize the folding energies, kinetics, and pathways of both wild-type and mutant SNARE complexes derived from this new approach. These results show that synaptic SNAREs assemble in four distinct stages with different functions: slow N-terminal domain association initiates SNARE assembly; a middle domain suspends and controls SNARE assembly; and rapid sequential zippering of the C-terminal domain and the linker domain directly drives membrane fusion. In addition, the kinetics and pathway of this stagewise assembly are shared by other SNARE complexes. These measurements prove the SNARE zippering hypothesis and suggest new mechanisms for SNARE assembly regulated by other proteins. © 2017 The Protein Society.
Connolly, Brian; Matykiewicz, Pawel; Bretonnel Cohen, K; Standridge, Shannon M; Glauser, Tracy A; Dlugos, Dennis J; Koh, Susan; Tham, Eric; Pestian, John
2014-01-01
Constant progress in computational linguistic methods provides remarkable opportunities for discovering information in clinical text and enables the clinical scientist to explore novel approaches to care. However, these new approaches need evaluation. We describe an automated system to compare descriptions of epilepsy patients at three different organizations: Cincinnati Children's Hospital, the Children's Hospital Colorado, and the Children's Hospital of Philadelphia. To our knowledge, there have been no similar previous studies. In this work, a support vector machine (SVM)-based natural language processing (NLP) algorithm is trained to classify epilepsy progress notes as belonging to a patient with a specific type of epilepsy from a particular hospital. The same SVM is then used to classify notes from another hospital. Our null hypothesis is that an NLP algorithm cannot be trained using epilepsy-specific notes from one hospital and subsequently used to classify notes from another hospital better than a random baseline classifier. The hypothesis is tested using epilepsy progress notes from the three hospitals, and we are able to reject the null hypothesis at the 95% level. We also found that classification was improved by including notes from a second hospital in the SVM training sample. With a reasonably uniform epilepsy vocabulary and an NLP-based algorithm able to use this uniformity to classify epilepsy progress notes across different hospitals, we can pursue automated comparisons of patient conditions, treatments, and diagnoses across different healthcare settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
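The cross-site design reduces to a familiar train-here, test-there pattern. A minimal sketch with invented two-note corpora (the study used real progress notes, richer feature engineering, and a proper random baseline for comparison):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical stand-ins for de-identified progress notes at two hospitals.
hospital_a_notes = [
    "generalized tonic clonic seizures, started levetiracetam",
    "focal seizures with impaired awareness, EEG shows left temporal spikes",
]
hospital_a_labels = ["generalized", "focal"]
hospital_b_notes = [
    "tonic clonic convulsion this month, continues on keppra",
    "focal dyscognitive events, MRI lesion left temporal lobe",
]
hospital_b_labels = ["generalized", "focal"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(hospital_a_notes, hospital_a_labels)           # train at hospital A
print(clf.score(hospital_b_notes, hospital_b_labels))  # evaluate at hospital B
# The null hypothesis says this transfer accuracy is no better than chance.
```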
Calculation of free turbulent mixing by interaction approach.
NASA Technical Reports Server (NTRS)
Morel, T.; Torda, T. P.
1973-01-01
The applicability of Bradshaw's interaction hypothesis to two-dimensional free shear flows was investigated. According to this hypothesis, flows with velocity extrema may be considered to consist of several interacting layers. The hypothesis leads to a new expression for the shear stress which removes the usual restriction that the shear stress vanishes at the velocity extremum. The approach is based on the kinetic energy and length-scale equations. The compressible flow equations are simplified by restriction to low Mach numbers, and the range of their applicability is discussed. The empirical functions of the turbulence model are found here to be correlated with the spreading rate of the shear layer. The analysis demonstrates that the interaction hypothesis is a workable concept.
Improvements in the simulation code of the SOX experiment
NASA Astrophysics Data System (ADS)
Caminata, A.; Agostini, M.; Altenmüeller, K.; Appel, S.; Atroshchenko, V.; Bellini, G.; Benziger, J.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Carlini, M.; Cavalcante, P.; Chepurnov, A.; Choi, K.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; Di Noto, L.; Drachnev, I.; Durero, M.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Gschwender, M.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, Th.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonquères, N.; Jany, A.; Jedrzejczak, K.; Jeschke, D.; Kobychev, V.; Korablev, D.; Korga, G.; Kornoukhov, V.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Manuzio, G.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Mosteiro, P.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pallavicini, M.; Papp, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Veyssiére, C.; Vishneva, A.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.
2017-09-01
The aim of the SOX experiment is to test the hypothesis of the existence of light sterile neutrinos through a short-baseline experiment. Electron antineutrinos will be produced by a high-activity source and detected in the Borexino experiment. Both an oscillometry approach and a conventional disappearance analysis will be performed; combined, they will allow SOX to investigate most of the anomaly region at 95% C.L. This paper focuses on the improvements made to the simulation code and on the techniques (calibrations) used to validate the results.
Williams, L. Keoki; Buu, Anne
2017-01-01
We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales, so that the empirical distribution of Fisher's combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic, so that the type I error rate is controlled. The simulation also shows that the proposed test maintains power at a level very close to that of the ideal analysis based on known latent phenotypes, while controlling the type I error. In contrast, conventional approaches, such as dichotomizing all observed phenotypes or treating them as continuous variables, could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis of the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power to identify markers that might not otherwise be chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
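The combining step is easy to sketch: compute Fisher's statistic, X = -2 * sum(log p_i), and compare it to an empirical null that respects the between-phenotype correlation rather than to the chi-squared distribution with 2k degrees of freedom that assumes independence. Here the correlation is taken as known (0.4), standing in for the paper's latent-response estimate, and the example p-values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
k, rho, n_sim = 3, 0.4, 20000

# Empirical null: correlated z-statistics under H0, combined many times.
cov = np.full((k, k), rho) + (1.0 - rho) * np.eye(k)
z_null = rng.multivariate_normal(np.zeros(k), cov, size=n_sim)
p_null = 2.0 * stats.norm.sf(np.abs(z_null))
null_dist = -2.0 * np.log(p_null).sum(axis=1)

observed = -2.0 * np.log(np.array([0.01, 0.04, 0.20])).sum()
p_emp = (null_dist >= observed).mean()
p_chi2 = stats.chi2.sf(observed, df=2 * k)   # assumes independent tests
print(f"empirical p = {p_emp:.4f}, naive chi2 p = {p_chi2:.4f}")
# Positive correlation fattens the null's tail, so the naive chi2 p-value
# is too small (anticonservative); the empirical null corrects for this.
```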
Dynamic sensor management of dispersed and disparate sensors for tracking resident space objects
NASA Astrophysics Data System (ADS)
El-Fallah, A.; Zatezalo, A.; Mahler, R.; Mehra, R. K.; Donatelli, D.
2008-04-01
Dynamic sensor management of dispersed and disparate sensors for space situational awareness presents daunting scientific and practical challenges, as it requires optimal and accurate track maintenance for all Resident Space Objects (RSOs) of interest. We demonstrate an approach to the space-based sensor management problem by extending a previously developed and tested sensor management objective function, the Posterior Expected Number of Targets (PENT), to disparate and dispersed sensors. This PENT extension, together with observation models for various sensor platforms and a Probability Hypothesis Density Particle Filter (PHD-PF) tracker, provides a powerful tool for tackling this challenging problem. We demonstrate the approach using simulations of RSO tracking by a Space Based Visible (SBV) sensor and ground-based radars.
USDA-ARS's Scientific Manuscript database
This study tests the hypothesis that phylogenetic classification can predict whether A. pullulans strains will produce useful levels of the commercial polysaccharide, pullulan, or the valuable enzyme, xylanase. To test this hypothesis, 19 strains of A. pullulans with previously described phenotypes...
Mert, Mehmet; Bölük, Gülden
2016-11-01
This study examines the impact of foreign direct investment (FDI) and the potential of renewable energy consumption on carbon dioxide (CO2) emissions in 21 Kyoto countries using unbalanced panel data. For this purpose, the Environmental Kuznets Curve (EKC) hypothesis was tested using panel cointegration analysis. Panel causality tests show that there are significant long-run causalities from the variables to carbon emissions, renewable energy consumption, fossil fuel energy consumption, and inflowing foreign direct investment. The results of our model support the pollution halo hypothesis, which states that FDI brings in clean technology and improves environmental standards. However, an inverted U-shaped relationship (EKC) was not supported by the estimated model for the 21 Kyoto countries. This means that economic growth cannot by itself ensure environmental protection, and environmental goals cannot await economic growth. Another important finding is that renewable energy consumption decreases carbon emissions. Based on the empirical results, some important policy implications emerge. Kyoto countries should stimulate FDI inflows and the use of renewable energy to mitigate air pollution and meet their emission targets.
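The inverted-U test underlying the EKC hypothesis amounts to checking the signs on a quadratic income term. A toy cross-sectional sketch with synthetic data (the paper itself used panel cointegration, not OLS on a single cross-section):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
gdp = rng.uniform(1.0, 10.0, 300)                    # income per capita, toy units
co2 = 0.5 + 1.2 * gdp - 0.08 * gdp ** 2 + rng.normal(0.0, 0.3, 300)

X = sm.add_constant(np.column_stack([gdp, gdp ** 2]))
fit = sm.OLS(co2, X).fit()
b0, b1, b2 = fit.params
if b1 > 0 and b2 < 0:                                # EKC pattern: inverted U
    print(f"inverted U with turning point at GDP = {-b1 / (2 * b2):.2f}")
else:
    print("no inverted-U (EKC) pattern in this sample")
```

In the study's panel setting the same sign pattern is examined on the long-run coefficients, and it was the absence of this pattern that led the authors to reject the EKC for the 21 Kyoto countries.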
Do Arctic breeding geese track or overtake a green wave during spring migration?
Si, Yali; Xin, Qinchuan; de Boer, Willem F; Gong, Peng; Ydenberg, Ronald C; Prins, Herbert H T
2015-03-04
Geese breeding in the Arctic have to do so in a short time window while having sufficient body reserves; hence, arrival time and body condition upon arrival largely influence breeding success. The green wave hypothesis posits that geese track a successively delayed spring flush of plant development on the way to their breeding sites. The green wave has been interpreted as representing either the onset of spring or the peak in nutrient biomass. However, geese tend to adopt a partial capital breeding strategy and might overtake the green wave to accomplish a timely arrival at the breeding site. To test the green wave hypothesis, we link the satellite-derived onset of spring and peak in nutrient biomass with the stopover schedules of individual Barnacle Geese. We find that geese track neither the onset of spring nor the peak in nutrient biomass. Rather, they arrive at the southernmost stopover site around the peak in nutrient biomass and gradually overtake the green wave to match their arrival at the breeding site with the local onset of spring, thereby ensuring that goslings benefit from the peak in nutrient biomass. Our approach for estimating plant development stages is critical in testing the migration strategies of migratory herbivores.
NASA Technical Reports Server (NTRS)
Porter, D. W.; Lefler, R. M.
1979-01-01
A generalized hypothesis testing approach is applied to the problem of tracking several objects where several different associations of data with objects are possible. Such problems occur, for instance, when attempting to distinctly track several aircraft maneuvering near each other or when tracking ships at sea. Conceptually, the problem is solved by first associating data with objects in a statistically reasonable fashion and then tracking with a bank of Kalman filters. The objects are assumed to have motion characterized by a fixed but unknown deterministic portion plus a random process portion modeled by a shaping filter. For example, an object might be assumed to have a mean straight-line path about which it maneuvers in a random manner. Several hypothesized associations of data with objects are possible because of ambiguity as to which object the data comes from, false alarm/detection errors, and possible uncertainty in the number of objects being tracked. The statistical likelihood function is computed for each possible hypothesized association of data with objects. Then the generalized likelihood is computed by maximizing the likelihood over the parameters that define the deterministic motion of each object.
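The hypothesis-scoring step can be illustrated in one dimension with a single scan of two measurements and two predicted object positions: each association hypothesis is scored by its Gaussian innovation likelihood, and the best one is kept. A full tracker would propagate each surviving hypothesis with its own Kalman filter over many scans; the predicted positions and standard deviations below are invented for the example.

```python
from itertools import permutations
import numpy as np
from scipy import stats

pred = np.array([10.0, 14.0])        # predicted object positions (assumed)
sigma = np.array([1.0, 1.5])         # innovation standard deviations (assumed)
measurements = np.array([10.8, 13.2])

best = None
for assoc in permutations(range(len(pred))):   # hypothesis: measurement j
    loglik = sum(                              # belongs to object assoc[j]
        stats.norm.logpdf(measurements[j], pred[assoc[j]], sigma[assoc[j]])
        for j in range(len(measurements))
    )
    if best is None or loglik > best[0]:
        best = (loglik, assoc)
print(f"most likely association: {best[1]}, log-likelihood {best[0]:.2f}")
```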
Jutla, Antarpreet; Huq, Anwar; Colwell, Rita R
2015-01-01
West Nile virus (WNV), a mosquito-borne and water-based disease, is increasingly a global threat to public health. Since its appearance in the northeastern United States in 1999, WNV has been reported in several states across the continental United States. The objective of this study is to highlight the role of hydroclimatic processes, estimated through satellite sensors, in capturing conditions for the emergence of the vectors in historically disease-free regions. We tested the hypothesis that an increase in surface temperature, in combination with intensification of vegetation and enhanced precipitation, leads to conditions favorable for vector (mosquito) growth. Analysis of land surface temperature (LST) patterns shows that temperature values >16°C, combined with heavy precipitation, may lead to an abundant mosquito population. This hypothesis was tested in West Virginia, where a sudden epidemic of WNV infection was reported in 2012. Our results emphasize the value of hydroclimatic processes estimated by satellite remote sensing, as well as continued environmental surveillance of mosquitoes, because when a vector-borne infection like WNV is discovered in contiguous regions, the risk of spread of WNV mosquitoes increases at points where appropriate hydroclimatic processes intersect with the vector niche.