Sample records for hypothesis testing framework

  1. Semantically enabled and statistically supported biological hypothesis testing with tissue microarray databases

    PubMed Central

    2011-01-01

    Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform relevant statistical tests on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses awaiting high throughput evaluation. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases with semantic web technologies. Data were represented in the Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQL queries to reflect the semantic structures of the hypotheses, and (3) performing statistical tests on the result sets returned by the SPARQL queries. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high throughput biological hypothesis testing. We believe that preliminary investigation can benefit from this approach before highly controlled experiments are performed. PMID:21342584
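
    For illustration, a minimal sketch of the query-then-test pattern this record describes, in Python with rdflib and SciPy. The graph file, the ex: predicates, and the group labels are hypothetical stand-ins; Xperanto-RDF's actual ontology and SPARQL templates are not reproduced here.

    ```python
    from rdflib import Graph, Namespace
    from scipy import stats

    EX = Namespace("http://example.org/tma#")  # hypothetical namespace

    g = Graph()
    g.parse("tma_experiments.rdf")  # hypothetical RDF export of TMA data

    # Step (2): a SPARQL query mirroring the semantic structure of a hypothesis.
    rows = g.query(
        """
        SELECT ?score ?group WHERE {
            ?core ex:stainingScore ?score ;
                  ex:patientGroup  ?group .
        }
        """,
        initNs={"ex": EX},
    )

    by_group = {}
    for score, group in rows:
        by_group.setdefault(str(group), []).append(float(score))

    # Step (3): a statistical test on the result sets (here, a two-sample t-test).
    t, p = stats.ttest_ind(by_group["groupA"], by_group["groupB"])
    print(f"t = {t:.3f}, p = {p:.4f}")
    ```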

  2. A two-step hierarchical hypothesis set testing framework, with applications to gene expression data on ordered categories

    PubMed Central

    2014-01-01

    Background In complex large-scale experiments, in addition to simultaneously considering a large number of features, multiple hypotheses are often being tested for each feature. This leads to a problem of multi-dimensional multiple testing. For example, in gene expression studies over ordered categories (such as time-course or dose-response experiments), interest is often in testing differential expression across several categories for each gene. In this paper, we consider a framework for testing multiple sets of hypotheses, which can be applied to a wide range of problems. Results We adopt the concept of the overall false discovery rate (OFDR) for controlling false discoveries on the hypothesis set level. Based on an existing procedure for identifying differentially expressed gene sets, we discuss a general two-step hierarchical hypothesis set testing procedure, which controls the overall false discovery rate under independence across hypothesis sets. In addition, we discuss the concept of the mixed-directional false discovery rate (mdFDR), and extend the general procedure to enable directional decisions for two-sided alternatives. We applied the framework to the case of microarray time-course/dose-response experiments, and proposed three procedures for testing differential expression and making multiple directional decisions for each gene. Simulation studies confirm the control of the OFDR and mdFDR by the proposed procedures under independence and positive correlations across genes. Simulation results also show that two of our new procedures achieve higher power than previous methods. Finally, the proposed methodology is applied to a microarray dose-response study, to identify 17β-estradiol-sensitive genes in breast cancer cells that are induced at low concentrations. Conclusions The framework we discuss provides a platform for multiple testing procedures covering situations involving two (or potentially more) sources of multiplicity. The framework is easy to use and adaptable to various practical settings that frequently occur in large-scale experiments. Procedures generated from the framework are shown to maintain control of the OFDR and mdFDR, quantities that are especially relevant in the case of multiple hypothesis set testing. The procedures work well in both simulations and real datasets, and are shown to have better power than existing methods. PMID:24731138
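
    As an illustration of the two-step, screen-then-test structure this record describes, here is a minimal Python sketch: Simes-combined p-values per gene are screened with Benjamini-Hochberg, and hypotheses within selected genes are then tested at an adjusted level. The q*R/G within-set level is one common choice, not necessarily the authors' exact OFDR/mdFDR procedures.

    ```python
    import numpy as np

    def simes(pvals):
        """Combine a gene's p-values into one screening p-value (Simes)."""
        p = np.sort(np.asarray(pvals))
        m = len(p)
        return float(np.min(m * p / np.arange(1, m + 1)))

    def bh_select(pvals, q):
        """Benjamini-Hochberg step-up selection at FDR level q."""
        p = np.asarray(pvals)
        m = len(p)
        order = np.argsort(p)
        passed = p[order] <= q * np.arange(1, m + 1) / m
        k = np.nonzero(passed)[0].max() + 1 if passed.any() else 0
        selected = np.zeros(m, dtype=bool)
        selected[order[:k]] = True
        return selected

    def two_step(p_matrix, q=0.05):
        """p_matrix: genes x within-gene hypotheses. Returns rejection mask."""
        screen = np.array([simes(row) for row in p_matrix])
        sel = bh_select(screen, q)
        R, G = sel.sum(), len(sel)
        level = q * R / G if R else 0.0  # within-set Bonferroni budget
        return np.array([row <= level / len(row) if s
                         else np.zeros(len(row), dtype=bool)
                         for row, s in zip(p_matrix, sel)])
    ```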

  3. An Argument Framework for the Application of Null Hypothesis Statistical Testing in Support of Research

    ERIC Educational Resources Information Center

    LeMire, Steven D.

    2010-01-01

    This paper proposes an argument framework for the teaching of null hypothesis statistical testing and its application in support of research. Elements of the Toulmin (1958) model of argument are used to illustrate the use of p values and Type I and Type II error rates in support of claims about statistical parameters and subject matter research…

  4. Perspectives on the Use of Null Hypothesis Statistical Testing. Part III: the Various Nuts and Bolts of Statistical and Hypothesis Testing

    ERIC Educational Resources Information Center

    Marmolejo-Ramos, Fernando; Cousineau, Denis

    2017-01-01

    The number of articles showing dissatisfaction with the null hypothesis statistical testing (NHST) framework has been progressively increasing over the years. Alternatives to NHST have been proposed, and the Bayesian approach seems to have achieved the greatest visibility. In this last part of the special issue, a few alternative…

  5. Supporting shared hypothesis testing in the biomedical domain.

    PubMed

    Agibetov, Asan; Jiménez-Ruiz, Ernesto; Ondrésik, Marta; Solimando, Alessandro; Banerjee, Imon; Guerrini, Giovanna; Catalano, Chiara E; Oliveira, Joaquim M; Patanè, Giuseppe; Reis, Rui L; Spagnuolo, Michela

    2018-02-08

    Pathogenesis of inflammatory diseases can be tracked by studying the causality relationships among the factors contributing to their development. We could, for instance, hypothesize on the connections of the pathogenesis outcomes to the observed conditions. To prove such causal hypotheses we would need a full understanding of the causal relationships, and we would have to provide all the necessary evidence to support our claims. In practice, however, we might not possess all the background knowledge on the causality relationships, and we might be unable to collect all the evidence to prove our hypotheses. In this work we propose a methodology for translating biological knowledge on causality relationships of biological processes, and their effects on conditions, into a computational framework for hypothesis testing. The methodology consists of two main points: hypothesis graph construction from the formalization of the background knowledge on causality relationships, and confidence measurement in a causality hypothesis as a normalized weighted path computation in the hypothesis graph. In this framework, we can simulate the collection of evidence and assess confidence in a causality hypothesis by measuring it proportionally to the amount of available knowledge and collected evidence. We evaluate our methodology on a hypothesis graph that represents both the contributing factors which may cause cartilage degradation and the factors which might be caused by cartilage degradation during osteoarthritis. Hypothesis graph construction has proven to be robust to the addition of potentially contradictory information on simultaneously positive and negative effects. The obtained confidence measures for the specific causality hypotheses were validated by our domain experts and correspond closely to their subjective assessments of confidence in the investigated hypotheses. Overall, our methodology for a shared hypothesis testing framework exhibits important properties that researchers will find useful in literature review for their experimental studies, in planning and prioritizing evidence acquisition procedures, and in testing their hypotheses with different depths of knowledge on the causal dependencies of biological processes and their effects on the observed conditions.
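
    To make the "confidence as a normalized weighted path" idea concrete, here is a small hedged sketch using networkx. The nodes, edge weights, and geometric-mean normalization are illustrative assumptions; the paper's hypothesis-graph construction and scoring are richer than this.

    ```python
    import networkx as nx

    g = nx.DiGraph()
    # Hypothetical causal edges: weight in [0, 1] encodes strength of support.
    g.add_weighted_edges_from([
        ("IL-1beta", "MMP-13", 0.9),
        ("MMP-13", "cartilage degradation", 0.8),
        ("cartilage degradation", "joint pain", 0.6),
    ])

    def path_confidence(graph, source, target):
        """Score 'source causes target' as the best length-normalized
        product of edge weights over all simple paths (geometric mean)."""
        best = 0.0
        for path in nx.all_simple_paths(graph, source, target):
            w = 1.0
            for u, v in zip(path, path[1:]):
                w *= graph[u][v]["weight"]
            best = max(best, w ** (1.0 / (len(path) - 1)))
        return best

    print(path_confidence(g, "IL-1beta", "joint pain"))  # ~0.76
    ```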

  6. Risk-Based, Hypothesis-Driven Framework for Hydrological Field Campaigns with Case Studies

    NASA Astrophysics Data System (ADS)

    Harken, B.; Rubin, Y.

    2014-12-01

    There are several stages in any hydrological modeling campaign, including formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration or plume travel time. These predictions often have significant bearing on a decision that must be made. Examples include how to allocate limited remediation resources between contaminated groundwater sites or where to place a waste repository site. Answering such questions depends on predictions of EPMs using forward models as well as levels of uncertainty related to these predictions. Uncertainty in EPM predictions stems from uncertainty in model parameters, which can be reduced by measurements taken in field campaigns. The costly nature of field measurements motivates a rational basis for determining a measurement strategy that is optimal with respect to the uncertainty in the EPM prediction. The tool of hypothesis testing allows this uncertainty to be quantified by computing the significance of the test resulting from a proposed field campaign, which in turn gives a rational basis for determining the campaign's optimality. This hypothesis testing framework is demonstrated and discussed using various synthetic case studies. This study involves contaminated aquifers where a decision must be made based on prediction of when a contaminant will arrive at a specified location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical amount of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. The optimality of different field campaigns is assessed by computing the significance of the test resulting from each one. Evaluating the level of significance achieved by a field campaign involves steps including likelihood-based inverse modeling and semi-analytical conditional particle tracking.
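
    A toy Monte Carlo version of the arrival-time test, under strong simplifying assumptions (advection only, uniform gradient and porosity; all numbers hypothetical). It shows how an ensemble of parameter draws turns the null hypothesis "the plume arrives before t_crit" into a computable probability; the paper's significance computation via inverse modeling and conditional particle tracking is far more involved.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def plume_arrival_probability(K_samples, L=100.0, grad=0.01, n_eff=0.3,
                                  t_crit=365.0):
        """P(arrival time <= t_crit) under an advection-only travel model.
        K_samples: draws of hydraulic conductivity [m/d] reflecting current
        parametric uncertainty (e.g., prior or post-campaign posterior)."""
        v = K_samples * grad / n_eff      # seepage velocity [m/d]
        t_arrival = L / v                 # advective travel time [d]
        return float(np.mean(t_arrival <= t_crit))

    K = rng.lognormal(mean=0.5, sigma=1.0, size=10_000)  # hypothetical draws
    p_before = plume_arrival_probability(K)
    print(f"P(arrival before t_crit) = {p_before:.3f}")
    ```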

  7. Correlates of androgens in wild male Barbary macaques: Testing the challenge hypothesis.

    PubMed

    Rincon, Alan V; Maréchal, Laëtitia; Semple, Stuart; Majolo, Bonaventura; MacLarnon, Ann

    2017-10-01

    Investigating causes and consequences of variation in hormonal expression is a key focus in behavioral ecology. Many studies have explored patterns of secretion of the androgen testosterone in male vertebrates, using the challenge hypothesis (Wingfield, Hegner, Dufty, & Ball, 1990; The American Naturalist, 136(6), 829-846) as a theoretical framework. Rather than the classic association of testosterone with male sexual behavior, this hypothesis predicts that high levels of testosterone are associated with male-male reproductive competition but also inhibit paternal care. The hypothesis was originally developed for birds, and subsequently tested in other vertebrate taxa, including primates. Such studies have explored the link between testosterone and reproductive aggression as well as other measures of mating competition, or between testosterone and aspects of male behavior related to the presence of infants. Very few studies have simultaneously investigated the links between testosterone and male aggression, other aspects of mating competition and infant-related behavior. We tested predictions derived from the challenge hypothesis in wild male Barbary macaques (Macaca sylvanus), a species with marked breeding seasonality and high levels of male-infant affiliation, providing a powerful test of this theoretical framework. Over 11 months, 251 hr of behavioral observations and 296 fecal samples were collected from seven adult males in the Middle Atlas Mountains, Morocco. Fecal androgen levels rose before the onset of the mating season, during a period of rank instability, and were positively related to group mating activity across the mating season. Androgen levels were unrelated to rates of male-male aggression in any period, but higher ranked males had higher levels in both the mating season and in the period of rank instability. Lower androgen levels were associated with increased rates of male-infant grooming during the mating and unstable periods. Our results generally support the challenge hypothesis and highlight the importance of considering individual species' behavioral ecology when testing this framework. © 2017 Wiley Periodicals, Inc.

  8. A Bayesian framework to estimate diversification rates and their variation through time and space

    PubMed Central

    2011-01-01

    Background Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling. PMID:22013891
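
    A note on the Bayes factor step: given (estimated) log marginal likelihoods of two candidate diversification models, the comparison reduces to a one-liner. The numbers below are hypothetical; obtaining the marginal likelihoods (e.g., by thermodynamic integration within the MCMC run) is the hard part.

    ```python
    import numpy as np

    def bayes_factor(log_ml_a, log_ml_b):
        """Bayes factor BF_ab and 2*ln(BF_ab) from log marginal likelihoods."""
        log_bf = log_ml_a - log_ml_b
        return np.exp(log_bf), 2.0 * log_bf

    # E.g., variable-rate vs. constant-rate birth-death (hypothetical values):
    bf, two_ln_bf = bayes_factor(-1234.6, -1241.3)
    # Kass & Raftery (1995): 2*ln(BF) > 10 counts as very strong evidence.
    print(f"BF = {bf:.1f}, 2ln(BF) = {two_ln_bf:.1f}")
    ```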

  9. Female employment and fertility in Peninsular Malaysia: the maternal role incompatibility hypothesis reconsidered.

    PubMed

    Mason, K O; Palan, V T

    1981-11-01

    Multivariate analysis of the 1974 Malaysian Fertility and Family Survey tests the hypothesis that an inverse relationship between women's work and fertility occurs only when there are serious conflicts between working and caring for children. The results are only partly consistent with the hypothesis and suggest that normative conflicts between working and mothering affect the employment-fertility relationship in Malaysia more than spatio-temporal conflicts do. The lack of consistent evidence for the hypothesis, as well as some conceptual problems, leads us to propose an alternative framework for understanding variation in the employment-fertility relationship, both in Malaysia and elsewhere. This framework incorporates ideas from the role incompatibility hypothesis but views the employment-fertility relationship as dependent not just on role conflicts but more generally on the structure of the household's socioeconomic opportunities.

  10. Genetics and recent human evolution.

    PubMed

    Templeton, Alan R

    2007-07-01

    Starting with "mitochondrial Eve" in 1987, genetics has played an increasingly important role in studies of the last two million years of human evolution. It initially appeared that genetic data resolved the basic models of recent human evolution in favor of the "out-of-Africa replacement" hypothesis in which anatomically modern humans evolved in Africa about 150,000 years ago, started to spread throughout the world about 100,000 years ago, and subsequently drove to complete genetic extinction (replacement) all other human populations in Eurasia. Unfortunately, many of the genetic studies on recent human evolution have suffered from scientific flaws, including misrepresenting the models of recent human evolution, focusing upon hypothesis compatibility rather than hypothesis testing, committing the ecological fallacy, and failing to consider a broader array of alternative hypotheses. Once these flaws are corrected, there is actually little genetic support for the out-of-Africa replacement hypothesis. Indeed, when genetic data are used in a hypothesis-testing framework, the out-of-Africa replacement hypothesis is strongly rejected. The model of recent human evolution that emerges from a statistical hypothesis-testing framework does not correspond to any of the traditional models of human evolution, but it is compatible with fossil and archaeological data. These studies also reveal that any one gene or DNA region captures only a small part of human evolutionary history, so multilocus studies are essential. As more and more loci become available, genetics will undoubtedly offer additional insights into, and resolution of, human evolution.

  11. A Bayesian Approach to the Paleomagnetic Conglomerate Test

    NASA Astrophysics Data System (ADS)

    Heslop, David; Roberts, Andrew P.

    2018-02-01

    The conglomerate test has served the paleomagnetic community for over 60 years as a means to detect remagnetizations. The test states that if a suite of clasts within a bed have uniformly random paleomagnetic directions, then the conglomerate cannot have experienced a pervasive event that remagnetized the clasts in the same direction. The current form of the conglomerate test is based on null hypothesis testing, which results in a binary "pass" (uniformly random directions) or "fail" (nonrandom directions) outcome. We have recast the conglomerate test in a Bayesian framework with the aim of providing more information concerning the level of support a given data set provides for a hypothesis of uniformly random paleomagnetic directions. Using this approach, we place the conglomerate test in a fully probabilistic framework that allows for inconclusive results when insufficient information is available to draw firm conclusions concerning the randomness or nonrandomness of directions. With our method, sample sets larger than those typically employed in paleomagnetism may be required to achieve strong support for a hypothesis of random directions. Given the potentially detrimental effect of unrecognized remagnetizations on paleomagnetic reconstructions, it is important to provide a means to draw statistically robust data-driven inferences. Our Bayesian analysis provides a means to do this for the conglomerate test.
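
    For contrast with the record's Bayesian approach, here is the classical frequentist analogue: a 3-D Rayleigh-type uniformity test on the resultant length R of the clast directions. Under uniform randomness, 3R²/n is asymptotically chi-square with 3 degrees of freedom; the Bayesian test in the paper instead weighs a uniform model against a unimodal (e.g., Fisher) alternative, which this sketch does not do.

    ```python
    import numpy as np
    from scipy import stats

    def rayleigh_uniformity(dec_deg, inc_deg):
        """Test H0: uniformly random directions via the resultant length R."""
        d, i = np.radians(dec_deg), np.radians(inc_deg)
        x, y, z = np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)
        n = len(d)
        R = np.sqrt(x.sum() ** 2 + y.sum() ** 2 + z.sum() ** 2)
        p = stats.chi2.sf(3.0 * R ** 2 / n, df=3)  # asymptotic null law
        return R, p
    ```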

  12. Spatio-temporal conditional inference and hypothesis tests for neural ensemble spiking precision

    PubMed Central

    Harrison, Matthew T.; Amarasingham, Asohan; Truccolo, Wilson

    2014-01-01

    The collective dynamics of neural ensembles create complex spike patterns with many spatial and temporal scales. Understanding the statistical structure of these patterns can help resolve fundamental questions about neural computation and neural dynamics. Spatio-temporal conditional inference (STCI) is introduced here as a semiparametric statistical framework for investigating the nature of precise spiking patterns from collections of neurons that is robust to arbitrarily complex and nonstationary coarse spiking dynamics. The main idea is to focus statistical modeling and inference, not on the full distribution of the data, but rather on families of conditional distributions of precise spiking given different types of coarse spiking. The framework is then used to develop families of hypothesis tests for probing the spatio-temporal precision of spiking patterns. Relationships among different conditional distributions are used to improve multiple hypothesis testing adjustments and to design novel Monte Carlo spike resampling algorithms. Of special note are algorithms that can locally jitter spike times while still preserving the instantaneous peri-stimulus time histogram (PSTH) or the instantaneous total spike count from a group of recorded neurons. The framework can also be used to test whether first-order maximum entropy models with possibly random and time-varying parameters can account for observed patterns of spiking. STCI provides a detailed example of the generic principle of conditional inference, which may be applicable in other areas of neurostatistical analysis. PMID:25380339
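
    A minimal sketch of one member of the jitter-null family the record builds on: spikes are re-drawn uniformly within fixed windows, preserving coarse rate structure at that resolution, and a Monte Carlo p-value is computed for any fine-timescale statistic. The PSTH- and count-preserving conditional resamplers in the paper are more constrained than this plain interval jitter.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def interval_jitter(spike_times, width=0.025):
        """Resample each spike uniformly within its own window of `width` s,
        preserving per-window spike counts (coarse rate)."""
        bins = np.floor(np.asarray(spike_times) / width)
        return np.sort((bins + rng.random(len(spike_times))) * width)

    def jitter_pvalue(stat_fn, spike_times, n_surrogates=999, width=0.025):
        """Monte Carlo p-value for a fine-timescale statistic under jitter."""
        obs = stat_fn(spike_times)
        null = [stat_fn(interval_jitter(spike_times, width))
                for _ in range(n_surrogates)]
        return (1 + sum(s >= obs for s in null)) / (n_surrogates + 1)
    ```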

  13. Knowledge dimensions in hypothesis test problems

    NASA Astrophysics Data System (ADS)

    Krishnan, Saras; Idris, Noraini

    2012-05-01

    The reformation in statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on formulas and calculation procedures, whereas conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework to describe learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from more connected understanding. This study identifies the factual, procedural, and conceptual knowledge dimensions in hypothesis test problems. The hypothesis test, an important tool for making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty understanding the underlying concepts of the hypothesis test. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale for executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of underlying inferential concepts such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems in this study, suitable instructional and assessment strategies can be developed in future to enhance students' learning of the hypothesis test as a valuable inferential tool.

  14. A computational study of the discretization error in the solution of the Spencer-Lewis equation by doubling applied to the upwind finite-difference approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, P.; Seth, D.L.; Ray, A.K.

    A detailed and systematic study of the nature of the discretization error associated with the upwind finite-difference method is presented. A basic model problem has been identified, and, based upon the results for this problem, a basic hypothesis regarding the accuracy of the computational solution of the Spencer-Lewis equation is formulated. The basic hypothesis is then tested under various systematic single complexifications of the basic model problem. The results of these tests provide the framework for the refined hypothesis presented in the concluding comments. 27 refs., 3 figs., 14 tabs.
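
    For readers unfamiliar with the scheme, a first-order upwind discretization of the model advection problem u_t + a u_x = 0 (a stand-in; the Spencer-Lewis transport equation is more involved) shows the O(Δx) truncation error whose behavior the record studies:

    ```python
    import numpy as np

    def upwind_advect(u0, a, dx, dt, steps):
        """First-order upwind scheme for u_t + a u_x = 0 with a > 0.
        Stable for Courant number c = a*dt/dx <= 1; the discretization
        error is O(dx), so halving dx roughly halves the error."""
        u = u0.copy()
        c = a * dt / dx
        for _ in range(steps):
            u[1:] -= c * (u[1:] - u[:-1])  # left (inflow) value held fixed
        return u
    ```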

  15. Mourning dove hunting regulation strategy based on annual harvest statistics and banding data

    USGS Publications Warehouse

    Otis, D.L.

    2006-01-01

    Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for hunting regulation decisions driven by the population growth rates they imply. I present a statistically rigorous approach to regulation decision-making using a hypothesis-testing framework and an assumed set of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
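
    A hedged sketch of how a growth-rate hypothesis test can drive a three-way regulation choice; the one-sided tests, thresholds, and decision rule here are illustrative assumptions, not the procedure in the record.

    ```python
    from scipy import stats

    def regulation_decision(lam_hat, se, alpha=0.05):
        """Choose among three regulation alternatives from an estimated
        population growth rate lambda-hat and its standard error."""
        z = (lam_hat - 1.0) / se
        if stats.norm.cdf(z) < alpha:   # reject H0: lambda >= 1 -> declining
            return "restrictive"
        if stats.norm.sf(z) < alpha:    # reject H0: lambda <= 1 -> growing
            return "liberal"
        return "moderate"

    print(regulation_decision(lam_hat=0.96, se=0.02))  # "restrictive"
    ```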

  16. Partial F-tests with multiply imputed data in the linear regression framework via coefficient of determination.

    PubMed

    Chaurasia, Ashok; Harel, Ofer

    2015-02-10

    Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
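
    The scalar reduction at the heart of the record can be sketched as follows. The naive averaging of R² across imputations is a placeholder assumption; the paper derives a properly pooled statistic rather than this simple mean.

    ```python
    import numpy as np
    from scipy import stats

    def partial_f_from_r2(r2_full, r2_reduced, n, p_full, q):
        """Partial F-statistic from coefficients of determination.
        q coefficients tested; p_full predictors in the full model."""
        num = (r2_full - r2_reduced) / q
        den = (1.0 - r2_full) / (n - p_full - 1)
        return num / den

    # Hypothetical per-imputation R2 values for full and reduced models:
    r2_full = float(np.mean([0.42, 0.44, 0.41, 0.43, 0.42]))
    r2_red = float(np.mean([0.37, 0.39, 0.36, 0.38, 0.37]))
    F = partial_f_from_r2(r2_full, r2_red, n=200, p_full=6, q=2)
    p = stats.f.sf(F, 2, 200 - 6 - 1)
    ```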

  17. Testing the single-state dominance hypothesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Álvarez-Rodríguez, R.; Moreno, O.; Moya de Guerra, E.

    2013-12-30

    We present a theoretical analysis of the single-state dominance hypothesis for the two-neutrino double-beta decay process. The theoretical framework is a proton-neutron QRPA based on a deformed Hartree-Fock mean field with BCS pairing correlations. We focus on the decays of ¹⁰⁰Mo, ¹¹⁶Cd, and ¹²⁸Te. We do not find clear evidence for single-state dominance within the present approach.

  18. Demonstration of risk based, goal driven framework for hydrological field campaigns and inverse modeling with case studies

    NASA Astrophysics Data System (ADS)

    Harken, B.; Geiges, A.; Rubin, Y.

    2013-12-01

    There are several stages in any hydrological modeling campaign, including formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and forward modeling and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration, plume travel time, or aquifer recharge rate. These predictions often have significant bearing on some decision that must be made. Examples include how to allocate limited remediation resources between multiple contaminated groundwater sites, where to place a waste repository site, and what extraction rates can be considered sustainable in an aquifer. Providing an answer to these questions depends on predictions of EPMs using forward models as well as levels of uncertainty related to these predictions. Uncertainty in model parameters, such as hydraulic conductivity, leads to uncertainty in EPM predictions. Often, field campaigns and inverse modeling efforts are planned and undertaken with reduction of parametric uncertainty as the objective. The tool of hypothesis testing allows this to be taken one step further by considering uncertainty reduction in the ultimate prediction of the EPM as the objective and gives a rational basis for weighing costs and benefits at each stage. When using the tool of statistical hypothesis testing, the EPM is cast into a binary outcome. This is formulated as null and alternative hypotheses, which can be accepted or rejected with statistical formality. When accounting for all sources of uncertainty at each stage, the level of significance of this test provides a rational basis for planning, optimization, and evaluation of the entire campaign. Case-specific information, such as the consequences of prediction error and site-specific costs, can be used in establishing selection criteria based on what level of risk is deemed acceptable. This framework is demonstrated and discussed using various synthetic case studies. The case studies involve contaminated aquifers where a decision must be made based on prediction of when a contaminant will arrive at a given location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical value of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. Different field campaigns are analyzed based on effectiveness in reducing the probability of selecting the wrong hypothesis, which in this case corresponds to reducing uncertainty in the prediction of plume arrival time. To examine the role of inverse modeling in this framework, case studies involving both Maximum Likelihood parameter estimation and Bayesian inversion are used.

  19. Frequency Spectrum Neutrality Tests: One for All and All for One

    PubMed Central

    Achaz, Guillaume

    2009-01-01

    Neutrality tests based on the frequency spectrum (e.g., Tajima's D or Fu and Li's F) are commonly used by population geneticists as routine tests to assess the goodness-of-fit of the standard neutral model on their data sets. Here, I show that these neutrality tests are specific instances of a general model that encompasses them all. I illustrate how this general framework can be taken advantage of to devise new more powerful tests that better detect deviations from the standard model. Finally, I exemplify the usefulness of the framework on SNP data by showing how it supports the selection hypothesis in the lactase human gene by overcoming the ascertainment bias. The framework presented here paves the way for constructing novel tests optimized for specific violations of the standard model that ultimately will help to unravel scenarios of evolution. PMID:19546320
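
    Tajima's D, one member of the family the record unifies, makes the "weighted frequency spectrum" idea concrete: it contrasts two estimators of θ and standardizes by constants from Tajima (1989). A sketch (assumes S > 0):

    ```python
    import numpy as np

    def tajimas_d(n, S, pi):
        """Tajima's D from n sequences, S segregating sites, and mean
        pairwise diversity pi (constants follow Tajima 1989)."""
        i = np.arange(1, n)
        a1, a2 = np.sum(1.0 / i), np.sum(1.0 / i ** 2)
        b1 = (n + 1) / (3.0 * (n - 1))
        b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
        c1 = b1 - 1.0 / a1
        c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
        e1, e2 = c1 / a1, c2 / (a1 ** 2 + a2)
        theta_w = S / a1  # Watterson's estimator
        return (pi - theta_w) / np.sqrt(e1 * S + e2 * S * (S - 1))
    ```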

  20. Does McNemar's test compare the sensitivities and specificities of two diagnostic tests?

    PubMed

    Kim, Soeun; Lee, Woojoo

    2017-02-01

    McNemar's test is often used in practice to compare the sensitivities and specificities of two diagnostic tests. For correct evaluation of accuracy, an intuitive recommendation is to test the diseased and the non-diseased groups separately, so that sensitivities are compared among the diseased and specificities are compared among the healthy group of people. This paper provides a rigorous theoretical framework for this argument and studies the validity of McNemar's test regardless of the conditional independence assumption. We derive McNemar's test statistic under the null hypothesis considering both assumptions of conditional independence and conditional dependence. We then perform power analyses to show how the result is affected by the amount of conditional dependence under the alternative hypothesis.
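
    The group-wise comparison the record recommends is easy to state in code: run McNemar's test on the discordant pairs within the diseased group (sensitivities) and, separately, within the non-diseased group (specificities). Counts below are hypothetical.

    ```python
    from scipy import stats

    def mcnemar(b, c):
        """McNemar chi-square (no continuity correction) from discordant
        counts b (test1 correct, test2 wrong) and c (the reverse)."""
        chi2 = (b - c) ** 2 / (b + c)
        return chi2, stats.chi2.sf(chi2, df=1)

    chi2_sens, p_sens = mcnemar(b=15, c=6)   # among the diseased
    chi2_spec, p_spec = mcnemar(b=9, c=11)   # among the non-diseased
    ```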

  1. Correcting power and p-value calculations for bias in diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Landman, Bennett A

    2013-07-01

    Diffusion tensor imaging (DTI) provides quantitative parametric maps sensitive to tissue microarchitecture (e.g., fractional anisotropy, FA). These maps are estimated through computational processes and subject to random distortions including variance and bias. Traditional statistical procedures commonly used for study planning (including power analyses and p-value/alpha-rate thresholds) specifically model variability, but neglect potential impacts of bias. Herein, we quantitatively investigate the impacts of bias in DTI on hypothesis test properties (power and alpha-rate) using a two-sided hypothesis testing framework. We present a theoretical evaluation of bias on hypothesis test properties, evaluate the bias estimation technique SIMEX for DTI hypothesis testing using simulated data, and evaluate the impacts of bias on spatially varying power and alpha rates in an empirical study of 21 subjects. Bias is shown to inflate alpha rates, distort the power curve, and cause significant power loss even in empirical settings where the expected difference in bias between groups is zero. These adverse effects can be attenuated by properly accounting for bias in the calculation of power and p-values. Copyright © 2013 Elsevier Inc. All rights reserved.
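
    The alpha-inflation effect is easy to reproduce with a two-sided z-test whose estimate carries additive bias; a short sketch under a normal approximation (not the SIMEX machinery of the paper):

    ```python
    from scipy import stats

    def two_sided_power(delta, se, alpha=0.05, bias=0.0):
        """Power of a two-sided z-test when the estimator has bias."""
        z = stats.norm.ppf(1 - alpha / 2)
        shift = (delta + bias) / se
        return stats.norm.sf(z - shift) + stats.norm.cdf(-z - shift)

    # Realized type I rate when H0 is true but bias is nonzero:
    print(two_sided_power(0.0, se=0.01, bias=0.005))  # ~0.079 > 0.05, inflated
    ```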

  2. Flexural strength and failure modes of layered ceramic structures.

    PubMed

    Borba, Márcia; de Araújo, Maico D; de Lima, Erick; Yoshimura, Humberto N; Cesar, Paulo F; Griggs, Jason A; Della Bona, Alvaro

    2011-12-01

    To evaluate the effect of the specimen design on the flexural strength (σf) and failure mode of ceramic structures, testing the hypothesis that the ceramic material under tension controls the mechanical performance of the structure. Three ceramics used as framework materials for fixed partial dentures (YZ: Vita In-Ceram YZ; IZ: Vita In-Ceram Zirconia; AL: Vita In-Ceram AL) and two veneering porcelains (VM7 and VM9) were studied. Bar-shaped specimens were produced in three different designs (n=10): monolithic, two layers (porcelain-framework), and three layers (TRI) (porcelain-framework-porcelain). Specimens were tested for three-point flexural strength at 1 MPa/s in 37°C artificial saliva. For the bi-layered design, the specimens were tested in both conditions: with the porcelain (PT) or the framework ceramic (FT) layer under tension. Fracture surfaces were analyzed using stereomicroscopy and scanning electron microscopy (SEM). Young's modulus (E) and Poisson's ratio (ν) were determined using the ultrasonic pulse-echo method. Results were statistically analyzed by Kruskal-Wallis and Student-Newman-Keuls tests. Except for VM7 and VM9, significant differences were observed in E values among the materials. YZ showed the highest ν value, followed by IZ and AL. YZ presented the highest σf. There was no statistical difference in the σf value between IZ and IZ-FT and between AL and AL-FT. σf values for YZ-PT, IZ-PT, IZ-TRI, AL-PT, and AL-TRI were similar to the results obtained for VM7 and VM9. Two types of fracture mode were identified: total and partial failure. The mechanical performance of the specimens was determined by the material under tension during testing, confirming the study hypothesis. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  3. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

    A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field, which permits analysis of future concepts under a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  4. Testing the Developmental Origins of Health and Disease Hypothesis for Psychopathology Using Family-Based Quasi-Experimental Designs

    PubMed Central

    D’Onofrio, Brian M.; Class, Quetzal A.; Lahey, Benjamin B.; Larsson, Henrik

    2014-01-01

    The Developmental Origins of Health and Disease (DOHaD) hypothesis is a broad theoretical framework that emphasizes how early risk factors have a causal influence on psychopathology. Researchers have raised concerns about the causal interpretation of statistical associations between early risk factors and later psychopathology because most existing studies have been unable to rule out the possibility of environmental and genetic confounding. In this paper we illustrate how family-based quasi-experimental designs can test the DOHaD hypothesis by ruling out alternative hypotheses. We review the logic underlying sibling-comparison, co-twin control, offspring of siblings/twins, adoption, and in vitro fertilization designs. We then present results from studies using these designs focused on broad indices of fetal development (low birth weight and gestational age) and a particular teratogen, smoking during pregnancy. The results provide mixed support for the DOHaD hypothesis for psychopathology, illustrating the critical need to use design features that rule out unmeasured confounding. PMID:25364377

  5. Formulating appropriate statistical hypotheses for treatment comparison in clinical trial design and analysis.

    PubMed

    Huang, Peng; Ou, Ai-hua; Piantadosi, Steven; Tan, Ming

    2014-11-01

    We discuss the problem of properly defining treatment superiority through the specification of hypotheses in clinical trials. The need to precisely define the notion of superiority in a one-sided hypothesis test problem has been well recognized by many authors. Ideally designed null and alternative hypotheses should correspond to a partition of all possible scenarios of underlying true probability models P = {P(ω) : ω ∈ Ω}, such that the alternative hypothesis H_a = {P(ω) : ω ∈ Ω_a} can be inferred upon the rejection of the null hypothesis H_o = {P(ω) : ω ∈ Ω_o}. However, in many cases, tests are carried out and recommendations are made without a precise definition of superiority or a specification of the alternative hypothesis. Moreover, in some applications, the union of probability models specified by the chosen null and alternative hypotheses does not constitute the complete model collection P (i.e., H_o ∪ H_a is smaller than P). This not only imposes a strong non-validated assumption on the underlying true models, but also leads to different superiority claims depending on which test is used, instead of scientific plausibility. Different ways to partition P for testing treatment superiority often have different implications on sample size, power, and significance in both efficacy and comparative effectiveness trial design. Such differences are often overlooked. We provide a theoretical framework for evaluating the statistical properties of different specifications of superiority in typical hypothesis testing. This can help investigators to select proper hypotheses for treatment comparison in clinical trial design. Copyright © 2014 Elsevier Inc. All rights reserved.
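
    The partition requirement can be written compactly; a sketch in LaTeX consistent with the abstract's notation:

    ```latex
    % A well-posed superiority test partitions the full model family P,
    % so that rejecting H_o licenses the inference H_a.
    \[
      H_o = \{P(\omega) : \omega \in \Omega_o\}, \qquad
      H_a = \{P(\omega) : \omega \in \Omega_a\},
    \]
    \[
      \Omega_o \cup \Omega_a = \Omega, \qquad
      \Omega_o \cap \Omega_a = \emptyset .
    \]
    ```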

  6. The Regression Hypothesis as a Framework for First Language Attrition

    ERIC Educational Resources Information Center

    Keijzer, Merel

    2010-01-01

    In an attempt to explain first language attrition in emigrant populations, this paper investigates the explanatory power of a framework that has--until now--received little attention: the regression hypothesis (Jakobson, 1941). This hypothesis predicts that the order of attrition is the reverse of the order of acquisition. The regression…

  7. FORUM: Dynamics and Causation of Environmental Equity, Locally Unwanted Land Uses, and Neighborhood Changes

    PubMed

    Liu

    1997-09-01

    Why are some environmental risks distributed disproportionately in the neighborhoods of the minorities and the poor? A hypothesis was proposed in a recent study that market dynamics contributed to the current environmental inequity. That is, locally unwanted land uses (LULUs) make the host communities home to more poor people and people of color. This hypothesis was allegedly supported by a Houston case study, whereby its author analyzed the postsiting changes in the socioeconomic characteristics of the neighborhoods surrounding solid waste facilities. I argue that such an analysis of postsiting changes alone is insufficient to test the causation hypothesis. Instead, I propose a conceptual framework for analysis of environmental equity dynamics and causation. I suggest that the presiting neighborhood dynamics and the characteristics of control neighborhoods be analyzed as the first test of the causation hypothesis. Furthermore, I present theories of neighborhood change and then examine the alternative hypotheses that these theories offer for explaining neighborhood changes and for the roles of LULUs in neighborhood changes. These alternative hypotheses should be examined when analyzing the relationship between LULUs and neighborhood changes in a metropolitan area. Using this framework of analysis, I revisited the Houston case. First, I found no evidence that provided support for the hypothesis that the presence of LULUs made the neighborhoods home to more blacks and poor people, contrary to the conclusion made by the previous study. Second, I examined alternative hypotheses for explaining neighborhood changes: invasion-succession, other push forces, and neighborhood life-cycle; the former two might offer better explanations. KEY WORDS: Environmental equity and justice; Locally unwanted land uses; Siting; Market dynamics; Invasion-succession; Neighborhood changes

  8. Continuous Improvement in the Industrial and Management Systems Engineering Programme at Kuwait University

    ERIC Educational Resources Information Center

    Aldowaisan, Tariq; Allahverdi, Ali

    2016-01-01

    This paper describes the process employed by the Industrial and Management Systems Engineering programme at Kuwait University to continuously improve the programme. Using a continuous improvement framework, the paper demonstrates how various qualitative and quantitative analysis methods, such as hypothesis testing and control charts, have been…

  9. Looking Good, Sounding Good: Femininity Ideology and Adolescent Girls' Mental Health

    ERIC Educational Resources Information Center

    Tolman, Deborah L.; Impett, Emily A.; Tracy, Allison J.; Michael, Alice

    2006-01-01

    This study used a feminist psychodynamic developmental framework to test the hypothesis that internalizing conventional femininity ideologies in two domains--inauthenticity in relationships and body objectification--is associated with early adolescent girls' mental health. One hundred forty-eight eighth-grade girls completed measures of femininity…

  10. Short- and long-term rhythmic interventions: perspectives for language rehabilitation.

    PubMed

    Schön, Daniele; Tillmann, Barbara

    2015-03-01

    This paper brings together different perspectives on the investigation and understanding of temporal processing and temporal expectations. We aim to bridge different temporal deficit hypotheses in dyslexia, dysphasia, or deafness in a larger framework, taking into account multiple nested temporal scales. We present data testing the hypothesis that temporal attention can be influenced by external rhythmic auditory stimulation (i.e., musical rhythm) and benefits subsequent language processing, including syntax processing and speech production. We also present data testing the hypothesis that phonological awareness can be influenced by several months of musical training and, more particularly, rhythmic training, which in turn improves reading skills. Together, our data support the hypothesis of a causal role of rhythm-based processing for language processing and acquisition. These results open new avenues for music-based remediation of language and hearing impairment. © 2015 New York Academy of Sciences.

  11. Lunar phases and crisis center telephone calls.

    PubMed

    Wilson, J E; Tobacyk, J J

    1990-02-01

    The lunar hypothesis, that is, the notion that lunar phases can directly affect human behavior, was tested by time-series analysis of 4,575 crisis center telephone calls (all calls recorded for a 6-month interval). As expected, the lunar hypothesis was not supported. The 28-day lunar cycle accounted for less than 1% of the variance of the frequency of crisis center calls. Also, as hypothesized from an attribution theory framework, crisis center workers reported significantly greater belief in lunar effects than a non-crisis-center-worker comparison group.

  12. A critique of the hypothesis, and a defense of the question, as a framework for experimentation.

    PubMed

    Glass, David J

    2010-07-01

    Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goal of experimental science, which is verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework, the query/model approach, which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.

  13. Risk pathways among traumatic stress, posttraumatic stress disorder symptoms, and alcohol and drug problems: a test of four hypotheses.

    PubMed

    Haller, Moira; Chassin, Laurie

    2014-09-01

    The present study utilized longitudinal data from a community sample (n = 377; 166 trauma-exposed; 54% males; 73% non-Hispanic Caucasian; 22% Hispanic; 5% other ethnicity) to test whether pretrauma substance use problems increase risk for trauma exposure (high-risk hypothesis) or posttraumatic stress disorder (PTSD) symptoms (susceptibility hypothesis), whether PTSD symptoms increase risk for later alcohol/drug problems (self-medication hypothesis), and whether the association between PTSD symptoms and alcohol/drug problems is attributable to shared risk factors (shared vulnerability hypothesis). Logistic and negative binomial regressions were performed in a path analysis framework. Results provided the strongest support for the self-medication hypothesis, such that PTSD symptoms predicted higher levels of later alcohol and drug problems, over and above the influences of pretrauma family risk factors, pretrauma substance use problems, trauma exposure, and demographic variables. Results partially supported the high-risk hypothesis, such that adolescent substance use problems increased risk for assaultive violence exposure but did not influence overall risk for trauma exposure. There was no support for the susceptibility hypothesis. Finally, there was little support for the shared vulnerability hypothesis. Neither trauma exposure nor preexisting family adversity accounted for the link between PTSD symptoms and later substance use problems. Rather, PTSD symptoms mediated the effect of pretrauma family adversity on later alcohol and drug problems, thereby supporting the self-medication hypothesis. These findings make important contributions to better understanding the directions of influence among traumatic stress, PTSD symptoms, and substance use problems.

  14. MachineProse: an Ontological Framework for Scientific Assertions

    PubMed Central

    Dinakarpandian, Deendayal; Lee, Yugyung; Vishwanath, Kartik; Lingambhotla, Rohini

    2006-01-01

    Objective: The idea of testing a hypothesis is central to the practice of biomedical research. However, the results of testing a hypothesis are published mainly in the form of prose articles. Encoding the results as scientific assertions that are both human and machine readable would greatly enhance the synergistic growth and dissemination of knowledge. Design: We have developed MachineProse (MP), an ontological framework for the concise specification of scientific assertions. MP is based on the idea of an assertion constituting a fundamental unit of knowledge. This is in contrast to current approaches that use discrete concept terms from domain ontologies for annotation and assertions are only inferred heuristically. Measurements: We use illustrative examples to highlight the advantages of MP over the use of the Medical Subject Headings (MeSH) system and keywords in indexing scientific articles. Results: We show how MP makes it possible to carry out semantic annotation of publications that is machine readable and allows for precise search capabilities. In addition, when used by itself, MP serves as a knowledge repository for emerging discoveries. A prototype for proof of concept has been developed that demonstrates the feasibility and novel benefits of MP. As part of the MP framework, we have created an ontology of relationship types with about 100 terms optimized for the representation of scientific assertions. Conclusion: MachineProse is a novel semantic framework that we believe may be used to summarize research findings, annotate biomedical publications, and support sophisticated searches. PMID:16357355

  15. The Elaborated Environmental Stress Hypothesis as a Framework for Understanding the Association Between Motor Skills and Internalizing Problems: A Mini-Review

    PubMed Central

    Mancini, Vincent O.; Rigoli, Daniela; Cairney, John; Roberts, Lynne D.; Piek, Jan P.

    2016-01-01

    Poor motor skills have been shown to be associated with a range of psychosocial issues, including internalizing problems (anxiety and depression). While well-documented empirically, our understanding of why this relationship occurs remains theoretically underdeveloped. The Elaborated Environmental Stress Hypothesis by Cairney et al. (2013) provides a promising framework that seeks to explain the association between motor skills and internalizing problems, specifically in children with developmental coordination disorder (DCD). The framework posits that poor motor skills predispose the development of internalizing problems via interactions with intermediary environmental stressors. At the time the model was proposed, limited direct evidence was available to support or refute the framework. Several studies and developments related to the framework have since been published. This mini-review seeks to provide an up-to-date overview of recent developments related to the Elaborated Environmental Stress Hypothesis. We briefly discuss the past research that led to its development, before moving to studies that have investigated the framework since it was proposed. While originally developed within the context of DCD in childhood, recent developments have found support for the model in community samples. Through the reviewed literature, this article provides support for the Elaborated Environmental Stress Hypothesis as a promising theoretical framework that explains the psychosocial correlates across the broader spectrum of motor ability. However, given its recent conceptualization, ongoing evaluation of the Elaborated Environmental Stress Hypothesis is recommended. PMID:26941690

  16. RPD-based Hypothesis Reasoning for Cyber Situation Awareness

    NASA Astrophysics Data System (ADS)

    Yen, John; McNeese, Michael; Mullen, Tracy; Hall, David; Fan, Xiaocong; Liu, Peng

    Intelligence workers such as analysts, commanders, and soldiers often need a hypothesis reasoning framework to gain improved situation awareness of the highly dynamic cyber space. The development of such a framework requires the integration of interdisciplinary techniques, including supports for distributed cognition (human-in-the-loop hypothesis generation), supports for team collaboration (identification of information for hypothesis evaluation), and supports for resource-constrained information collection (hypotheses competing for information collection resources). We here describe a cognitively-inspired framework that is built upon Klein’s recognition-primed decision model and integrates the three components of Endsley’s situation awareness model. The framework naturally connects the logic world of tools for cyber situation awareness with the mental world of human analysts, enabling the perception, comprehension, and prediction of cyber situations for better prevention, survival, and response to cyber attacks by adapting missions at the operational, tactical, and strategic levels.

  17. Parents' Decision on Child Labour and School Attendance: Evidence from Iranian Households

    ERIC Educational Resources Information Center

    Keshavarz Haddad, GholamReza

    2017-01-01

    In the framework of a household's collective decision processes, this study presents a structural empirical model to test the hypothesis that child labour is compelled by household's poverty and parent's bargaining power against one another. To this end, a measure for mother's intra-household bargaining power is developed. I use Iranian…

  18. STAPP: Spatiotemporal analysis of plantar pressure measurements using statistical parametric mapping.

    PubMed

    Booth, Brian G; Keijsers, Noël L W; Sijbers, Jan; Huysmans, Toon

    2018-05-03

    Pedobarography produces large sets of plantar pressure samples that are routinely subsampled (e.g. using regions of interest) or aggregated (e.g. center of pressure trajectories, peak pressure images) in order to simplify statistical analysis and provide intuitive clinical measures. We hypothesize that these data reductions discard gait information that can be used to differentiate between groups or conditions. To test the hypothesis of null information loss, we created an implementation of statistical parametric mapping (SPM) for dynamic plantar pressure datasets (i.e. plantar pressure videos). Our SPM software framework brings all plantar pressure videos into anatomical and temporal correspondence, then performs statistical tests at each sampling location in space and time. As a novel step, we introduce non-linear temporal registration into the framework in order to normalize for timing differences within the stance phase. We refer to our software framework as STAPP: spatiotemporal analysis of plantar pressure measurements. Using STAPP, we tested our hypothesis on plantar pressure videos from 33 healthy subjects walking at different speeds. As walking speed increased, STAPP was able to identify significant decreases in plantar pressure at mid-stance from the heel through the lateral forefoot. The extent of these plantar pressure decreases has not previously been observed using existing plantar pressure analysis techniques. We therefore conclude that the subsampling of plantar pressure videos, a step which led to the discarding of gait information in our study, can be avoided using STAPP. Copyright © 2018 Elsevier B.V. All rights reserved.
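
    A mass-univariate core of the kind STAPP performs, sketched for a paired design: videos already in spatial and temporal correspondence, a t-test at every (x, y, t) sample, and Benjamini-Hochberg FDR standing in for SPM's random-field-theory thresholding. Array names and the correction choice are assumptions.

    ```python
    import numpy as np
    from scipy import stats

    def spm_paired_t(videos_a, videos_b, q=0.05):
        """videos_*: (subjects, x, y, t) registered pressure videos.
        Returns the t-map and a boolean significance map (BH-FDR)."""
        t, p = stats.ttest_rel(videos_a, videos_b, axis=0)
        flat = p.ravel()
        order = np.argsort(flat)
        m = flat.size
        passed = flat[order] <= q * np.arange(1, m + 1) / m
        k = passed.nonzero()[0].max() + 1 if passed.any() else 0
        sig = np.zeros(m, dtype=bool)
        sig[order[:k]] = True
        return t, sig.reshape(p.shape)
    ```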

  19. Learning from instructional explanations: effects of prompts based on the active-constructive-interactive framework.

    PubMed

    Roelle, Julian; Müller, Claudia; Roelle, Detlev; Berthold, Kirsten

    2015-01-01

    Although instructional explanations are commonly provided when learners are introduced to new content, they often fail because they are not integrated into effective learning activities. The recently introduced active-constructive-interactive framework posits an effectiveness hierarchy in which interactive learning activities are at the top; these are then followed by constructive and active learning activities, respectively. Against this background, we combined instructional explanations with different types of prompts that were designed to elicit these learning activities and tested the central predictions of the active-constructive-interactive framework. In Experiment 1, N = 83 students were randomly assigned to one of four combinations of instructional explanations and prompts. To test the active < constructive learning hypothesis, the learners received either (1) complete explanations and engaging prompts designed to elicit active activities or (2) explanations that were reduced by inferences and inference prompts designed to engage learners in constructing the withheld information. Furthermore, in order to explore how interactive learning activities can be elicited, we gave the learners who had difficulties in constructing the prompted inferences adapted remedial explanations with either (3) unspecific engaging prompts or (4) revision prompts. In support of the active < constructive learning hypothesis, we found that the learners who received reduced explanations and inference prompts outperformed the learners who received complete explanations and engaging prompts. Moreover, revision prompts were more effective in eliciting interactive learning activities than engaging prompts. In Experiment 2, N = 40 students were randomly assigned to either (1) a reduced explanations and inference prompts or (2) a reduced explanations and inference prompts plus adapted remedial explanations and revision prompts condition. In support of the constructive < interactive learning hypothesis, the learners who received adapted remedial explanations and revision prompts as add-ons to reduced explanations and inference prompts acquired more conceptual knowledge.

  1. Trajectory-Based Performance Assessment for Aviation Weather Information

    NASA Technical Reports Server (NTRS)

    Vigeant-Langlois, Laurence; Hansman, R. John, Jr.

    2003-01-01

    Based on an analysis of aviation decision-makers' time-related weather information needs, an abstraction of the aviation weather decision task was developed that involves 4-D intersection testing between aircraft trajectory hypertubes and hazardous weather hypervolumes. The framework builds on the hypothesis that hazardous meteorological fields can be simplified using discrete boundaries of surrogate threat attributes. The abstractions developed in the framework may be useful in studying how to improve the performance of weather forecasts from the trajectory-centric perspective, as well as for developing useful techniques for visualizing weather information.
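
    The 4-D intersection test can be sketched with axis-aligned bounding regions standing in for trajectory hypertubes and weather hypervolumes (a deliberate simplification; the paper's geometry and all names below are assumptions):

      from typing import NamedTuple

      class Box4D(NamedTuple):
          """Axis-aligned 4-D region: bounds in x, y, altitude, time."""
          lo: tuple  # (x, y, z, t) lower bounds
          hi: tuple  # (x, y, z, t) upper bounds

      def intersects(a: Box4D, b: Box4D) -> bool:
          """True if the two 4-D regions overlap in every dimension."""
          return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(4))

      # hypothetical trajectory segment vs. a forecast storm cell
      segment = Box4D(lo=(0, 0, 30000, 0), hi=(50, 5, 34000, 600))
      storm = Box4D(lo=(40, -10, 0, 300), hi=(90, 10, 45000, 1800))
      print(intersects(segment, storm))  # True: reroute, or re-test with finer geometry

    A real trajectory hypertube would be decomposed into many such segments, so the overall decision is an OR over per-segment tests.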

  2. Testing the Consolidated Framework for Implementation Research on health care innovations from South Yorkshire.

    PubMed

    Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky

    2013-10-01

    There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix, comprising the five domains and 39 constructs of the Framework, was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.

  3. Goal Orientation, Academic Achievement, and Depression: Evidence in Favor of a Revised Goal Theory Framework

    ERIC Educational Resources Information Center

    Sideridis, Georgios D.

    2005-01-01

    The objective of this investigation was to evaluate and expand the goal-orientation model of depression vulnerability proposed by B. M. Dykman (1998), which posits that a performance orientation creates a vulnerability to depression through repeated failure. This hypothesis was tested in 5 studies with students in Grades 5 and 6. A…

  4. Goals and Personal Resources that Contribute to the Development and Agency Attachment of Older Adult Volunteers

    ERIC Educational Resources Information Center

    Gillespie, Alayna A.; Gottlieb, Benjamin H.; Maitland, Scott B.

    2011-01-01

    We examined the volunteer service contribution of older adults (N = 100) to volunteer role development and agency attachment. Informed by a developmental regulation framework and socio-emotional selectivity theory, we tested a twofold hypothesis for the premise that greater role development and agency attachment would be experienced by (1) older…

  5. What Explains Gender Gaps in Maths Achievement in Primary Schools in Kenya?

    ERIC Educational Resources Information Center

    Ngware, Moses W.; Ciera, James; Abuya, Benta A.; Oketch, Moses; Mutisya, Maurice

    2012-01-01

    This paper aims to improve the understanding of classroom-based gender differences that may lead to differential opportunities to learn provided to girls and boys in low and high performing primary schools in Kenya. The paper uses an opportunity to learn framework and tests the hypothesis that teaching practices and classroom interactions explain…

  6. The Role of Perceived Threat in Reducing Health Knowledge Gaps.

    ERIC Educational Resources Information Center

    Yows, Suzanne R.

    A study tested the knowledge gap hypothesis, a promising framework for research in the field of mass communication devised by P. Tichenor, G. Donohue, and C. Olien in 1970. The study investigated the relative contribution of two types of factors--structural and motivational--in predicting the degree to which persons will attend to health messages,…

  7. Contributions of Academic Language, Perspective Taking, and Complex Reasoning to Deep Reading Comprehension

    ERIC Educational Resources Information Center

    LaRusso, Maria; Kim, Ha Yeon; Selman, Robert; Uccelli, Paola; Dawson, Theo; Jones, Stephanie; Donovan, Suzanne; Snow, Catherine

    2016-01-01

    "Deep reading comprehension" refers to the process required to succeed at tasks defined by the Common Core State Literacy Standards, as well as to achieve proficiency on the more challenging reading tasks in the Program for International Student Assessment (PISA) framework. The purpose of this study was to test the hypothesis that three…

  8. Bifurcation and Hysteresis Effects in Student Performance: The Signature of Complexity and Chaos in Educational Research

    ERIC Educational Resources Information Center

    Stamovlasis, Dimitrios

    2014-01-01

    This paper addresses some methodological issues concerning traditional linear approaches and shows the need for a paradigm shift in education research towards the Complexity and Nonlinear Dynamical Systems (NDS) framework. It presents a quantitative piece of research aiming to test the nonlinear dynamical hypothesis in education. It applies…

  9. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    PubMed

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
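
    Steps 3-6 of the framework lend themselves to a small numerical illustration. In the sketch below, the 0-4 scales come from the abstract, but the line-of-evidence names, weights, scores, and the quality-times-relevance aggregation rule are all assumptions made for illustration:

      # Hypothetical scores on the framework's 0-4 scales.
      # Each line of evidence (LoE) conjoins several studies; each study
      # gets a quality score and a relevance score (steps 3 and 4).
      loes = {
          "in_vitro_omics": {"weight": 1.0, "studies": [(3, 4), (2, 3)]},
          "in_vivo_apical": {"weight": 2.0, "studies": [(4, 4), (3, 2), (4, 3)]},
      }

      def loe_strength(studies):
          # Step 5: strength of one LoE as the mean of quality x relevance,
          # rescaled back to a 0-4 range (an assumed aggregation rule).
          return sum(q * r for q, r in studies) / (4 * len(studies))

      # Step 6: weighted integration across LoEs (weights from step 1).
      total_w = sum(l["weight"] for l in loes.values())
      overall = sum(l["weight"] * loe_strength(l["studies"]) for l in loes.values()) / total_w
      for name, l in loes.items():
          print(f"{name}: strength {loe_strength(l['studies']):.2f} / 4")
      print(f"overall strength of evidence: {overall:.2f} / 4")

    Step 7 (uncertainty characterization) would then qualify this single number rather than replace it.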

  10. Using Backward Design in Education Research: A Research Methods Essay

    PubMed Central

    Jensen, Jamie L.; Bailey, Elizabeth G.; Kummer, Tyler A.; Weber, K. Scott

    2017-01-01

    Education research within the STEM disciplines applies a scholarly approach to teaching and learning, with the intent of better understanding how people learn and of improving pedagogy at the undergraduate level. Most of the professionals practicing in this field have ‘crossed over’ from other disciplinary fields and thus have faced challenges in becoming experts in a new discipline. In this article, we offer a novel framework for approaching education research design called Backward Design in Education Research. It is patterned on backward curricular design and provides a three-step, systematic approach to designing education projects: 1) Define a research question that leads to a testable causal hypothesis based on a theoretical rationale; 2) Choose or design the assessment instruments to test the research hypothesis; and 3) Develop an experimental protocol that will be effective in testing the research hypothesis. This approach provides a systematic method to develop and carry out evidence-based research design. PMID:29854045

  11. Principles Underlying the Use of Multiple Informants’ Reports

    PubMed Central

    De Los Reyes, Andres; Thomas, Sarah A.; Goodman, Kimberly L.; Kundey, Shannon M.A.

    2014-01-01

    Researchers use multiple informants’ reports to assess and examine behavior. However, informants’ reports commonly disagree. Informants’ reports often disagree in their perceived levels of a behavior (“low” vs. “elevated” mood), and examining multiple reports in a single study often results in inconsistent findings. Although researchers often espouse taking a multi-informant assessment approach, they frequently address informant discrepancies using techniques that treat discrepancies as measurement error. Yet, recent work indicates that researchers in a variety of fields often may be unable to justify treating informant discrepancies as measurement error. In this paper, the authors advance a framework (Operations Triad Model) outlining general principles for using and interpreting informants’ reports. Using the framework, researchers can test whether or not they can extract meaningful information about behavior from discrepancies among multiple informants’ reports. The authors provide supportive evidence for this framework and discuss its implications for hypothesis testing, study design, and quantitative review. PMID:23140332

  12. Testing for Divergent Transmission Histories among Cultural Characters: A Study Using Bayesian Phylogenetic Methods and Iranian Tribal Textile Data

    PubMed Central

    Matthews, Luke J.; Tehrani, Jamie J.; Jordan, Fiona M.; Collard, Mark; Nunn, Charles L.

    2011-01-01

    Background Archaeologists and anthropologists have long recognized that different cultural complexes may have distinct descent histories, but they have lacked analytical techniques capable of easily identifying such incongruence. Here, we show how Bayesian phylogenetic analysis can be used to identify incongruent cultural histories. We employ the approach to investigate Iranian tribal textile traditions. Methods We used Bayes factor comparisons in a phylogenetic framework to test two models of cultural evolution: the hierarchically integrated system hypothesis and the multiple coherent units hypothesis. In the hierarchically integrated system hypothesis, a core tradition of characters evolves through descent with modification and characters peripheral to the core are exchanged among contemporaneous populations. In the multiple coherent units hypothesis, a core tradition does not exist. Rather, there are several cultural units consisting of sets of characters that have different histories of descent. Results For the Iranian textiles, the Bayesian phylogenetic analyses supported the multiple coherent units hypothesis over the hierarchically integrated system hypothesis. Our analyses suggest that pile-weave designs represent a distinct cultural unit that has a different phylogenetic history compared to other textile characters. Conclusions The results from the Iranian textiles are consistent with the available ethnographic evidence, which suggests that the commercial rug market has influenced pile-rug designs but not the techniques or designs incorporated in the other textiles produced by the tribes. We anticipate that Bayesian phylogenetic tests for inferring cultural units will be of great value for researchers interested in studying the evolution of cultural traits including language, behavior, and material culture. PMID:21559083
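
    The hypothesis comparison at the core of this design reduces to a Bayes factor: the ratio of the marginal likelihoods of the data under the two models of cultural evolution. A minimal sketch follows; the log marginal likelihood values are placeholders, and in practice they would come from a phylogenetic MCMC sampler:

      import math

      def bayes_factor(log_ml_h1: float, log_ml_h0: float) -> float:
          """Bayes factor of H1 over H0 from log marginal likelihoods."""
          return math.exp(log_ml_h1 - log_ml_h0)

      # placeholder estimates for the two cultural-evolution models
      log_ml_multiple_units = -1234.5  # multiple coherent units hypothesis
      log_ml_integrated = -1241.2      # hierarchically integrated system hypothesis

      bf = bayes_factor(log_ml_multiple_units, log_ml_integrated)
      # 2*ln(BF) > 10 is conventionally "very strong" evidence (Kass & Raftery 1995)
      print(f"BF = {bf:.1f}, 2 ln BF = {2 * (log_ml_multiple_units - log_ml_integrated):.1f}")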

  13. The relationship between energy consumption and economic growth in Malaysia: ARDL bound test approach

    NASA Astrophysics Data System (ADS)

    Razali, Radzuan; Khan, Habib; Shafie, Afza; Hassan, Abdul Rahman

    2016-11-01

    The objective of this paper is to examine the short-run and long-run dynamic causal relationship between energy consumption and income per capita in both a bivariate and a multivariate framework over the period 1971-2014 in the case of Malaysia [1]. The study applies the ARDL bounds testing procedure for long-run co-integration and the Granger causality test to investigate the causal link between the variables. The ARDL bounds test confirms the existence of a long-run co-integration relationship between the variables. The causality tests support a feedback hypothesis between income per capita and energy consumption over the period in the case of Malaysia.
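
    A minimal sketch of the causality step using statsmodels, with simulated stand-ins for the Malaysian series (the data, lag order, and dependence structure below are assumptions, not the paper's inputs):

      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      # simulated stand-ins for annual energy consumption and income per capita
      rng = np.random.default_rng(1)
      n = 44  # 1971-2014
      income = np.cumsum(rng.normal(0.5, 1.0, n))
      energy = 0.8 * np.roll(income, 1) + rng.normal(0, 0.5, n)  # lagged dependence

      # does column 2 (income) Granger-cause column 1 (energy)?
      res = grangercausalitytests(np.column_stack([energy, income]), maxlag=2)
      f_stat, p_value, _, _ = res[1][0]["ssr_ftest"]
      print(f"lag 1: F = {f_stat:.2f}, p = {p_value:.4f}")

    A feedback conclusion requires significance in both directions, so the same call is repeated with the columns swapped; in a full replication the ARDL bounds test for co-integration would precede this step.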

  14. The Different Role of Working Memory in Open-Ended versus Closed-Ended Creative Problem Solving: A Dual-Process Theory Account

    ERIC Educational Resources Information Center

    Lin, Wei-Lun; Lien, Yunn-Wen

    2013-01-01

    This study examined how working memory plays different roles in open-ended versus closed-ended creative problem-solving processes, as represented by divergent thinking tests and insight problem-solving tasks. With respect to the analysis of different task demands and the framework of dual-process theories, the hypothesis was that the idea…

  15. Affective Feedback from Computers and its Effect on Perceived Ability and Affect: A Test of the Computers as Social Actor Hypothesis

    ERIC Educational Resources Information Center

    Mishra, Punya

    2006-01-01

    We report an experimental study that examined two questions: (a) The effect of affective feedback from computers on participants' motivation and self-perception of ability; and (b) whether people respond similarly to computer feedback as they do to human feedback. This study, framed within the Computers As Social Actors (CASA) framework,…

  16. Embracing the comparative approach: how robust phylogenies and broader developmental sampling impacts the understanding of nervous system evolution.

    PubMed

    Hejnol, Andreas; Lowe, Christopher J

    2015-12-19

    Molecular biology has provided a rich dataset to develop hypotheses of nervous system evolution. The startling patterning similarities between distantly related animals during the development of their central nervous system (CNS) have resulted in the hypothesis that a CNS with a single centralized medullary cord and a partitioned brain is homologous across bilaterians. However, the ability to precisely reconstruct ancestral neural architectures from molecular genetic information requires that these gene networks specifically map with particular neural anatomies. A growing body of literature representing the development of a wider range of metazoan neural architectures demonstrates that patterning gene network complexity is maintained in animals with more modest levels of neural complexity. Furthermore, a robust phylogenetic framework that provides the basis for testing the congruence of these homology hypotheses has been lacking since the advent of the field of 'evo-devo'. Recent progress in molecular phylogenetics is refining the necessary framework to test previous homology statements that span large evolutionary distances. In this review, we describe recent advances in animal phylogeny and exemplify for two neural characters-the partitioned brain of arthropods and the ventral centralized nerve cords of annelids-a test for congruence using this framework. The sequential sister taxa at the base of Ecdysozoa and Spiralia comprise small, interstitial groups. This topology is not consistent with the hypothesis of homology of the tripartite brains of arthropods and vertebrates, nor with homology of the ventral arthropod nerve cord and the rope-ladder-like nervous system of annelids. There can be exquisite conservation of gene regulatory networks between distantly related groups with contrasting levels of nervous system centralization and complexity. Consequently, the utility of molecular characters to reconstruct ancestral neural organization in deep time is limited. © 2015 The Authors.

  18. The PMHT: solutions for some of its problems

    NASA Astrophysics Data System (ADS)

    Wieneke, Monika; Koch, Wolfgang

    2007-09-01

    Tracking multiple targets in a cluttered environment is a challenging task. Probabilistic Multiple Hypothesis Tracking (PMHT) is an efficient approach for dealing with it. Essentially, PMHT is based on the Expectation-Maximization method for handling association conflicts. Linearity in the number of targets and measurements is the main motivation for a further development and extension of this methodology. Unfortunately, compared with the Probabilistic Data Association Filter (PDAF), PMHT has not yet shown its superiority in terms of track-lost statistics. Furthermore, the problem of track extraction and deletion is apparently not yet satisfactorily solved within this framework. Four properties of PMHT are responsible for its problems in track maintenance: Non-Adaptivity, Hospitality, Narcissism and Local Maxima [1, 2]. In this work we present a solution for each of them and derive an improved PMHT by integrating the solutions into the PMHT formalism. The new PMHT is evaluated by Monte-Carlo simulations. A sequential Likelihood-Ratio (LR) test for track extraction has been developed and already integrated into the framework of traditional Bayesian Multiple Hypothesis Tracking [3]. As a multi-scan approach, the PMHT methodology also has the potential for track extraction. In this paper an analogous integration of a sequential LR test into the PMHT framework is proposed. We present an LR formula for track extraction and deletion using the PMHT update formulae. As PMHT provides all required ingredients for a sequential LR calculation, the LR is thus a by-product of the PMHT iteration process. Therefore the resulting update formula for the sequential LR test affords the development of Track-Before-Detect algorithms for PMHT. The approach is illustrated by a simple example.
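
    The sequential LR test follows Wald's classic sequential scheme: accumulate a log-likelihood ratio scan by scan and compare the sum with two thresholds. In the sketch below the thresholds follow Wald's approximations, while the per-scan increments are made-up numbers; in the paper they fall out of the PMHT update formulae:

      import math

      def sprt_thresholds(alpha=0.01, beta=0.01):
          """Wald's acceptance/rejection thresholds for a sequential test."""
          upper = math.log((1 - beta) / alpha)  # confirm (extract) track
          lower = math.log(beta / (1 - alpha))  # delete track
          return lower, upper

      def sequential_lr(increments, alpha=0.01, beta=0.01):
          """Accumulate per-scan log-LR increments until a decision is reached."""
          lower, upper = sprt_thresholds(alpha, beta)
          llr = 0.0
          for k, inc in enumerate(increments, start=1):
              llr += inc
              if llr >= upper:
                  return "extract", k, llr
              if llr <= lower:
                  return "delete", k, llr
          return "pending", len(increments), llr

      # hypothetical per-scan increments from a tentative track
      print(sequential_lr([0.9, 1.2, 0.4, 1.5, 1.1]))  # ('extract', 5, ...)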

  19. Qa-1/HLA-E-restricted regulatory CD8+ T cells and self-nonself discrimination: an essay on peripheral T-cell regulation.

    PubMed

    Jiang, Hong; Chess, Leonard

    2008-11-01

    By discriminating self from nonself and controlling the magnitude and class of immune responses, the immune system mounts effective immunity against virtually any foreign antigens but avoids harmful immune responses to self. These are two equally important and related but distinct processes, which function in concert to ensure an optimal function of the immune system. Immunologically relevant clinical problems often occur because of failure of either process, especially the former. Currently, there is no unified conceptual framework to characterize the precise relationship between thymic negative selection and peripheral immune regulation, which is the basis for understanding self-nonself discrimination versus control of magnitude and class of immune responses. In this article, we explore a novel hypothesis of how the immune system discriminates self from nonself in the periphery during adaptive immunity. This hypothesis permits rational analysis of various seemingly unrelated biomedical problems inherent in immunologic disorders that cannot be uniformly interpreted by any currently existing paradigms. The proposed hypothesis is based on a unified conceptual framework of the "avidity model of peripheral T-cell regulation" that we originally proposed and tested, in both basic and clinical immunology, to understand how the immune system achieves self-nonself discrimination in the periphery.

  20. Attention and Conscious Perception in the Hypothesis Testing Brain

    PubMed Central

    Hohwy, Jakob

    2012-01-01

    Conscious perception and attention are difficult to study, partly because their relation to each other is not fully understood. Rather than conceiving and studying them in isolation from each other it may be useful to locate them in an independently motivated, general framework, from which a principled account of how they relate can then emerge. Accordingly, these mental phenomena are here reviewed through the prism of the increasingly influential predictive coding framework. On this framework, conscious perception can be seen as the upshot of prediction error minimization and attention as the optimization of precision expectations during such perceptual inference. This approach maps on well to a range of standard characteristics of conscious perception and attention, and can be used to interpret a range of empirical findings on their relation to each other. PMID:22485102

  1. Hologenomics: Systems-Level Host Biology.

    PubMed

    Theis, Kevin R

    2018-01-01

    The hologenome concept of evolution is a hypothesis explaining host evolution in the context of the host microbiomes. As a hypothesis, it needs to be evaluated, especially with respect to the extent of fidelity of transgenerational coassociation of host and microbial lineages and the relative fitness consequences of repeated associations within natural holobiont populations. Behavioral ecologists are in a prime position to test these predictions because they typically focus on animal phenotypes that are quantifiable, conduct studies over multiple generations within natural animal populations, and collect metadata on genetic relatedness and relative reproductive success within these populations. Regardless of the conclusion on the hologenome concept as an evolutionary hypothesis, a hologenomic perspective has applied value as a systems-level framework for host biology, including in medicine. Specifically, it emphasizes investigating the multivarious and dynamic interactions between patient genomes and the genomes of their diverse microbiota when attempting to elucidate etiologies of complex, noninfectious diseases.

  2. Evolution of Deep Brain Stimulation: Human Electrometer and Smart Devices Supporting the Next Generation of Therapy

    PubMed Central

    Lee, Kendall H.; Blaha, Charles D.; Garris, Paul A.; Mohseni, Pedram; Horne, April E.; Bennet, Kevin E.; Agnesi, Filippo; Bledsoe, Jonathan M.; Lester, Deranda B.; Kimble, Chris; Min, Hoon-Ki; Kim, Young-Bo; Cho, Zang-Hee

    2010-01-01

    Deep Brain Stimulation (DBS) provides therapeutic benefit for several neuropathologies including Parkinson’s disease (PD), epilepsy, chronic pain, and depression. Despite well established clinical efficacy, the mechanism(s) of DBS remains poorly understood. In this review we begin by summarizing the current understanding of the DBS mechanism. Using this knowledge as a framework, we then explore a specific hypothesis regarding DBS of the subthalamic nucleus (STN) for the treatment of PD. This hypothesis states that therapeutic benefit is provided, at least in part, by activation of surviving nigrostriatal dopaminergic neurons, subsequent striatal dopamine release, and resumption of striatal target cell control by dopamine. While highly controversial, we present preliminary data that are consistent with specific predictions testing this hypothesis. We additionally propose that developing new technologies, e.g., human electrometer and closed-loop smart devices, for monitoring dopaminergic neurotransmission during STN DBS will further advance this treatment approach. PMID:20657744

  3. Emotions and Decisions: Beyond Conceptual Vagueness and the Rationality Muddle.

    PubMed

    Volz, Kirsten G; Hertwig, Ralph

    2016-01-01

    For centuries, decision scholars paid little attention to emotions: Decisions were modeled in normative and descriptive frameworks with little regard for affective processes. Recently, however, an "emotions revolution" has taken place, particularly in the neuroscientific study of decision making, putting emotional processes on an equal footing with cognitive ones. Yet disappointingly little theoretical progress has been made. The concepts and processes discussed often remain vague, and conclusions about the implications of emotions for rationality are contradictory and muddled. We discuss three complementary ways to move the neuroscientific study of emotion and decision making from agenda setting to theory building. The first is to use reverse inference as a hypothesis-discovery rather than a hypothesis-testing tool, unless its utility can be systematically quantified (e.g., through meta-analysis). The second is to capitalize on the conceptual inventory advanced by the behavioral science of emotions, testing those concepts and unveiling the underlying processes. The third is to model the interplay between emotions and decisions, harnessing existing cognitive frameworks of decision making and mapping emotions onto the postulated computational processes. To conclude, emotions (like cognitive strategies) are not rational or irrational per se: How (un)reasonable their influence is depends on their fit with the environment. © The Author(s) 2015.

  4. Hypothesis testing in functional linear regression models with Neyman's truncation and wavelet thresholding for longitudinal data.

    PubMed

    Yang, Xiaowei; Nie, Kun

    2008-03-15

    Longitudinal data sets in biomedical research often consist of large numbers of repeated measures. In many cases, the trajectories do not look globally linear or polynomial, making it difficult to summarize the data or test hypotheses using standard longitudinal data analysis based on various linear models. An alternative is to apply the approaches of functional data analysis, which directly target the continuous nonlinear curves underlying discretely sampled repeated measures. For the purposes of data exploration, many functional data analysis strategies have been developed based on various schemes of smoothing, but fewer options are available for making causal inferences regarding predictor-outcome relationships, a common task seen in hypothesis-driven medical studies. To compare groups of curves, two testing strategies with good power have been proposed for high-dimensional analysis of variance: the Fourier-based adaptive Neyman test and the wavelet-based thresholding test. Using a smoking cessation clinical trial data set, this paper demonstrates how to extend the strategies for hypothesis testing into the framework of functional linear regression models (FLRMs) with continuous functional responses and categorical or continuous scalar predictors. The analysis procedure consists of three steps: first, apply the Fourier or wavelet transform to the original repeated measures; then fit a multivariate linear model in the transformed domain; and finally, test the regression coefficients using either adaptive Neyman or thresholding statistics. Since an FLRM can be viewed as a natural extension of the traditional multiple linear regression model, the development of this model and computational tools should enhance the capacity of medical statistics for longitudinal data.
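
    The three-step procedure maps naturally onto a few lines of code. The sketch below is a simplified stand-in, not the paper's implementation: it Fourier-transforms two hypothetical groups of curves, forms standardized coefficient differences (using magnitudes of the complex coefficients, a simplification), and computes Fan's adaptive Neyman statistic T = max_m sum_{j<=m} (z_j^2 - 1) / sqrt(2m):

      import numpy as np

      def adaptive_neyman(z):
          """Adaptive Neyman statistic for standardized coefficients z."""
          partial = np.cumsum(z**2 - 1)
          m = np.arange(1, len(z) + 1)
          return np.max(partial / np.sqrt(2 * m))

      rng = np.random.default_rng(2)
      # two hypothetical groups of curves, sampled at 64 time points
      a = rng.normal(size=(20, 64))
      b = rng.normal(size=(20, 64)) + 0.3
      # step 1: Fourier transform each curve; keep low-frequency coefficients
      fa = np.fft.rfft(a, axis=1)[:, :16]
      fb = np.fft.rfft(b, axis=1)[:, :16]
      # steps 2-3: standardized mean difference per coefficient, then test
      diff = fa.mean(axis=0) - fb.mean(axis=0)
      se = np.sqrt(fa.var(axis=0, ddof=1) / 20 + fb.var(axis=0, ddof=1) / 20)
      z = np.abs(diff) / se
      print(f"adaptive Neyman statistic: {adaptive_neyman(z):.2f}")

    In the regression setting of the paper, the per-coefficient contrast would come from fitted FLRM coefficients rather than a simple group difference, and significance would be calibrated against the statistic's null distribution.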

  5. [Hypothesis on the equilibrium point and variability of amplitude, speed and time of single-joint movement].

    PubMed

    Latash, M; Gottlieb, G

    1990-01-01

    Problems of single-joint movement variability are analysed in the framework of the equilibrium-point hypothesis (the lambda-model). Control of the movements is described with three parameters related to movement amplitude, speed, and time. Three strategies emerge from this description. Only one of them is likely to lead to a Fitts' type speed-accuracy trade-off. Experiments were performed to test one of the predictions of the model. Subjects performed identical sets of single-joint fast movements with open or closed eyes and somewhat different instructions. Movements performed with closed eyes were characterized by higher peak speeds and unchanged variability, in seeming violation of Fitts' law and in good correspondence with the model.

  6. Phylogeny and niche conservatism in North and Central American triatomine bugs (Hemiptera: Reduviidae: Triatominae), vectors of Chagas' disease.

    PubMed

    Ibarra-Cerdeña, Carlos N; Zaldívar-Riverón, Alejandro; Peterson, A Townsend; Sánchez-Cordero, Víctor; Ramsey, Janine M

    2014-10-01

    The niche conservatism hypothesis states that related species diverge in niche characteristics at lower rates than expected, given their lineage divergence. Here we analyze whether niche conservatism is a common pattern among vector species (Hemiptera: Reduviidae: Triatominae) of Trypanosoma cruzi that inhabit North and Central America, a highly heterogeneous landmass in terms of environmental gradients. Mitochondrial and nuclear loci were used in a multi-locus phylogenetic framework to reconstruct phylogenetic relationships among species and estimate time of divergence of selected clades to draw biogeographic inferences. Then, we estimated similarity between the ecological niche of sister species and tested the niche conservatism hypothesis using our best estimate of phylogeny. Triatoma is not monophyletic. A primary clade with all North and Central American (NCA) triatomine species from the genera Triatoma, Dipetalogaster, and Panstrongylus, was consistently recovered. Nearctic species within the NCA clade (T. p. protracta, T. r. rubida) diverged during the Pliocene, whereas the Neotropical species (T. phyllosoma, T. longipennis, T. dimidiata complex) are estimated to have diverged more recently, during the Pleistocene. The hypothesis of niche conservatism could not be rejected for any of six sister species pairs. Niche similarity between sister species best fits a retention model. While this framework is used here to infer niche evolution, it has a direct impact on spatial vector dynamics driven by human population movements, expansion of transportation networks and climate change scenarios.

  7. A pilot study for the analysis of dream reports using Maslow's need categories: an extension to the emotional selection hypothesis.

    PubMed

    Coutts, Richard

    2010-10-01

    The emotional selection hypothesis describes a cyclical process that uses dreams to modify and test select mental schemas. An extension is proposed that further characterizes these schemas as facilitators of human need satisfaction. A pilot study was conducted in which this hypothesis was tested by assigning 100 dream reports (10 randomly selected from 10 dream logs at an online web site) to one or more categories within Maslow's hierarchy of needs. A "match" was declared when at least two of three judges agreed both for category and for whether the identified need was satisfied or thwarted in the dream narrative. The interjudge reliability of the judged needs was good (92% of the reports contained at least one match). The number of needs judged as thwarted did not differ significantly from the number judged as satisfied (48 vs. 52%, respectively). The six "higher" needs (belongingness, esteem, cognitive, aesthetic, self-actualization, and transcendence) were scored significantly more frequently (81%) than were the two lowest or "basic" needs (physiological and safety, 19%). Basic needs were also more likely to be judged as thwarted, while higher needs were more likely to be judged as satisfied. These findings are discussed in the context of Maslow's hierarchy of needs as a framework for investigating theories of dream function, including the emotional selection hypothesis and other contemporary dream theories.

  8. Why are some dimensions integral? Testing two hypotheses through causal learning experiments.

    PubMed

    Soto, Fabián A; Quintana, Gonzalo R; Pérez-Acosta, Andrés M; Ponce, Fernando P; Vogel, Edgar H

    2015-10-01

    Compound generalization and dimensional generalization are traditionally studied independently by different groups of researchers, who have proposed separate theories to explain results from each area. A recent extension of Shepard's rational theory of dimensional generalization allows an explanation of data from both areas within a single framework. However, the conceptualization of dimensional integrality in this theory (the direction hypothesis) is different from that favored by Shepard in his original theory (the correlation hypothesis). Here, we report two experiments that test differential predictions of these two notions of integrality. Each experiment takes a design from compound generalization and translates it into a design for dimensional generalization by replacing discrete stimulus components with dimensional values. Experiment 1 showed that an effect analogous to summation is found in dimensional generalization with separable dimensions, but the opposite effect is found with integral dimensions. Experiment 2 showed that the analogue of a biconditional discrimination is solved faster when stimuli vary in integral dimensions than when stimuli vary in separable dimensions. These results, which are analogous to more "non-linear" processing with integral than with separable dimensions, were predicted by the direction hypothesis, but not by the correlation hypothesis. This confirms the assumptions of the unified rational theory of stimulus generalization and reveals interesting links between compound and dimensional generalization phenomena. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Dopamine prediction errors in reward learning and addiction: from theory to neural circuitry

    PubMed Central

    Keiflin, Ronald; Janak, Patricia H.

    2015-01-01

    Summary Midbrain dopamine (DA) neurons are proposed to signal reward prediction error (RPE), a fundamental parameter in associative learning models. This RPE hypothesis provides a compelling theoretical framework for understanding DA function in reward learning and addiction. New studies support a causal role for DA-mediated RPE activity in promoting learning about natural reward; however, this question has not been explicitly tested in the context of drug addiction. In this review, we integrate theoretical models with experimental findings on the activity of DA systems, and on the causal role of specific neuronal projections and cell types, to provide a circuit-based framework for probing DA-RPE function in addiction. By examining error-encoding DA neurons in the neural network in which they are embedded, hypotheses regarding circuit-level adaptations that possibly contribute to pathological error-signaling and addiction can be formulated and tested. PMID:26494275

  10. The science of research: the principles underlying the discovery of cognitive and other biological mechanisms.

    PubMed

    Silva, Alcino J

    2007-01-01

    Studies of cognitive function include a wide spectrum of disciplines, with very diverse theoretical and practical frameworks. For example, in Behavioral Neuroscience cognitive mechanisms are mostly inferred from loss of function (lesion) experiments while in Cognitive Neuroscience these mechanisms are commonly deduced from brain activation patterns. Although neuroscientists acknowledge the limitations of deriving conclusions using a limited scope of approaches, there are no systematically studied, objective and explicit criteria for what is required to test a given hypothesis of cognitive function. This problem plagues every discipline in science: scientific research lacks objective, systematic studies that validate the principles underlying even its most elemental practices. For example, scientists decide what experiments are best suited to test key ideas in their field, which hypotheses have sufficient supporting evidence and which require further investigation, which studies are important and which are not, based on intuitions derived from experience, implicit principles learned from mentors and colleagues, traditions in their fields, etc. Philosophers have made numerous attempts to articulate and frame the principles that guide research and innovation, but these speculative ideas have remained untested and have had a minimal impact on the work of scientists. Here, I propose the development of methods for systematically and objectively studying and improving the modus operandi of research and development. This effort (the science of scientific research or S2) will benefit all aspects of science, from education of young scientists to research, publishing and funding, since it will provide explicit and systematically tested frameworks for practices in science. To illustrate its goals, I will introduce a hypothesis (the Convergent Four) derived from experimental practices common in molecular and cellular biology. This S2 hypothesis proposes that there are at least four fundamentally distinct strategies that scientists can use to test the connection between two phenomena of interest (A and B), and that to establish a compelling connection between A and B it is crucial to develop independently confirmed lines of convergent evidence in each of these four categories. The four categories include negative alteration (decrease probability of A or p(A) and determine p(B)), positive alteration (increase p(A) and determine p(B)), non-intervention (examine whether A precedes B) and integration (develop ideas about how to get from A to B and integrate those ideas with other available information about A and B). I will discuss both strategies to test this hypothesis and its implications for studies of cognitive function.

  11. 5-SPICE: the application of an original framework for community health worker program design, quality improvement and research agenda setting

    PubMed Central

    Palazuelos, Daniel; DaEun Im, Dana; Peckarsky, Matthew; Schwarz, Dan; Farmer, Didi Bertrand; Dhillon, Ranu; Johnson, Ari; Orihuela, Claudia; Hackett, Jill; Bazile, Junior; Berman, Leslie; Ballard, Madeleine; Panjabi, Raj; Ternier, Ralph; Slavin, Sam; Lee, Scott; Selinsky, Steve; Mitnick, Carole Diane

    2013-01-01

    Introduction Despite decades of experience with community health workers (CHWs) in a wide variety of global health projects, there is no established conceptual framework that structures how implementers and researchers can understand, study and improve their respective programs based on lessons learned by other CHW programs. Objective To apply an original, non-linear framework and case study method, 5-SPICE, to multiple sister projects of a large, international non-governmental organization (NGO), and other CHW projects. Design Engaging a large group of implementers, researchers and the best available literature, the 5-SPICE framework was refined and then applied to a selection of CHW programs. Insights gleaned from the case study method were summarized in a tabular format named the ‘5×5-SPICE chart’. This format graphically lists the ways in which essential CHW program elements interact, both positively and negatively, in the implementation field. Results The 5×5-SPICE charts reveal a variety of insights that come from a more complex understanding of how essential CHW projects interact and influence each other in their unique context. Some have been well described in the literature previously, while others are exclusive to this article. An analysis of how best to compensate CHWs is also offered as an example of the type of insights that this method may yield. Conclusions The 5-SPICE framework is a novel instrument that can be used to guide discussions about CHW projects. Insights from this process can help guide quality improvement efforts, or be used as hypotheses that will form the basis of a program's research agenda. Recent experience with research protocols embedded into successfully implemented projects demonstrates how such hypotheses can be rigorously tested. PMID:23561023

  12. Multivariate cross-frequency coupling via generalized eigendecomposition

    PubMed Central

    Cohen, Michael X

    2017-01-01

    This paper presents a new framework for analyzing cross-frequency coupling in multichannel electrophysiological recordings. The generalized eigendecomposition-based cross-frequency coupling framework (gedCFC) is inspired by source-separation algorithms combined with dynamics of mesoscopic neurophysiological processes. It is unaffected by factors that confound traditional CFC methods—such as non-stationarities, non-sinusoidality, and non-uniform phase angle distributions—attractive properties considering that brain activity is neither stationary nor perfectly sinusoidal. The gedCFC framework opens new opportunities for conceptualizing CFC as network interactions with diverse spatial/topographical distributions. Five specific methods within the gedCFC framework are detailed; these are validated in simulated data and applied to several empirical datasets. gedCFC accurately recovers physiologically plausible CFC patterns embedded in noise that causes traditional CFC methods to perform poorly. The paper also demonstrates that spike-field coherence in multichannel local field potential data can be analyzed using the gedCFC framework, which provides significant advantages over traditional spike-field coherence analyses. Null-hypothesis testing is also discussed. DOI: http://dx.doi.org/10.7554/eLife.21792.001 PMID:28117662
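
    The core computation is a generalized eigendecomposition of two channel covariance matrices: S, from data filtered around a frequency of interest, and R, from broadband data. A minimal sketch of that single step on simulated data follows; the band edges, filter order, and one-component readout are illustrative assumptions, and the paper develops five distinct methods around this operation:

      import numpy as np
      from scipy.linalg import eigh
      from scipy.signal import butter, filtfilt

      rng = np.random.default_rng(3)
      n_ch, n_t, fs = 16, 10_000, 1000
      data = rng.normal(size=(n_ch, n_t))
      # embed a 10 Hz source across channels so there is something to find
      t = np.arange(n_t) / fs
      mix = rng.normal(size=n_ch)
      data += np.outer(mix, np.sin(2 * np.pi * 10 * t))

      # S: covariance of narrowband-filtered data; R: broadband covariance
      b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
      narrow = filtfilt(b, a, data, axis=1)
      S, R = np.cov(narrow), np.cov(data)

      # generalized eigendecomposition: maximize narrowband vs broadband power
      evals, evecs = eigh(S, R)
      w = evecs[:, -1]      # spatial filter with the largest power ratio
      component = w @ data  # component time series for further CFC analysis
      print(evals[-1], component.shape)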

  13. Effects of Cognitive Load on Driving Performance: The Cognitive Control Hypothesis.

    PubMed

    Engström, Johan; Markkula, Gustav; Victor, Trent; Merat, Natasha

    2017-08-01

    The objective of this paper was to outline an explanatory framework for understanding effects of cognitive load on driving performance and to review the existing experimental literature in the light of this framework. Although there is general consensus that taking the eyes off the forward roadway significantly impairs most aspects of driving, the effects of primarily cognitively loading tasks on driving performance are not well understood. Based on existing models of driver attention, an explanatory framework was outlined. This framework can be summarized in terms of the cognitive control hypothesis: Cognitive load selectively impairs driving subtasks that rely on cognitive control but leaves automatic performance unaffected. An extensive literature review was conducted wherein existing results were reinterpreted based on the proposed framework. It was demonstrated that the general pattern of experimental results reported in the literature aligns well with the cognitive control hypothesis and that several apparent discrepancies between studies can be reconciled based on the proposed framework. More specifically, performance on nonpracticed or inherently variable tasks, relying on cognitive control, is consistently impaired by cognitive load, whereas the performance on automatized (well-practiced and consistently mapped) tasks is unaffected and sometimes even improved. Effects of cognitive load on driving are strongly selective and task dependent. The present results have important implications for the generalization of results obtained from experimental studies to real-world driving. The proposed framework can also serve to guide future research on the potential causal role of cognitive load in real-world crashes.

  14. Controlling uncertainty: a review of human behavior in complex dynamic environments.

    PubMed

    Osman, Magda

    2010-01-01

    Complex dynamic control (CDC) tasks are a type of problem-solving environment used for examining many cognitive activities (e.g., attention, control, decision making, hypothesis testing, implicit learning, memory, monitoring, planning, and problem solving). Because of their popularity, there have been many findings from diverse domains of research (economics, engineering, ergonomics, human-computer interaction, management, psychology), but they remain largely disconnected from each other. The objective of this article is to review theoretical developments and empirical work on CDC tasks, and to introduce a novel framework (monitoring and control framework) as a tool for integrating theory and findings. The main thesis of the monitoring and control framework is that CDC tasks are characteristically uncertain environments, and subjective judgments of uncertainty guide the way in which monitoring and control behaviors attempt to reduce it. The article concludes by discussing new insights into continuing debates and future directions for research on CDC tasks.

  16. Stepwise and stagewise approaches for spatial cluster detection.

    PubMed

    Xu, Jiale; Gangnon, Ronald E

    2016-05-01

    Spatial cluster detection is an important tool in many areas such as sociology, botany and public health. Previous work has mostly taken either a hypothesis testing framework or a Bayesian framework. In this paper, we propose a few approaches under a frequentist variable selection framework for spatial cluster detection. The forward stepwise methods search for multiple clusters by iteratively adding the currently most likely cluster while adjusting for the effects of previously identified clusters. The stagewise methods also consist of a series of steps, but with a tiny step size in each iteration. We study the features and performances of our proposed methods using simulations on idealized grids or real geographic areas. From the simulations, we compare the performance of the proposed methods in terms of estimation accuracy and power. These methods are applied to the well-known New York leukemia data as well as Indiana poverty data. Copyright © 2016 Elsevier Ltd. All rights reserved.
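
    The forward stepwise idea can be sketched with a Poisson scan statistic: score every candidate zone with a likelihood ratio, add the best zone, adjust the expected counts, and rescan. Everything below (the data, the window-shaped candidate zones, and the adjustment rule) is a hypothetical stand-in used only to fix ideas:

      import numpy as np

      def zone_llr(cases, expected, zone):
          """Poisson log-likelihood ratio for one candidate zone (Kulldorff form)."""
          c, e = cases[zone].sum(), expected[zone].sum()
          C, E = cases.sum(), expected.sum()
          if c / e <= (C - c) / (E - e):
              return 0.0  # only elevated-rate zones count as clusters
          return c * np.log(c / e) + (C - c) * np.log((C - c) / (E - e))

      def forward_stepwise(cases, expected, zones, n_clusters=2):
          """Iteratively pick the most likely zone, then adjust expectations."""
          found = []
          expected = expected.astype(float).copy()
          for _ in range(n_clusters):
              scores = [zone_llr(cases, expected, z) for z in zones]
              best = int(np.argmax(scores))
              found.append((best, scores[best]))
              # adjust: inflate expected counts in the found zone to its observed rate
              z = zones[best]
              expected[z] *= cases[z].sum() / expected[z].sum()
          return found

      # hypothetical 30-region study area with one planted cluster
      rng = np.random.default_rng(4)
      expected = rng.uniform(5, 15, 30)
      cases = rng.poisson(expected)
      cases[10:14] = rng.poisson(2.5 * expected[10:14])
      zones = [np.arange(i, min(i + 4, 30)) for i in range(30)]  # crude "windows"
      print(forward_stepwise(cases, expected, zones))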

  17. A new method for sperm characterization for infertility treatment: hypothesis testing by using combination of watershed segmentation and graph theory.

    PubMed

    Shojaedini, Seyed Vahab; Heydari, Masoud

    2014-10-01

    Shape and movement features of sperm are important parameters for infertility study and treatment. In this article, a new method is introduced for characterizing sperm in microscopic videos. In this method, first a hypothesis framework is defined to distinguish sperm from other particles in the captured video. A decision about each hypothesis is then made in the following steps: selecting primary regions as sperm candidates by watershed-based segmentation, pruning false candidates across successive frames using graph-theoretic concepts, and finally confirming correct sperm using their movement trajectories. Performance of the proposed method is evaluated on real captured images of semen with a high density of sperm. The obtained results show the proposed method may detect 97% of sperm in the presence of 5% false detections and track 91% of moving sperm. Furthermore, this better characterization of sperm does not lead to extracting more false sperm compared with some existing approaches.
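
    The first step of the pipeline, selecting candidate regions by watershed segmentation, can be sketched with scikit-image. The synthetic frame, thresholds, and the plausible-area filter below are illustrative assumptions, not the authors' parameters:

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.segmentation import watershed

      # synthetic microscope frame: two bright blobs on a dark background
      frame = np.zeros((64, 64))
      frame[20, 20] = frame[40, 45] = 1.0
      frame = ndi.gaussian_filter(frame, sigma=3)

      mask = frame > 0.5 * frame.max()             # foreground particles
      distance = ndi.distance_transform_edt(mask)  # peaks mark particle centers
      markers, _ = ndi.label(distance > 0.7 * distance.max())
      labels = watershed(-distance, markers, mask=mask)

      # keep only regions whose area is plausible for a sperm head (assumed range)
      sizes = ndi.sum(mask, labels, index=range(1, labels.max() + 1))
      candidates = [i + 1 for i, s in enumerate(sizes) if 5 <= s <= 200]
      print(f"{len(candidates)} candidate region(s)")  # expect 2 for this frame

    The subsequent pruning step would link these candidate regions across frames into a graph and discard nodes whose trajectories are inconsistent with sperm motion.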

  18. Sensor potency of the moonlighting enzyme-decorated cytoskeleton: the cytoskeleton as a metabolic sensor

    PubMed Central

    2013-01-01

    Background There is extensive evidence for the interaction of metabolic enzymes with the eukaryotic cytoskeleton. The significance of these interactions is far from clear. Presentation of the hypothesis In the cytoskeletal integrative sensor hypothesis presented here, the cytoskeleton senses and integrates the general metabolic activity of the cell. This activity depends on the binding to the cytoskeleton of enzymes and, depending on the nature of the enzyme, this binding may occur if the enzyme is either active or inactive but not both. This enzyme-binding is further proposed to stabilize microtubules and microfilaments and to alter rates of GTP and ATP hydrolysis and their levels. Testing the hypothesis Evidence consistent with the cytoskeletal integrative sensor hypothesis is presented in the case of glycolysis. Several testable predictions are made. There should be a relationship between post-translational modifications of tubulin and of actin and their interaction with metabolic enzymes. Different conditions of cytoskeletal dynamics and enzyme-cytoskeleton binding should reveal significant differences in local and perhaps global levels and ratios of ATP and GTP. The different functions of moonlighting enzymes should depend on cytoskeletal binding. Implications of the hypothesis The physical and chemical effects arising from metabolic sensing by the cytoskeleton would have major consequences on cell shape, dynamics and cell cycle progression. The hypothesis provides a framework that helps the significance of the enzyme-decorated cytoskeleton be determined. PMID:23398642

  19. A novel examination of atypical major depressive disorder based on attachment theory.

    PubMed

    Levitan, Robert D; Atkinson, Leslie; Pedersen, Rebecca; Buis, Tom; Kennedy, Sidney H; Chopra, Kevin; Leung, Eman M; Segal, Zindel V

    2009-06-01

    While a large body of descriptive work has thoroughly investigated the clinical correlates of atypical depression, little is known about its fundamental origins. This study examined atypical depression from an attachment theory framework. Our hypothesis was that, compared to adults with melancholic depression, those with atypical depression would report more anxious-ambivalent attachment and less secure attachment. As gender has been an important consideration in prior work on atypical depression, this same hypothesis was further tested in female subjects only. One hundred ninety-nine consecutive adults presenting to a tertiary mood disorders clinic with major depressive disorder with either atypical or melancholic features according to the Structured Clinical Interview for DSM-IV Axis-I Disorders were administered a self-report adult attachment questionnaire to assess the core dimensions of secure, anxious-ambivalent, and avoidant attachment. Attachment scores were compared across the 2 depressed groups defined by atypical and melancholic features using multivariate analysis of variance. The study was conducted between 1999 and 2004. When men and women were considered together, the multivariate test comparing attachment scores by depressive group was statistically significant at p < .05. Between-subjects testing indicated that atypical depression was associated with significantly lower secure attachment scores, with a trend toward higher anxious-ambivalent attachment scores, than was melancholia. When women were analyzed separately, the multivariate test was statistically significant at p < .01, with both secure and anxious-ambivalent attachment scores differing significantly across depressive groups. These preliminary findings suggest that attachment theory, and insecure and anxious-ambivalent attachment in particular, may be a useful framework from which to study the origins, clinical correlates, and treatment of atypical depression. Gender may be an important consideration when considering atypical depression from an attachment perspective. Copyright 2009 Physicians Postgraduate Press, Inc.

  20. Novel statistical framework to identify differentially expressed genes allowing transcriptomic background differences.

    PubMed

    Ling, Zhi-Qiang; Wang, Yi; Mukaisho, Kenichi; Hattori, Takanori; Tatsuta, Takeshi; Ge, Ming-Hua; Jin, Li; Mao, Wei-Min; Sugihara, Hiroyuki

    2010-06-01

    Tests of differentially expressed genes (DEGs) from microarray experiments are based on the null hypothesis that genes that are irrelevant to the phenotype/stimulus are expressed equally in the target and control samples. However, this strict hypothesis is not always true, as there can be several transcriptomic background differences between target and control samples, including different cell/tissue types, different cell cycle stages and different biological donors. These differences lead to increased false positives, which have little biological/medical significance. In this article, we propose a statistical framework to identify DEGs between target and control samples from expression microarray data allowing transcriptomic background differences between these samples by introducing a modified null hypothesis that the gene expression background difference is normally distributed. We use an iterative procedure to perform robust estimation of the null hypothesis and identify DEGs as outliers. We evaluated our method using our own triplicate microarray experiment, followed by validations with reverse transcription-polymerase chain reaction (RT-PCR) and on the MicroArray Quality Control dataset. The evaluations suggest that our technique (i) results in fewer false positive and false negative results, as measured by the degree of agreement with RT-PCR of the same samples, (ii) can be applied to different microarray platforms and results in better reproducibility as measured by the degree of DEG identification concordance both intra- and inter-platform and (iii) can be applied efficiently with only a few microarray replicates. Based on these evaluations, we propose that this method not only identifies more reliable and biologically/medically significant DEGs, but also reduces the power-cost tradeoff problem in the microarray field. Source code and binaries freely available for download at http://comonca.org.cn/fdca/resources/softwares/deg.zip.
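
    As a rough illustration of the outlier-based idea, the sketch below iteratively fits a normal "background difference" null to gene-wise log-ratios and flags genes falling outside it as candidate DEGs. This is a minimal sketch assuming a simple trimming scheme and z-score cutoff; the function name and all parameters are illustrative, not the authors' published estimator.

    ```python
    import numpy as np

    def call_degs(log_ratios, z_cut=3.0, tol=1e-6, max_iter=100):
        """Iteratively re-estimate the normal null (transcriptomic background
        difference) after trimming current outliers; call outliers as DEGs."""
        d = np.asarray(log_ratios, dtype=float)
        mu, sigma = d.mean(), d.std(ddof=1)
        for _ in range(max_iter):
            keep = np.abs(d - mu) <= z_cut * sigma       # trim current outliers
            mu_new, sigma_new = d[keep].mean(), d[keep].std(ddof=1)
            if abs(mu_new - mu) < tol and abs(sigma_new - sigma) < tol:
                break
            mu, sigma = mu_new, sigma_new
        z = (d - mu) / sigma
        return np.abs(z) > z_cut, z                      # DEG mask, z-scores
    ```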

  1. The role of effort in influencing the effect of anxiety on performance: testing the conflicting predictions of processing efficiency theory and the conscious processing hypothesis.

    PubMed

    Wilson, Mark; Smith, Nickolas C; Holmes, Paul S

    2007-08-01

    The aim of this study was to test the conflicting predictions of processing efficiency theory (PET) and the conscious processing hypothesis (CPH) regarding effort's role in influencing the effects of anxiety on a golf putting task. Mid-handicap golfers made a series of putts to target holes under two counterbalanced conditions designed to manipulate the level of anxiety experienced. The effort exerted on each putting task was assessed through self-report, psychophysiological (heart rate variability) and behavioural (pre-putt time and glances at the target) measures. Performance was assessed by putting error. Results were generally more supportive of the predictions of PET than of the CPH, as performance was maintained for some performers despite increased state anxiety and a reduction in processing efficiency. The findings of this study support previous research suggesting that both theories offer useful theoretical frameworks for examining the relationship between anxiety and performance in sport.

  2. The effect of urbanization and industrialization on carbon emissions in Turkey: evidence from ARDL bounds testing procedure.

    PubMed

    Pata, Ugur Korkut

    2018-03-01

    This paper examines the dynamic short- and long-term relationship between per capita GDP, per capita energy consumption, financial development, urbanization, industrialization, and per capita carbon dioxide (CO2) emissions within the framework of the environmental Kuznets curve (EKC) hypothesis for Turkey covering the period from 1974 to 2013. According to the results of the autoregressive distributed lag bounds testing approach, an increase in per capita GDP, per capita energy consumption, financial development, urbanization, and industrialization has a positive effect on per capita CO2 emissions in the long term, and the variables other than urbanization also increase per capita CO2 emissions in the short term. In addition, the findings support the validity of the EKC hypothesis for Turkey in the short and long term. However, the turning points obtained from long-term regressions lie outside the sample period. Therefore, as per capita GDP increases in Turkey, per capita CO2 emissions continue to increase.
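
    For orientation, the EKC turning point quoted in studies of this kind follows from a quadratic-in-logs specification, ln(CO2) = b0 + b1*ln(GDP) + b2*ln(GDP)^2, which peaks at GDP* = exp(-b1/(2*b2)). The sketch below shows only this arithmetic on fabricated data with a static OLS fit; the paper's long-run coefficients come from an ARDL bounds-testing model, which this simplification does not reproduce.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(8.0, 10.5, 40)                     # hypothetical ln(GDP per capita)
    y = -20.0 + 4.5 * x - 0.22 * x**2 + rng.normal(0, 0.05, x.size)  # hypothetical ln(CO2)

    # Quadratic-in-logs EKC: ln(CO2) = b0 + b1*ln(GDP) + b2*ln(GDP)^2
    b2, b1, b0 = np.polyfit(x, y, 2)

    # An inverted U requires b1 > 0 and b2 < 0; emissions peak where the slope is zero.
    turning_point = np.exp(-b1 / (2.0 * b2))
    print(f"estimated turning-point GDP per capita: {turning_point:,.0f}")
    ```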

  3. New developments in the evolution and application of the WHO/IPCS framework on mode of action/species concordance analysis.

    PubMed

    Meek, M E; Boobis, A; Cote, I; Dellarco, V; Fotakis, G; Munn, S; Seed, J; Vickers, C

    2014-01-01

    The World Health Organization/International Programme on Chemical Safety mode of action/human relevance framework has been updated to reflect the experience acquired in its application and extend its utility to emerging areas in toxicity testing and non-testing methods. The underlying principles have not changed, but the framework's scope has been extended to enable integration of information at different levels of biological organization and reflect evolving experience in a much broader range of potential applications. Mode of action/species concordance analysis can also inform hypothesis-based data generation and research priorities in support of risk assessment. The modified framework is incorporated within a roadmap, with feedback loops encouraging continuous refinement of fit-for-purpose testing strategies and risk assessment. Important in this construct is consideration of dose-response relationships and species concordance analysis in weight of evidence. The modified Bradford Hill considerations have been updated and additionally articulated to reflect increasing experience in application for cases where the toxicological outcome of chemical exposure is known. The modified framework can be used as originally intended, where the toxicological effects of chemical exposure are known, or in hypothesizing effects resulting from chemical exposure, using information on putative key events in established modes of action from appropriate in vitro or in silico systems and other lines of evidence. This modified mode of action framework and accompanying roadmap and case examples are expected to contribute to improving transparency in explicitly addressing weight of evidence considerations in mode of action/species concordance analysis based on both conventional data sources and evolving methods. Copyright © 2013 John Wiley & Sons, Ltd. The World Health Organization retains copyright and all other rights in the manuscript of this article as submitted for publication.

  4. Angular spectral framework to test full corrections of paraxial solutions.

    PubMed

    Mahillo-Isla, R; González-Morales, M J

    2015-07-01

    Different correction methods for paraxial solutions have been used when such solutions extend out of the paraxial regime. The authors have used correction methods guided by either their experience or some educated hypothesis pertinent to the particular problem that they were tackling. This article provides a framework so as to classify full wave correction schemes. Thus, for a given solution of the paraxial wave equation, we can select the best correction scheme of those available. Some common correction methods are considered and evaluated under the proposed scope. Another remarkable contribution is obtained by giving the necessary conditions that two solutions of the Helmholtz equation must accomplish to accept a common solution of the parabolic wave equation as a paraxial approximation of both solutions.

  5. The relative importance of reproductive assurance and automatic selection as hypotheses for the evolution of self-fertilization

    PubMed Central

    Busch, Jeremiah W.; Delph, Lynda F.

    2012-01-01

    Background The field of plant mating-system evolution has long been interested in understanding why selfing evolves from outcrossing. Many possible mechanisms drive this evolutionary trend, but most research has focused upon the transmission advantage of selfing and its ability to provide reproductive assurance when cross-pollination is uncertain. We discuss the shared conceptual framework of these ideas and their empirical support that is emerging from tests of their predictions over the last 25 years. Scope These two hypotheses are derived from the same strategic framework. The transmission advantage hypothesis involves purely gene-level selection, with reproductive assurance involving an added component of individual-level selection. Support for both of these ideas has been garnered from population-genetic tests of their predictions. Studies in natural populations often show that selfing increases seed production, but it is not clear if this benefit is sufficient to favour the evolution of selfing, and the ecological agents limiting outcross pollen are often not identified. Pollen discounting appears to be highly variable and important in systems where selfing involves multiple floral adaptations, yet seed discounting has rarely been investigated. Although reproductive assurance appears likely as a leading factor facilitating the evolution of selfing, studies must account for both seed and pollen discounting to adequately test this hypothesis. Conclusions The transmission advantage and reproductive assurance ideas describe components of gene transmission that favour selfing. Future work should move beyond their dichotomous presentation and focus upon understanding whether selection through pollen, seed or both explains the spread of selfing-rate modifiers in plant populations. PMID:21937484

  6. The relative importance of reproductive assurance and automatic selection as hypotheses for the evolution of self-fertilization.

    PubMed

    Busch, Jeremiah W; Delph, Lynda F

    2012-02-01

    The field of plant mating-system evolution has long been interested in understanding why selfing evolves from outcrossing. Many possible mechanisms drive this evolutionary trend, but most research has focused upon the transmission advantage of selfing and its ability to provide reproductive assurance when cross-pollination is uncertain. We discuss the shared conceptual framework of these ideas and their empirical support that is emerging from tests of their predictions over the last 25 years. These two hypotheses are derived from the same strategic framework. The transmission advantage hypothesis involves purely gene-level selection, with reproductive assurance involving an added component of individual-level selection. Support for both of these ideas has been garnered from population-genetic tests of their predictions. Studies in natural populations often show that selfing increases seed production, but it is not clear if this benefit is sufficient to favour the evolution of selfing, and the ecological agents limiting outcross pollen are often not identified. Pollen discounting appears to be highly variable and important in systems where selfing involves multiple floral adaptations, yet seed discounting has rarely been investigated. Although reproductive assurance appears likely as a leading factor facilitating the evolution of selfing, studies must account for both seed and pollen discounting to adequately test this hypothesis. The transmission advantage and reproductive assurance ideas describe components of gene transmission that favour selfing. Future work should move beyond their dichotomous presentation and focus upon understanding whether selection through pollen, seed or both explains the spread of selfing-rate modifiers in plant populations.

  7. Two-condition within-participant statistical mediation analysis: A path-analytic framework.

    PubMed

    Montoya, Amanda K; Hayes, Andrew F

    2017-03-01

    Researchers interested in testing mediation often use designs where participants are measured on a dependent variable Y and a mediator M in both of 2 different circumstances. The dominant approach to assessing mediation in such a design, proposed by Judd, Kenny, and McClelland (2001), relies on a series of hypothesis tests about components of the mediation model and is not based on an estimate of or formal inference about the indirect effect. In this article we recast Judd et al.'s approach in the path-analytic framework that is now commonly used in between-participant mediation analysis. By so doing, it is apparent how to estimate the indirect effect of a within-participant manipulation on some outcome through a mediator as the product of paths of influence. This path-analytic approach eliminates the need for discrete hypothesis tests about components of the model to support a claim of mediation, as Judd et al.'s method requires, because it relies only on an inference about the product of paths, the indirect effect. We generalize methods of inference for the indirect effect widely used in between-participant designs to this within-participant version of mediation analysis, including bootstrap confidence intervals and Monte Carlo confidence intervals. Using this path-analytic approach, we extend the method to models with multiple mediators operating in parallel and serially and discuss the comparison of indirect effects in these more complex models. We offer macros and code for SPSS, SAS, and Mplus that conduct these analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
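
    A minimal sketch of the path-analytic computation, assuming per-participant NumPy arrays for the mediator and outcome in each of the two conditions: the indirect effect is the product a*b, with a the mean within-person change in M and b the slope of the Y-change on the mean-centered M-change, and inference comes from a percentile bootstrap over participants. The full Montoya-Hayes model also enters the centered average of M as a covariate, which this sketch omits for brevity.

    ```python
    import numpy as np

    def indirect_effect(m1, m2, y1, y2):
        # a = mean within-person change in the mediator (path X -> M)
        # b = slope of the Y-change on the centered M-change (path M -> Y)
        mdiff, ydiff = m2 - m1, y2 - y1
        a = mdiff.mean()
        b = np.polyfit(mdiff - mdiff.mean(), ydiff, 1)[0]
        return a * b

    def bootstrap_ci(m1, m2, y1, y2, n_boot=5000, alpha=0.05, seed=1):
        # Percentile bootstrap confidence interval for the indirect effect a*b.
        rng = np.random.default_rng(seed)
        n = len(m1)
        est = np.empty(n_boot)
        for i in range(n_boot):
            idx = rng.integers(0, n, n)        # resample participants with replacement
            est[i] = indirect_effect(m1[idx], m2[idx], y1[idx], y2[idx])
        return np.quantile(est, [alpha / 2, 1 - alpha / 2])
    ```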

  8. The Epistemology of a Rule-Based Expert System: A Framework for Explanation.

    DTIC Science & Technology

    1982-01-01

    [Abstract not recoverable: the record consists of figure fragments linking hypotheses (e.g., "e.coli is causing meningitis", "cryptococcus is causing meningitis") to rules (Rule543, Rule535) via "concluded by" and "more general" relations, with a note that recalling the e.coli hypothesis before the cryptococcus hypothesis is strategic.]

  9. Is there something quantum-like about the human mental lexicon?

    PubMed Central

    Bruza, Peter; Kitto, Kirsty; Nelson, Douglas; McEvoy, Cathy

    2010-01-01

    Following an early claim by Nelson & McEvoy (35) suggesting that word associations can display ‘spooky action at a distance behaviour’, a serious investigation of the potentially quantum nature of such associations is currently underway. In this paper quantum theory is proposed as a framework suitable for modelling the human mental lexicon, specifically the results obtained from both intralist and extralist word association experiments. Some initial models exploring this hypothesis are discussed, and experiments capable of testing these models proposed. PMID:20224806

  10. Temporal, spatial and ecological dynamics of speciation among amphi-Beringian small mammals

    USGS Publications Warehouse

    Hope, Andrew G.; Takebayashi, Naoki; Galbreath, Kurt E.; Talbot, Sandra L.; Cook, Joseph A.

    2013-01-01

    Quaternary climate cycles played an important role in promoting diversification across the Northern Hemisphere, although details of the mechanisms driving evolutionary change are still poorly resolved. In a comparative phylogeographical framework, we investigate temporal, spatial and ecological components of evolution within a suite of Holarctic small mammals. We test a hypothesis of simultaneous divergence among multiple taxon pairs, investigating time to coalescence and demographic change for each taxon in response to a combination of climate and geography.

  11. A framework for investigating geographical variation in diseases, based on a study of Legionnaires' disease.

    PubMed

    Bhopal, R S

    1991-11-01

    Demonstration of geographical variations in disease can yield powerful insight into the disease pathway, particularly for environmentally acquired conditions, but only if the many problems of data interpretation can be solved. This paper presents the framework, methods and principles guiding a study of the geographical epidemiology of Legionnaires' Disease in Scotland. A case-list was constructed and disease incidence rates were calculated by geographical area; these showed variation. Five categories of explanation for the variation were identified: short-term fluctuations of incidence in time masquerading as differences by place; artefact; and differences in host-susceptibility, agent virulence, or environment. The methods used to study these explanations, excepting agent virulence, are described, with an emphasis on the use of previously existing data to test hypotheses. Examples include the use of mortality, census and hospital morbidity data to assess the artefact and host-susceptibility explanations; and the use of ratios of serology tests to disease to examine the differential testing hypothesis. The reasoning and process by which the environmental focus of the study was narrowed and the technique for relating the geographical pattern of disease to the putative source are outlined. This framework allows the researcher to plan for the parallel collection of the data necessary both to demonstrate geographical variation and to point to the likely explanation.

  12. [Dilemma of the null hypothesis in experimental tests of ecological hypotheses].

    PubMed

    Li, Ji

    2016-06-01

    Experimental testing is one of the major ways of testing ecological hypotheses, although the role of the null hypothesis in such tests remains much debated. Quinn and Dunham (1983) analyzed the hypothesis-deduction model of Platt (1964) and concluded that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and the non-decisive character of Neyman-Pearson (N-P) testing both prevent a statistical null hypothesis from being strictly tested. Moreover, because the null hypothesis H0 (α=1, β=0) and the alternative hypothesis H1′ (α′=1, β′=0) in ecological processes differ from those of classical physics, the ecological null hypothesis likewise cannot be strictly tested experimentally. These dilemmas of the null hypothesis can be relieved via reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and two-tailed testing. However, statistical null hypothesis significance testing (NHST) should not be equated with the logical test of causality in an ecological hypothesis. Hence, findings and conclusions from methodological studies and experimental tests based on NHST are not always logically reliable.

  13. Phylogeny and Niche Conservatism in North and Central American Triatomine Bugs (Hemiptera: Reduviidae: Triatominae), Vectors of Chagas' Disease

    PubMed Central

    Ibarra-Cerdeña, Carlos N.; Zaldívar-Riverón, Alejandro; Peterson, A. Townsend; Sánchez-Cordero, Víctor; Ramsey, Janine M.

    2014-01-01

    The niche conservatism hypothesis states that related species diverge in niche characteristics at lower rates than expected, given their lineage divergence. Here we analyze whether niche conservatism is a common pattern among vector species (Hemiptera: Reduviidae: Triatominae) of Trypanosoma cruzi that inhabit North and Central America, a highly heterogeneous landmass in terms of environmental gradients. Mitochondrial and nuclear loci were used in a multi-locus phylogenetic framework to reconstruct phylogenetic relationships among species and estimate time of divergence of selected clades to draw biogeographic inferences. Then, we estimated similarity between the ecological niche of sister species and tested the niche conservatism hypothesis using our best estimate of phylogeny. Triatoma is not monophyletic. A primary clade with all North and Central American (NCA) triatomine species from the genera Triatoma, Dipetalogaster, and Panstrongylus, was consistently recovered. Nearctic species within the NCA clade (T. p. protracta, T. r. rubida) diverged during the Pliocene, whereas the Neotropical species (T. phyllosoma, T. longipennis, T. dimidiata complex) are estimated to have diverged more recently, during the Pleistocene. The hypothesis of niche conservatism could not be rejected for any of six sister species pairs. Niche similarity between sister species best fits a retention model. While this framework is used here to infer niche evolution, it has a direct impact on spatial vector dynamics driven by human population movements, expansion of transportation networks and climate change scenarios. PMID:25356550

  14. Oxytocin tempers calculated greed but not impulsive defense in predator–prey contests

    PubMed Central

    De Dreu, Carsten K. W.; Scholte, H. Steven; van Winden, Frans A. A. M.; Ridderinkhof, K. Richard

    2015-01-01

    Human cooperation and competition is modulated by oxytocin, a hypothalamic neuropeptide that functions as both hormone and neurotransmitter. Oxytocin’s functions can be captured in two explanatory yet largely contradictory frameworks: the fear-dampening (FD) hypothesis that oxytocin has anxiolytic effects and reduces fear-motivated action; and the social approach/avoidance (SAA) hypothesis that oxytocin increases cooperative approach and facilitates protection against aversive stimuli and threat. We tested derivations from both frameworks in a novel predator–prey contest game. Healthy males given oxytocin or placebo invested as predator to win their prey’s endowment, or as prey to protect their endowment against predation. Neural activity was registered using 3T-MRI. In prey, (fear-motivated) investments were fast and conditioned on the amygdala. Inconsistent with FD, oxytocin did not modulate neural and behavioral responding in prey. In predators, (greed-motivated) investments were slower, and conditioned on the superior frontal gyrus (SFG). Consistent with SAA, oxytocin reduced predator investment, time to decide and activation in SFG. Thus, whereas oxytocin does not incapacitate the impulsive ability to protect and defend oneself, it lowers the greedy and more calculated appetite for coming out ahead. PMID:25140047

  15. Bayesian enhancement two-stage design for single-arm phase II clinical trials with binary and time-to-event endpoints.

    PubMed

    Shi, Haolun; Yin, Guosheng

    2018-02-21

    Simon's two-stage design is one of the most commonly used methods in phase II clinical trials with binary endpoints. The design tests the null hypothesis that the response rate is less than an uninteresting level, versus the alternative hypothesis that the response rate is greater than a desirable target level. From a Bayesian perspective, we compute the posterior probabilities of the null and alternative hypotheses given that a promising result is declared in Simon's design. Our study reveals that because the frequentist hypothesis testing framework places its focus on the null hypothesis, a potentially efficacious treatment identified by rejecting the null under Simon's design could have less than 10% posterior probability of attaining the desirable target level. Due to the indifference region between the null and alternative, rejecting the null does not necessarily mean that the drug achieves the desirable response level. To clarify such ambiguity, we propose a Bayesian enhancement two-stage (BET) design, which guarantees a high posterior probability of the response rate reaching the target level, while allowing for early termination and sample size saving in case the drug's response rate is smaller than the clinically uninteresting level. Moreover, the BET design can be naturally adapted to accommodate survival endpoints. We conduct extensive simulation studies to examine the empirical performance of our design and present two trial examples as applications. © 2018, The International Biometric Society.
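
    The Bayesian re-evaluation at the heart of the paper can be illustrated with a conjugate Beta posterior. The numbers below (12 responses in 43 patients, target p1 = 0.40, a uniform prior) are hypothetical, and the naive posterior shown ignores the sequential stopping rule, so it is an approximation for illustration only.

    ```python
    from scipy.stats import beta

    # Hypothetical final result of a "promising" Simon trial: 12 responses in
    # 43 patients; uninteresting rate p0 = 0.20, desirable target p1 = 0.40.
    x, n, p1 = 12, 43, 0.40
    a0, b0 = 1, 1                  # uniform Beta(1, 1) prior on the response rate

    # Naive conjugate update (ignores the sequential stopping rule).
    posterior = beta(a0 + x, b0 + n - x)
    print("P(response rate > p1 | data) =", 1 - posterior.cdf(p1))
    ```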

  16. Questioning the social intelligence hypothesis.

    PubMed

    Holekamp, Kay E

    2007-02-01

    The social intelligence hypothesis posits that complex cognition and enlarged "executive brains" evolved in response to challenges that are associated with social complexity. This hypothesis has been well supported, but some recent data are inconsistent with its predictions. It is becoming increasingly clear that multiple selective agents, and non-selective constraints, must have acted to shape cognitive abilities in humans and other animals. The task now is to develop a larger theoretical framework that takes into account both inter-specific differences and similarities in cognition. This new framework should facilitate consideration of how selection pressures that are associated with sociality interact with those that are imposed by non-social forms of environmental complexity, and how both types of functional demands interact with phylogenetic and developmental constraints.

  17. HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.

    PubMed

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculation and simulation show better performance of rOP compared to classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
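
    A minimal sketch of the statistic: under the joint null, K independent p-values are uniform, so their r-th order statistic follows Beta(r, K - r + 1), and the Beta CDF evaluated at the observed rOP serves as the meta-analysis p-value. The function name and the genes-by-studies matrix layout are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.stats import beta

    def rop_pvalues(pmat, r):
        """pmat: genes x studies matrix of per-study p-values.
        Returns one meta p-value per gene based on the r-th smallest p-value,
        which under K independent uniform nulls follows Beta(r, K - r + 1)."""
        k = pmat.shape[1]
        rop = np.sort(pmat, axis=1)[:, r - 1]    # r-th order statistic per gene
        return beta.cdf(rop, r, k - r + 1)
    ```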

  18. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    PubMed

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
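
    To make the 0D-versus-1D distinction concrete, the non-parametric sketch below builds a simultaneous confidence band for a mean trajectory by bootstrapping the maximum standardized deviation over the whole 1D domain (a rough analogue of the RFT-corrected band); using a pointwise 0D quantile in its place would give the biased interval the paper warns about. The data layout and function name are assumptions.

    ```python
    import numpy as np

    def simultaneous_band(trajectories, n_boot=2000, alpha=0.05, seed=0):
        """trajectories: subjects x time-points array. Returns a simultaneous
        (1D) confidence band for the mean trajectory via a bootstrap of the
        maximum standardized deviation across the whole domain."""
        rng = np.random.default_rng(seed)
        n, _ = trajectories.shape
        mean = trajectories.mean(axis=0)
        se = trajectories.std(axis=0, ddof=1) / np.sqrt(n)
        max_dev = np.empty(n_boot)
        for i in range(n_boot):
            sample = trajectories[rng.integers(0, n, n)]   # resample subjects
            m = sample.mean(axis=0)
            s = sample.std(axis=0, ddof=1) / np.sqrt(n)
            max_dev[i] = np.max(np.abs((m - mean) / s))    # field-wide maximum
        z_star = np.quantile(max_dev, 1 - alpha)
        return mean - z_star * se, mean + z_star * se
    ```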

  19. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    PubMed

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independency assumption is incorrect and can result in unreliable conclusions because of the under-estimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, the bootstrap confidence interval methods, and the bootstrap hypothesis testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independency assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independency assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables and the analysis results again confirm the conclusions obtained from the simulation studies.

  20. Automatic Ability Attribution after Failure: A Dual Process View of Achievement Attribution

    PubMed Central

    Sakaki, Michiko; Murayama, Kou

    2013-01-01

    Causal attribution has been one of the most influential frameworks in the literature of achievement motivation, but previous studies considered achievement attribution as relatively deliberate and effortful processes. In the current study, we tested the hypothesis that people automatically attribute their achievement failure to their ability, but reduce the ability attribution in a controlled manner. To address this hypothesis, we measured participants’ causal attribution belief for their task failure either under the cognitive load (load condition) or with full attention (no-load condition). Across two studies, participants attributed task performance to their ability more in the load than in the no-load condition. The increased ability attribution under cognitive load further affected intrinsic motivation. These results indicate that cognitive resources available after feedback play crucial roles in determining causal attribution belief, as well as achievement motivations. PMID:23667576

  1. Extended target recognition in cognitive radar networks.

    PubMed

    Wei, Yimin; Meng, Huadong; Liu, Yimin; Wang, Xiqin

    2010-01-01

    We address the problem of adaptive waveform design for extended target recognition in cognitive radar networks. A closed-loop active target recognition radar system is extended to the case of a centralized cognitive radar network, in which a generalized likelihood ratio (GLR) based sequential hypothesis testing (SHT) framework is employed. Using Doppler velocities measured by multiple radars, the target aspect angle for each radar is calculated. The joint probability of each target hypothesis is then updated using observations from different radar lines of sight (LOS). Based on these probabilities, a minimum correlation algorithm is proposed to adaptively design the transmit waveform for each radar in an amplitude fluctuation situation. Simulation results demonstrate performance improvements due to the cognitive radar network and adaptive waveform design. Our minimum correlation algorithm outperforms the eigen-waveform solution and other non-cognitive waveform design approaches.
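
    The sequential-testing backbone of such a system can be sketched as Wald's SPRT over accumulating per-observation log-likelihoods; the GLR framework in the paper additionally replaces unknown parameters with maximum-likelihood estimates, which this simplified sketch omits.

    ```python
    import numpy as np

    def sprt(log_lik_h1, log_lik_h0, alpha=0.05, beta=0.05):
        """Wald's sequential probability ratio test over paired streams of
        per-observation log-likelihoods under H1 and H0."""
        upper = np.log((1 - beta) / alpha)     # cross upward -> decide H1
        lower = np.log(beta / (1 - alpha))     # cross downward -> decide H0
        llr, n_used = 0.0, 0
        for l1, l0 in zip(log_lik_h1, log_lik_h0):
            n_used += 1
            llr += l1 - l0                     # accumulate the log-likelihood ratio
            if llr >= upper:
                return "H1", n_used
            if llr <= lower:
                return "H0", n_used
        return "undecided", n_used
    ```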

  2. Automatic ability attribution after failure: a dual process view of achievement attribution.

    PubMed

    Sakaki, Michiko; Murayama, Kou

    2013-01-01

    Causal attribution has been one of the most influential frameworks in the literature of achievement motivation, but previous studies considered achievement attribution as relatively deliberate and effortful processes. In the current study, we tested the hypothesis that people automatically attribute their achievement failure to their ability, but reduce the ability attribution in a controlled manner. To address this hypothesis, we measured participants' causal attribution belief for their task failure either under the cognitive load (load condition) or with full attention (no-load condition). Across two studies, participants attributed task performance to their ability more in the load than in the no-load condition. The increased ability attribution under cognitive load further affected intrinsic motivation. These results indicate that cognitive resources available after feedback play crucial roles in determining causal attribution belief, as well as achievement motivations.

  3. Capturing farm diversity with hypothesis-based typologies: An innovative methodological framework for farming system typology development

    PubMed Central

    Alvarez, Stéphanie; Timler, Carl J.; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A.; Groot, Jeroen C. J.

    2018-01-01

    Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that the methodological decisions on data collection, variable selection, data-reduction and clustering techniques can bear a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation by using an example from Zambia’s Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that the typology development should be guided by a hypothesis on the local agriculture features and the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. Using a well-defined hypothesis and the presented methodological framework, which consolidates the hypothesis through local expert knowledge for the creation of typologies, warrants development of less subjective and more contextualized quantitative farm typologies. PMID:29763422

  4. Capturing farm diversity with hypothesis-based typologies: An innovative methodological framework for farming system typology development.

    PubMed

    Alvarez, Stéphanie; Timler, Carl J; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A; Groot, Jeroen C J

    2018-01-01

    Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that the methodological decisions on data collection, variable selection, data-reduction and clustering techniques can bear a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation by using an example from Zambia's Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that the typology development should be guided by a hypothesis on the local agriculture features and the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. Using a well-defined hypothesis and the presented methodological framework, which consolidates the hypothesis through local expert knowledge for the creation of typologies, warrants development of less subjective and more contextualized quantitative farm typologies.
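
    A minimal sketch of the statistical pipeline named in the abstract (PCA for data reduction followed by Ward hierarchical clustering), assuming standardized farm-survey variables in a farms-by-variables array; the hypothesis-driven variable selection the paper argues for happens before this code runs, and all names and parameters here are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    def build_typology(farm_data, n_components=3, n_types=4):
        """farm_data: farms x variables array (variables pre-selected under an
        expert-informed hypothesis and standardized). PCA for data reduction,
        then Ward hierarchical clustering (HCA) into n_types farm types."""
        scores = PCA(n_components=n_components).fit_transform(farm_data)
        tree = linkage(scores, method="ward")
        return fcluster(tree, t=n_types, criterion="maxclust")  # labels 1..n_types

    # Hypothetical usage with fabricated standardized survey variables:
    rng = np.random.default_rng(42)
    farm_types = build_typology(rng.normal(size=(120, 8)))
    ```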

  5. Test of a hypothesis of realism in quantum theory using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Nikitin, N.; Toms, K.

    2017-05-01

    In this paper we propose a time-independent equality and time-dependent inequality, suitable for an experimental test of the hypothesis of realism. The derivation of these relations is based on the concept of conditional probability and on Bayes' theorem in the framework of Kolmogorov's axiomatics of probability theory. The equality obtained is intrinsically different from the well-known Greenberger-Horne-Zeilinger (GHZ) equality and its variants, because violation of the proposed equality might be tested in experiments with only two microsystems in a maximally entangled Bell state |Ψ−⟩, while a test of the GHZ equality requires at least three quantum systems in a special state |Ψ_GHZ⟩. The obtained inequality differs from Bell's, Wigner's, and Leggett-Garg inequalities, because it deals with spin s = 1/2 projections onto only two nonparallel directions at two different moments of time, while a test of the Bell and Wigner inequalities requires at least three nonparallel directions, and a test of the Leggett-Garg inequalities requires at least three distinct moments of time. Hence, the proposed inequality seems to open an additional experimental possibility to avoid the "contextuality loophole." Violation of the proposed equality and inequality is illustrated with the behavior of a pair of anticorrelated spins in an external magnetic field and also with the oscillations of flavor-entangled pairs of neutral pseudoscalar mesons.

  6. Fluid cognitive ability is a resource for successful emotion regulation in older and younger adults

    PubMed Central

    Opitz, Philipp C.; Lee, Ihno A.; Gross, James J.; Urry, Heather L.

    2014-01-01

    The Selection, Optimization, and Compensation with Emotion Regulation (SOC-ER) framework suggests that (1) emotion regulation (ER) strategies require resources and that (2) higher levels of relevant resources may increase ER success. In the current experiment, we tested the specific hypothesis that individual differences in one internal class of resources, namely cognitive ability, would contribute to greater success using cognitive reappraisal (CR), a form of ER in which one reinterprets the meaning of emotion-eliciting situations. To test this hypothesis, 60 participants (30 younger and 30 older adults) completed standardized neuropsychological tests that assess fluid and crystallized cognitive ability, as well as a CR task in which participants reinterpreted the meaning of sad pictures in order to alter (increase or decrease) their emotions. In a control condition, they viewed the pictures without trying to change how they felt. Throughout the task, we indexed subjective emotional experience (self-reported ratings of emotional intensity), expressive behavior (corrugator muscle activity), and autonomic physiology (heart rate and electrodermal activity) as measures of emotional responding. Multilevel models were constructed to explain within-subjects variation in emotional responding as a function of ER contrasts comparing increase or decrease conditions with the view control condition and between-subjects variation as a function of cognitive ability and/or age group (older, younger). As predicted, higher fluid cognitive ability—indexed by perceptual reasoning, processing speed, and working memory—was associated with greater success using reappraisal to alter emotional responding. Reappraisal success did not vary as a function of crystallized cognitive ability or age group. Collectively, our results provide support for a key tenet of the SOC-ER framework that higher levels of relevant resources may confer greater success at emotion regulation. PMID:24987387

  7. Fluid cognitive ability is a resource for successful emotion regulation in older and younger adults.

    PubMed

    Opitz, Philipp C; Lee, Ihno A; Gross, James J; Urry, Heather L

    2014-01-01

    The Selection, Optimization, and Compensation with Emotion Regulation (SOC-ER) framework suggests that (1) emotion regulation (ER) strategies require resources and that (2) higher levels of relevant resources may increase ER success. In the current experiment, we tested the specific hypothesis that individual differences in one internal class of resources, namely cognitive ability, would contribute to greater success using cognitive reappraisal (CR), a form of ER in which one reinterprets the meaning of emotion-eliciting situations. To test this hypothesis, 60 participants (30 younger and 30 older adults) completed standardized neuropsychological tests that assess fluid and crystallized cognitive ability, as well as a CR task in which participants reinterpreted the meaning of sad pictures in order to alter (increase or decrease) their emotions. In a control condition, they viewed the pictures without trying to change how they felt. Throughout the task, we indexed subjective emotional experience (self-reported ratings of emotional intensity), expressive behavior (corrugator muscle activity), and autonomic physiology (heart rate and electrodermal activity) as measures of emotional responding. Multilevel models were constructed to explain within-subjects variation in emotional responding as a function of ER contrasts comparing increase or decrease conditions with the view control condition and between-subjects variation as a function of cognitive ability and/or age group (older, younger). As predicted, higher fluid cognitive ability-indexed by perceptual reasoning, processing speed, and working memory-was associated with greater success using reappraisal to alter emotional responding. Reappraisal success did not vary as a function of crystallized cognitive ability or age group. Collectively, our results provide support for a key tenet of the SOC-ER framework that higher levels of relevant resources may confer greater success at emotion regulation.

  8. A Unified Mixed-Effects Model for Rare-Variant Association in Sequencing Studies

    PubMed Central

    Sun, Jianping; Zheng, Yingye; Hsu, Li

    2013-01-01

    For rare-variant association analysis, due to the extremely low frequencies of these variants, it is necessary to aggregate them by a prior set (e.g., genes and pathways) in order to achieve adequate power. In this paper, we consider hierarchical models to relate a set of rare variants to phenotype by modeling the effects of variants as a function of variant characteristics while allowing for variant-specific effects (heterogeneity). We derive a set of two score statistics, testing the group effect by variant characteristics and the heterogeneity effect. We make a novel modification to these score statistics so that they are independent under the null hypothesis and their asymptotic distributions can be derived. As a result, the computational burden is greatly reduced compared with permutation-based tests. Our approach provides a general testing framework for rare-variant association, which includes many commonly used tests, such as the burden test [Li and Leal, 2008] and the sequence kernel association test [Wu et al., 2011], as special cases. Furthermore, in contrast to these tests, our proposed test has an added capacity to identify which components of variant characteristics and heterogeneity contribute to the association. Simulations under a wide range of scenarios show that the proposed test is valid, robust and powerful. An application to the Dallas Heart Study illustrates that apart from identifying genes with significant associations, the new method also provides additional information regarding the source of the association. Such information may be useful for generating hypotheses in future studies. PMID:23483651
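
    For orientation, the classic burden test that the proposed framework contains as a special case collapses a subject's rare alleles within a set into one score and tests its regression slope; a minimal sketch follows. The paper's score statistics go further by modeling variant characteristics and variant-specific heterogeneity, which this sketch does not attempt.

    ```python
    import numpy as np
    from scipy import stats

    def burden_test(genotypes, phenotype):
        """genotypes: subjects x variants NumPy array of rare-allele counts for
        one gene/set; phenotype: continuous trait. Collapses variants into a
        single burden score and tests its slope with simple linear regression."""
        burden = genotypes.sum(axis=1)          # aggregate rare alleles per subject
        slope, _, _, pval, _ = stats.linregress(burden, phenotype)
        return slope, pval
    ```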

  9. Validation of educational assessments: a primer for simulation and beyond.

    PubMed

    Cook, David A; Hatala, Rose

    2016-01-01

    Simulation plays a vital role in health professions assessment. This review provides a primer on assessment validation for educators and education researchers. We focus on simulation-based assessment of health professionals, but the principles apply broadly to other assessment approaches and topics. Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results. Contemporary frameworks view validity as a hypothesis, and validity evidence is collected to support or refute the validity hypothesis (i.e., that the proposed interpretations and decisions are defensible). In validation, the educator or researcher defines the proposed interpretations and decisions, identifies and prioritizes the most questionable assumptions in making these interpretations and decisions (the "interpretation-use argument"), empirically tests those assumptions using existing or newly-collected evidence, and then summarizes the evidence as a coherent "validity argument." A framework proposed by Messick identifies potential evidence sources: content, response process, internal structure, relationships with other variables, and consequences. Another framework proposed by Kane identifies key inferences in generating useful interpretations: scoring, generalization, extrapolation, and implications/decision. We propose an eight-step approach to validation that applies to either framework: Define the construct and proposed interpretation, make explicit the intended decision(s), define the interpretation-use argument and prioritize needed validity evidence, identify candidate instruments and/or create/adapt a new instrument, appraise existing evidence and collect new evidence as needed, keep track of practical issues, formulate the validity argument, and make a judgment: does the evidence support the intended use? Rigorous validation first prioritizes and then empirically evaluates key assumptions in the interpretation and use of assessment scores. Validation science would be improved by more explicit articulation and prioritization of the interpretation-use argument, greater use of formal validation frameworks, and more evidence informing the consequences and implications of assessment.

  10. Conducting Human Research

    DTIC Science & Technology

    2009-08-05

    [Report text not cleanly recoverable; fragments mention socio-cultural data acquisition, extraction, and management, and a brief discussion of theoretical frameworks. Subject terms: human behavior, theoretical framework, hypothesis development, experimental design, ethical research, statistical power, human laboratory research.]

  11. Avoiding false discoveries in association studies.

    PubMed

    Sabatti, Chiara

    2007-01-01

    We consider the problem of controlling false discoveries in association studies. We assume that the design of the study is adequate so that the "false discoveries" are potentially due only to random chance, not to confounding or other flaws. Under this premise, we review the statistical framework for hypothesis testing and correction for multiple comparisons. We consider in detail the currently accepted strategies in linkage analysis. We then examine the underlying similarities and differences between linkage and association studies and document some of the most recent methodological developments for association mapping.
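
    As one concrete correction for multiple comparisons of the kind reviewed here, the sketch below implements the Benjamini-Hochberg step-up procedure and contrasts it with the Bonferroni threshold; the p-values are hypothetical.

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Step-up BH procedure: boolean mask of discoveries with FDR
        controlled at level q (assumes independent or PRDS p-values)."""
        p = np.asarray(pvals)
        m = len(p)
        order = np.argsort(p)
        thresholds = q * np.arange(1, m + 1) / m      # i/m * q for rank i
        below = p[order] <= thresholds
        k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
        mask = np.zeros(m, dtype=bool)
        mask[order[:k]] = True                        # reject the k smallest
        return mask

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.9]
    print(benjamini_hochberg(pvals))                  # first two are discoveries
    # Bonferroni, by contrast, would require p <= 0.05 / 8 = 0.00625.
    ```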

  12. A mechanistic hypothesis of the factors that enhance vulnerability to nicotine use in females

    PubMed Central

    O'Dell, Laura E.; Torres, Oscar V.

    2013-01-01

    Women are particularly more vulnerable to tobacco use than men. This review proposes a unifying hypothesis that females experience greater rewarding effects of nicotine and more intense stress produced by withdrawal than males. We also provide a neural framework whereby estrogen promotes greater rewarding effects of nicotine in females via enhanced dopamine release in the nucleus accumbens (NAcc). During withdrawal, we suggest that corticotropin-releasing factor (CRF) stress systems are sensitized and promote a greater suppression of dopamine release in the NAcc of females versus males. Taken together, females display enhanced nicotine reward via estrogen and amplified effects of withdrawal via stress systems. Although this framework focuses on sex differences in adult rats, it is also applied to adolescent females who display enhanced rewarding effects of nicotine, but reduced effects of withdrawal from this drug. Since females experience strong rewarding effects of nicotine, a clinical implication of our hypothesis is that specific strategies to prevent smoking initiation among females are critical. Also, anxiolytic medications may be more effective in females that experience intense stress during withdrawal. Furthermore, medications that target withdrawal should not be applied in a unilateral manner across age and sex, given that nicotine withdrawal is lower during adolescence. This review highlights key factors that promote nicotine use in females, and future studies on sex-dependent interactions of stress and reward systems are needed to test our mechanistic hypotheses. Future studies in this area will have important translational value toward reducing health disparities produced by nicotine use in females. PMID:23684991

  13. Thinking in Pictures as a Cognitive Account of Autism

    ERIC Educational Resources Information Center

    Kunda, Maithilee; Goel, Ashok K.

    2011-01-01

    We analyze the hypothesis that some individuals on the autism spectrum may use visual mental representations and processes to perform certain tasks that typically developing individuals perform verbally. We present a framework for interpreting empirical evidence related to this "Thinking in Pictures" hypothesis and then provide…

  14. Dam operations may improve aquatic habitat and offset negative effects of climate change.

    PubMed

    Benjankar, Rohan; Tonina, Daniele; McKean, James A; Sohrabi, Mohammad M; Chen, Quiwen; Vidergar, Dmitri

    2018-05-01

    Dam operation impacts on stream hydraulics and ecological processes are well documented, but their effects depend on the geographical region and vary spatially and temporally. Many studies have quantified these effects on aquatic ecosystems based mostly on flow hydraulics, overlooking stream water temperature and climatic conditions. Here, we used an integrated modeling framework, an ecohydraulics virtual watershed, that links catchment hydrology, hydraulics, stream water temperature and aquatic habitat models to test the hypothesis that reservoir management may help to mitigate some impacts caused by climate change on downstream flows and temperature. To address this hypothesis we applied the model to analyze the impact of reservoir operation (regulated flows) on the habitat of Bull Trout, a cold-water obligate salmonid, against unregulated flows for dry, average, and wet climatic conditions in the South Fork Boise River (SFBR), Idaho, USA. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Oxytocin tempers calculated greed but not impulsive defense in predator-prey contests.

    PubMed

    De Dreu, Carsten K W; Scholte, H Steven; van Winden, Frans A A M; Ridderinkhof, K Richard

    2015-05-01

    Human cooperation and competition is modulated by oxytocin, a hypothalamic neuropeptide that functions as both hormone and neurotransmitter. Oxytocin's functions can be captured in two explanatory yet largely contradictory frameworks: the fear-dampening (FD) hypothesis that oxytocin has anxiolytic effects and reduces fear-motivated action; and the social approach/avoidance (SAA) hypothesis that oxytocin increases cooperative approach and facilitates protection against aversive stimuli and threat. We tested derivations from both frameworks in a novel predator-prey contest game. Healthy males given oxytocin or placebo invested as predator to win their prey's endowment, or as prey to protect their endowment against predation. Neural activity was registered using 3T-MRI. In prey, (fear-motivated) investments were fast and conditioned on the amygdala. Inconsistent with FD, oxytocin did not modulate neural and behavioral responding in prey. In predators, (greed-motivated) investments were slower, and conditioned on the superior frontal gyrus (SFG). Consistent with SAA, oxytocin reduced predator investment, time to decide and activation in SFG. Thus, whereas oxytocin does not incapacitate the impulsive ability to protect and defend oneself, it lowers the greedy and more calculated appetite for coming out ahead. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  16. Spatiotemporal integration of molecular and anatomical data in virtual reality using semantic mapping.

    PubMed

    Soh, Jung; Turinsky, Andrei L; Trinh, Quang M; Chang, Jasmine; Sabhaney, Ajay; Dong, Xiaoli; Gordon, Paul Mk; Janzen, Ryan Pw; Hau, David; Xia, Jianguo; Wishart, David S; Sensen, Christoph W

    2009-01-01

    We have developed a computational framework for spatiotemporal integration of molecular and anatomical datasets in a virtual reality environment. Using two case studies involving gene expression data and pharmacokinetic data, respectively, we demonstrate how existing knowledge bases for molecular data can be semantically mapped onto a standardized anatomical context of human body. Our data mapping methodology uses ontological representations of heterogeneous biomedical datasets and an ontology reasoner to create complex semantic descriptions of biomedical processes. This framework provides a means to systematically combine an increasing amount of biomedical imaging and numerical data into spatiotemporally coherent graphical representations. Our work enables medical researchers with different expertise to simulate complex phenomena visually and to develop insights through the use of shared data, thus paving the way for pathological inference, developmental pattern discovery and biomedical hypothesis testing.

  17. A Multi-Year Plan for Research, Development, and Prototype Testing of Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Brennan T.; Welch, Tim; Witt, Adam M.

    The Multi-Year Plan for Research, Development, and Prototype Testing of Standard Modular Hydropower Technology (MYRP) presents a strategy for specifying, designing, testing, and demonstrating the efficacy of standard modular hydropower (SMH) as an environmentally compatible and cost-optimized renewable electricity generation technology. The MYRP provides the context, background, and vision for testing the SMH hypothesis: if standardization, modularity, and preservation of stream functionality become essential and fully realized features of hydropower technology, project design, and regulatory processes, they will enable previously unrealized levels of new project development with increased acceptance, reduced costs, increased predictability of outcomes, and increased value to stakeholders. To achieve success in this effort, the MYRP outlines a framework of stakeholder-validated criteria, models, design tools, testing facilities, and assessment protocols that will facilitate the development of next-generation hydropower technologies.

  18. IQ as moderator of terminal decline in perceptual and motor speed, spatial, and verbal ability: Testing the cognitive reserve hypothesis in a population-based sample followed from age 70 until death.

    PubMed

    Thorvaldsson, Valgeir; Skoog, Ingmar; Johansson, Boo

    2017-03-01

    Terminal decline (TD) refers to acceleration in within-person cognitive decline prior to death. The cognitive reserve hypothesis postulates that individuals with higher IQ are able to better tolerate age-related increases in brain pathologies. On average, they will exhibit a later onset of TD, but once they start to decline, their trajectory is steeper relative to those with lower IQ. We tested these predictions using data from initially nondemented individuals (n = 179) in the H70-study, repeatedly measured at ages 70, 75, 79, 81, 85, 88, 90, 92, 95, 97, 99, and 100, or until death, on cognitive tests of perceptual and motor speed and spatial and verbal ability. We quantified IQ using the Raven's Coloured Progressive Matrices (RCPM) test administered at age 70. We fitted random change point TD models to the data, within a Bayesian framework, conditioned on IQ, age of death, education, and sex. In line with predictions, we found that 1 additional standard deviation on the IQ scale was associated with a delay in onset of TD of 1.87 (95% highest density interval [HDI; 0.20, 4.08]) years on speed and 1.96 (95% HDI [0.15, 3.54]) years on verbal ability, but only 0.88 (95% HDI [-0.93, 3.49]) years on spatial ability. Higher IQ was associated with a steeper rate of decline within the TD phase on measures of speed and verbal ability, whereas results on spatial ability were nonconclusive. Our findings provide partial support for the cognitive reserve hypothesis and demonstrate that IQ can be a significant moderator of cognitive change trajectories in old age.
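
    The random change point idea can be illustrated for a single individual with a simple least-squares profile search over candidate change point locations; the paper's Bayesian hierarchical model additionally lets the change point and both slopes vary across people and conditions them on covariates. A minimal sketch in Python, with invented ages and scores:

        import numpy as np

        # Hypothetical cognitive scores for one individual, with an
        # onset of accelerated (terminal) decline around age 90.
        ages = np.array([70, 75, 79, 81, 85, 88, 90, 92, 95, 97], float)
        scores = np.array([30.0, 29.5, 29.0, 28.4, 27.8, 27.2,
                           26.8, 24.0, 21.5, 18.8])

        def fit_change_point(x, y, n_grid=200):
            # Piecewise-linear model: y = b0 + b1*x + b2*max(x - tau, 0);
            # profile over the change point tau by least squares.
            best_sse, best_tau, best_beta = np.inf, None, None
            for tau in np.linspace(x[1], x[-2], n_grid):
                X = np.column_stack([np.ones_like(x), x,
                                     np.maximum(x - tau, 0.0)])
                beta, *_ = np.linalg.lstsq(X, y, rcond=None)
                sse = float(np.sum((y - X @ beta) ** 2))
                if sse < best_sse:
                    best_sse, best_tau, best_beta = sse, tau, beta
            return best_tau, best_beta

        tau, beta = fit_change_point(ages, scores)
        print(f"estimated onset of terminal decline: age {tau:.1f}")
        print(f"slope before: {beta[1]:.2f}, after: {beta[1] + beta[2]:.2f}")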

  19. Explorations in statistics: hypothesis tests and P values.

    PubMed

    Curran-Everett, Douglas

    2009-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of Explorations in Statistics delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what we observe in the experiment to what we expect to see if the null hypothesis is true. The P value associated with the magnitude of that test statistic answers this question: if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got? Although statisticians continue to stress the limitations of hypothesis tests, there are two realities we must acknowledge: hypothesis tests are ingrained within science, and the simple test of a null hypothesis can be useful. As a result, it behooves us to explore the notions of hypothesis tests, test statistics, and P values.
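
    The definition of a P value quoted above can be made concrete with a short simulation. In the sketch below (all data invented), the test statistic is a difference in sample means, the null distribution is generated by permuting group labels, and the P value is the proportion of null statistics at least as extreme as the observed one:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical data: two small samples whose means we compare.
        a = np.array([4.1, 5.3, 4.8, 5.9, 5.2])
        b = np.array([5.8, 6.1, 5.4, 6.7, 6.0])

        # Test statistic: difference in sample means.
        observed = b.mean() - a.mean()

        # Null distribution by permutation: if the null hypothesis is
        # true, group labels are exchangeable, so reshuffle them.
        pooled = np.concatenate([a, b])
        perm_stats = np.empty(10_000)
        for i in range(perm_stats.size):
            rng.shuffle(pooled)
            perm_stats[i] = pooled[len(a):].mean() - pooled[:len(a)].mean()

        # Two-sided P value: proportion of null statistics at least as
        # extreme (in absolute value) as the one we observed.
        p_value = np.mean(np.abs(perm_stats) >= abs(observed))
        print(f"observed difference = {observed:.3f}, P = {p_value:.4f}")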

  20. Development of an Online and Offline Integration Hypothesis for Healthy Internet Use: Theory and Preliminary Evidence

    PubMed Central

    Lin, Xiaoyan; Su, Wenliang; Potenza, Marc N.

    2018-01-01

    The Internet has become an integral part of our daily life, and how to make the best use of the Internet is important to both individuals and the society. Based on previous studies, an Online and Offline Integration Hypothesis is proposed to suggest a framework for considering harmonious and balanced Internet use. The Integration Hypothesis proposes that healthier patterns of Internet usage may be achieved through harmonious integration of people’s online and offline worlds. An online/offline integration is proposed to unite self-identity, interpersonal relationships, and social functioning with both cognitive and behavioral aspects by following the principles of communication, transfer, consistency, and “offline-first” priorities. To begin to test the hypothesis regarding the relationship between integration level and psychological outcomes, data for the present study were collected from 626 undergraduate students (41.5% males). Participants completed scales for online and offline integration, Internet addiction, pros and cons of Internet use, loneliness, extraversion, and life satisfaction. The findings revealed that subjects with higher level of online/offline integration have higher life satisfaction, greater extraversion, and more positive perceptions of the Internet and less loneliness, lower Internet addiction, and fewer negative perceptions of the Internet. Integration mediates the link between extraversion and psychological outcomes, and it may be the mechanism underlying the difference between the “rich get richer” and social compensation hypotheses. The implications of the online and offline integration hypothesis are discussed. PMID:29706910

  2. Not Just a Sum? Identifying Different Types of Interplay between Constituents in Combined Interventions

    PubMed Central

    Van Deun, Katrijn; Thorrez, Lieven; van den Berg, Robert A.; Smilde, Age K.; Van Mechelen, Iven

    2015-01-01

    Motivation Experiments in which the effect of combined manipulations is compared with the effects of their pure constituents have received a great deal of attention. Examples include the study of combination therapies and the comparison of double and single knockout model organisms. Often the effect of the combined manipulation is not a mere addition of the effects of its constituents, with quite different forms of interplay between the constituents being possible. Yet, a well-formalized taxonomy of possible forms of interplay is lacking, let alone a statistical methodology to test for their presence in empirical data. Results Starting from a taxonomy of a broad range of forms of interplay between constituents of a combined manipulation, we propose a sound statistical hypothesis testing framework to test for the presence of each particular form of interplay. We illustrate the framework with analyses of public gene expression data on the combined treatment of dendritic cells with curdlan and GM-CSF and show that these lead to valuable insights into the mode of action of the constituent treatments and their combination. Availability and Implementation R code implementing the statistical testing procedure for microarray gene expression data is available as supplementary material. The data are available from the Gene Expression Omnibus with accession number GSE32986. PMID:25965065
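
    The paper's taxonomy covers many forms of interplay, but the most familiar special case is easy to sketch: fit a linear model with an interaction term and test whether the combined effect departs from the sum of the constituent effects. The code below uses simulated data for a single response and statsmodels; it illustrates the additivity baseline only, not the authors' full procedure or their R implementation:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)

        # Hypothetical measurements for one response under a 2x2 design:
        # control, constituent A alone, constituent B alone, A+B combined.
        n = 20  # replicates per condition
        df = pd.DataFrame({
            "A": np.repeat([0, 1, 0, 1], n),
            "B": np.repeat([0, 0, 1, 1], n),
        })
        # Simulate a combined effect larger than additive (+0.8 synergy).
        df["y"] = (1.0 * df["A"] + 1.5 * df["B"] + 0.8 * df["A"] * df["B"]
                   + rng.normal(scale=0.5, size=len(df)))

        # Under pure additivity the interaction coefficient is zero, so
        # a t-test of the A:B term tests for this one form of interplay.
        fit = smf.ols("y ~ A + B + A:B", data=df).fit()
        print(fit.params)
        print("interaction p-value:", fit.pvalues["A:B"])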

  4. Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge

    PubMed Central

    2014-01-01

    Background Combining different sources of knowledge to build improved structure activity relationship models is not easy owing to the variety of knowledge formats and the absence of a common framework to interoperate between learning techniques. Most of the current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility to directly combine these sources at the knowledge level, with the aim to harvest potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into hierarchical network. This unification permits to combine different sources of knowledge into a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems along with an illustrative application to the prediction of mutagenicity. Conclusion It is possible to represent knowledge in the unified form of a hypothesis network allowing interpretable predictions with performances comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high level reasoning and meta-learning can be applied; these latter perspectives will be explored in future work. PMID:24959206

  5. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    PubMed

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heartbeat data. Modeling the process's non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
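
    A stripped-down version of the generalized likelihood ratio computation for binned counts is sketched below: the null template is one constant rate, the alternative is a two-segment template whose break point is chosen by maximum likelihood. The paper's dynamic program searches much richer template families and adds a multiple-testing scheme across templates; the data here are simulated:

        import numpy as np
        from scipy.special import gammaln

        rng = np.random.default_rng(2)

        def pois_loglik(counts, rate):
            # Log-likelihood of counts under a common Poisson rate.
            if rate <= 0:
                return 0.0 if np.all(counts == 0) else -np.inf
            return np.sum(counts * np.log(rate) - rate - gammaln(counts + 1))

        # Hypothetical binned event counts whose rate doubles halfway through.
        counts = np.concatenate([rng.poisson(3.0, 50), rng.poisson(6.0, 50)])

        # Null template: a single constant rate (MLE = overall mean).
        ll0 = pois_loglik(counts, counts.mean())

        # Alternative template: two constant segments; profile over the
        # break point, with each segment's mean as its MLE rate.
        ll1 = max(pois_loglik(counts[:k], counts[:k].mean())
                  + pois_loglik(counts[k:], counts[k:].mean())
                  for k in range(1, len(counts)))

        glr = 2 * (ll1 - ll0)  # generalized likelihood ratio statistic
        print(f"GLR = {glr:.1f}")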

  6. Crime, shame, reintegration, and cross-national homicide: a partial test of reintegrative shaming theory.

    PubMed

    Schaible, Lonnie M; Hughes, Lorine A

    2011-01-01

    Reintegrative shaming theory (RST) argues that social aggregates characterized by high levels of communitarianism and nonstigmatizing shaming practices benefit from relatively low levels of crime. We combine aggregate measures from the World Values Survey with available macro-level data to test this hypothesis. Additionally, we examine the extent to which communitarianism and shaming mediate the effects of cultural and structural factors featured prominently in other macro-level theoretical frameworks (e.g., inequality, modernity, sex ratio, etc.). Findings provide some support for RST, showing homicide to vary with societal levels of communitarianism and informal stigmatization. However, while the effects of modernity and sex ratio were mediated by RST processes, suppression was indicated for economic inequality. Implications for theory and research are discussed.

  7. Numerical test of the Edwards conjecture shows that all packings are equally probable at jamming

    NASA Astrophysics Data System (ADS)

    Martiniani, Stefano; Schrenk, K. Julian; Ramola, Kabir; Chakraborty, Bulbul; Frenkel, Daan

    2017-09-01

    In the late 1980s, Sam Edwards proposed a possible statistical-mechanical framework to describe the properties of disordered granular materials. A key assumption underlying the theory was that all jammed packings are equally likely. In the intervening years it has never been possible to test this bold hypothesis directly. Here we present simulations that provide direct evidence that at the unjamming point, all packings of soft repulsive particles are equally likely, even though generically, jammed packings are not. Typically, jammed granular systems are observed precisely at the unjamming point since grains are not very compressible. Our results therefore support Edwards’ original conjecture. We also present evidence that at unjamming the configurational entropy of the system is maximal.

  8. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits.

    Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions.

    In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative.

    Statistical testing of hypotheses is a common task and a wide range of possible testing procedures exist. Jolliffe and Stephenson (2003) present different forecast verifications from atmospheric science, among them likelihood testing of probability forecasts and testing the occurrence of binary events. Testing binary events requires that for each forecasted event, the spatial, temporal, and magnitude limits be given. Although major earthquakes can be considered binary events, the models within the RELM project express their forecasts on a spatial grid and in 0.1 magnitude units; thus the results are a distribution of rates over space and magnitude. These forecasts can be tested with likelihood tests.

    In general, likelihood tests assume a valid null hypothesis against which a given hypothesis is tested. The outcome is either a rejection of the null hypothesis in favor of the test hypothesis, or a nonrejection, meaning the test hypothesis cannot outperform the null hypothesis at a given significance level. Within RELM, there is no accepted null hypothesis, and thus the likelihood test needs to be expanded to allow comparable testing of equipollent hypotheses.

    To test models against one another, we require that forecasts are expressed in a standard format: the average rate of earthquake occurrence within pre-specified limits of hypocentral latitude, longitude, depth, magnitude, time period, and focal mechanisms. Focal mechanisms should either be described as the inclination of the P-axis, the declination of the P-axis, and the inclination of the T-axis, or as strike, dip, and rake angles. Schorlemmer and Gerstenberger (2007, this issue) designed classes of these parameters such that similar models will be tested against each other. These classes make the forecasts comparable between models. Additionally, we are limited to testing only what is precisely defined and consistently reported in earthquake catalogs. Therefore it is currently not possible to test such information as fault rupture length or area, asperity location, etc. Also, to account for data quality issues, we allow for location and magnitude uncertainties as well as the probability that an event is dependent on another event.

    As mentioned above, only models with comparable forecasts can be tested against each other. Our current tests are designed to examine grid-based models. This requires that any fault-based model be adapted to a grid before testing is possible. While this is a limitation of the testing, it is an inherent difficulty in any such comparative testing. Please refer to appendix B for a statistical evaluation of the application of the Poisson hypothesis to fault-based models. The testing suite we present consists of three different tests: L-Test, N-Test, and R-Test. These tests are defined similarly to Kagan and Jackson (1995). The first two tests examine the consistency of the hypotheses with the observations, while the last test compares the spatial performances of the models.
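
    The flavor of the L-Test and N-Test can be sketched in a few lines: score the observed catalog by its joint Poisson log-likelihood under a gridded forecast, then judge that score against catalogs simulated from the forecast itself. The rates below are hypothetical and the code is a schematic of the testing idea, not the RELM implementation:

        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(3)

        # Hypothetical forecast: expected counts per space-magnitude bin.
        forecast = rng.uniform(0.01, 0.5, size=200)

        # Hypothetical observed catalog, binned the same way.
        observed = rng.poisson(forecast)

        def joint_loglik(counts, rates):
            # Joint log-likelihood: sum of independent Poisson terms.
            return poisson.logpmf(counts, rates).sum()

        L_obs = joint_loglik(observed, forecast)

        # L-Test idea: simulate catalogs from the forecast itself and
        # record how often their likelihood falls below the observed
        # one; a very small quantile signals inconsistency.
        sims = np.array([joint_loglik(rng.poisson(forecast), forecast)
                         for _ in range(5000)])
        gamma = np.mean(sims <= L_obs)
        print(f"log-likelihood = {L_obs:.1f}, quantile gamma = {gamma:.3f}")

        # N-Test idea: compare total observed and forecast event counts.
        print("observed N =", int(observed.sum()),
              " forecast N =", round(forecast.sum(), 1))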

  9. Retrieval attempts enhance learning, but retrieval success (versus failure) does not matter.

    PubMed

    Kornell, Nate; Klein, Patricia Jacobs; Rawson, Katherine A

    2015-01-01

    Retrieving information from memory enhances learning. We propose a 2-stage framework to explain the benefits of retrieval. Stage 1 takes place as one attempts to retrieve an answer, which activates knowledge related to the retrieval cue. Stage 2 begins when the answer becomes available, at which point appropriate connections are strengthened and inappropriate connections may be weakened. This framework raises a basic question: does it matter whether Stage 2 is initiated via successful retrieval or via an external presentation of the answer? To answer this question, we asked participants to attempt retrieval and then randomly assigned items (which were equivalent otherwise) to be retrieved successfully or to be copied (i.e., not retrieved). Experiments 1, 2, 4, and 5 tested assumptions necessary for interpreting Experiments 3a, 3b, and 6. Experiments 3a, 3b, and 6 did not support the hypothesis that retrieval success produces more learning than does retrieval failure followed by feedback. It appears that retrieval attempts promote learning but retrieval success per se does not.

  10. Rationality and drug use: an experimental approach.

    PubMed

    Blondel, Serge; Lohéac, Youenn; Rinaudo, Stéphane

    2007-05-01

    In rational addiction theory, higher discount rates encourage drug use. We test this hypothesis in the general framework of rationality and behaviour under risk, using an experimental design with real monetary incentives. The decisions of 34 drug addicts are compared with those of a control group. The decisions of drug users are no less consistent with standard theories of behaviour over time and under risk. Further, there is no difference in the estimated discount rate between drug users and the control group, but the former do appear to be more risk-seeking.

  11. The topographical model of multiple sclerosis

    PubMed Central

    Cook, Karin; De Nino, Scott; Fletcher, Madhuri

    2016-01-01

    Relapses and progression contribute to multiple sclerosis (MS) disease course, but neither the relationship between them nor the spectrum of clinical heterogeneity has been fully characterized. A hypothesis-driven, biologically informed model could build on the clinical phenotypes to encompass the dynamic admixture of factors underlying MS disease course. In this medical hypothesis, we put forth a dynamic model of MS disease course that incorporates localization and other drivers of disability to propose a clinical manifestation framework that visualizes MS in a clinically individualized way. The topographical model encapsulates 5 factors (localization of relapses and causative lesions; relapse frequency, severity, and recovery; and progression rate), visualized utilizing dynamic 3-dimensional renderings. The central hypothesis is that, like symptom recrudescence in Uhthoff phenomenon and pseudoexacerbations, progression clinically recapitulates prior relapse symptoms and unmasks previously silent lesions, incrementally revealing underlying lesion topography. The model uses real-time simulation software to depict disease course archetypes and illuminate several well-described but poorly reconciled phenomena including the clinical/MRI paradox and prognostic significance of lesion location and burden on disease outcomes. Utilization of this model could allow for earlier and more clinically precise identification of progressive MS and predictive implications can be empirically tested. PMID:27648465

  12. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
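
    A toy version of the dependency-graph idea can be written in a few lines: each variable registers its dependencies and an evaluator, and requesting a variable triggers recursive, cached evaluation in dependency order. The names below are illustrative only and are not the Arcos API:

        # Minimal sketch of a dependency-graph evaluator in the spirit
        # of the framework described above (illustrative names only).
        class DependencyGraph:
            def __init__(self):
                self._evaluators = {}   # variable -> (dependencies, function)
                self._cache = {}

            def register(self, name, deps, func):
                self._evaluators[name] = (deps, func)

            def evaluate(self, name):
                # Recursively evaluate dependencies first, caching results
                # so shared upstream variables are computed only once.
                if name in self._cache:
                    return self._cache[name]
                deps, func = self._evaluators[name]
                args = [self.evaluate(d) for d in deps]
                self._cache[name] = func(*args)
                return self._cache[name]

        graph = DependencyGraph()
        graph.register("porosity", [], lambda: 0.4)
        graph.register("saturation", [], lambda: 0.9)
        graph.register("water_content", ["porosity", "saturation"],
                       lambda phi, s: phi * s)
        print(graph.evaluate("water_content"))  # 0.36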

  14. Integrative Data Analysis in Clinical Psychology Research

    PubMed Central

    Hussong, Andrea M.; Curran, Patrick J.; Bauer, Daniel J.

    2013-01-01

    Integrative Data Analysis (IDA), a novel framework for conducting the simultaneous analysis of raw data pooled from multiple studies, offers many advantages including economy (i.e., reuse of extant data), power (i.e., large combined sample sizes), the potential to address new questions not answerable by a single contributing study (e.g., combining longitudinal studies to cover a broader swath of the lifespan), and the opportunity to build a more cumulative science (i.e., examining the similarity of effects across studies and potential reasons for dissimilarities). There are also methodological challenges associated with IDA, including the need to account for sampling heterogeneity across studies, to develop commensurate measures across studies, and to account for multiple sources of study differences as they impact hypothesis testing. In this review, we outline potential solutions to these challenges and describe future avenues for developing IDA as a framework for studies in clinical psychology. PMID:23394226

  15. Riemannian geometry of Hamiltonian chaos: hints for a general theory.

    PubMed

    Cerruti-Sola, Monica; Ciraolo, Guido; Franzosi, Roberto; Pettini, Marco

    2008-10-01

    We aim at assessing the validity limits of some simplifying hypotheses that, within a Riemannian geometric framework, have provided an explanation of the origin of Hamiltonian chaos and have made it possible to develop a method of analytically computing the largest Lyapunov exponent of Hamiltonian systems with many degrees of freedom. Therefore, numerical hypothesis testing has been performed for the Fermi-Pasta-Ulam beta model and for a chain of coupled rotators. These models, for which analytic computations of the largest Lyapunov exponents have been carried out in the mentioned Riemannian geometric framework, appear as paradigmatic examples to unveil the reason why the main hypothesis of quasi-isotropy of the mechanical manifolds sometimes breaks down. The breakdown is expected whenever the topology of the mechanical manifolds is nontrivial. This is an important step forward in view of developing a geometric theory of Hamiltonian chaos of general validity.

  16. Translating Behavioral Science into Practice: A Framework to Determine Science Quality and Applicability for Police Organizations.

    PubMed

    McClure, Kimberley A; McGuire, Katherine L; Chapan, Denis M

    2018-05-07

    Policy on officer-involved shootings is critically reviewed and errors in applying scientific knowledge are identified. Identifying and evaluating the most relevant science for a field-based problem is challenging. Law enforcement administrators with a clear understanding of valid science and its application are in a better position to utilize scientific knowledge for the benefit of their organizations and officers. A recommended framework is proposed for considering the validity of science and its application. Valid science emerges via hypothesis testing, replication, and extension, and is marked by peer review, known error rates, and general acceptance in its field of origin. Valid application of behavioral science requires an understanding of the methodology employed, the measures used, and the participants recruited, to determine whether the science is ready for application. Fostering a science-practitioner partnership and an organizational culture that embraces quality, empirically based policies and practices improves science-to-practice translation.

  17. The Uses of the Term Hypothesis and the Inquiry Emphasis Conflation in Science Teacher Education

    ERIC Educational Resources Information Center

    Gyllenpalm, Jakob; Wickman, Per-Olof

    2011-01-01

    This paper examines the use and role of the term "hypothesis" in science teacher education as described by teacher students. Data were collected through focus group interviews conducted at seven occasions with 32 students from six well-known Swedish universities. The theoretical framework is a sociocultural and pragmatist perspective on…

  18. Seeking health information on the web: positive hypothesis testing.

    PubMed

    Kayhan, Varol Onur

    2013-04-01

    The goal of this study is to investigate positive hypothesis testing among consumers of health information when they search the Web. After demonstrating the extent of positive hypothesis testing using Experiment 1, we conduct Experiment 2 to test the effectiveness of two debiasing techniques. A total of 60 undergraduate students searched a tightly controlled online database developed by the authors to test the validity of a hypothesis. The database had four abstracts that confirmed the hypothesis and three abstracts that disconfirmed it. Findings of Experiment 1 showed that the majority of participants (85%) exhibited positive hypothesis testing. In Experiment 2, we found that the recommendation technique was not effective in reducing positive hypothesis testing, since none of the participants assigned to this server could retrieve disconfirming evidence. Experiment 2 also showed that the incorporation technique successfully reduced positive hypothesis testing, since 75% of the participants could retrieve disconfirming evidence. Positive hypothesis testing on the Web is an understudied topic. More studies are needed to validate the effectiveness of the debiasing techniques discussed in this study and to develop new techniques. Search engine developers should consider developing new options for users so that both confirming and disconfirming evidence can be presented in search results as users test hypotheses using search engines.

  19. A comparative analysis of Science-Technology-Society standards in elementary, middle and high school state science curriculum frameworks

    NASA Astrophysics Data System (ADS)

    Tobias, Karen Marie

    An analysis of curriculum frameworks from the fifty states to ascertain compliance with the National Science Education Standards for integrating Science-Technology-Society (STS) themes is reported within this dissertation. Science standards for all fifty states were analyzed to determine if the STS criteria were integrated at the elementary, middle, and high school levels of education. The analysis determined the compliance level for each state, then compared each educational level to see if the compliance was similar across the levels. Compliance is important because research shows that using STS themes in the science classroom increases students' understanding of the concepts, increases their problem-solving skills, increases their self-efficacy with respect to science, and that students instructed using STS themes score well on science high-stakes tests. The two hypotheses for this study are: (1) There is no significant difference in the degree of compliance with Science-Technology-Society themes (derived from the National Science Education Standards) between the elementary, middle, and high school levels. (2) There is no significant difference in the degree of compliance with Science-Technology-Society themes (derived from the National Science Education Standards) between the elementary, middle, and high school levels when examined individually. The Analysis of Variance F ratio was used to determine the variance between and within the three educational levels, addressing hypothesis one. The Analysis of Variance results failed to reject the null hypothesis, meaning there is no significant difference in compliance with STS themes between the elementary, middle, and high school educational levels. The Chi-Square test was used to compare the educational levels for each individual criterion, addressing hypothesis two. The Chi-Square results showed that none of the states were equally compliant with each individual criterion across the elementary, middle, and high school levels. The National Science Education Standards were created with the input of thousands of people and over twenty scientific and educational societies. The standards were tested in numerous classrooms and showed an increase in science literacy for the students. With the No Child Left Behind legislation and Project 2061, the attainment of a science-literate society will be helped by the adoption of the NSES standards and STS themes in American classrooms.
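
    Both analyses named in the abstract are standard and easy to reproduce in outline. The sketch below uses scipy with fabricated compliance data: a one-way ANOVA across the three educational levels for hypothesis one, and a chi-square test on a compliant/non-compliant contingency table for a single criterion under hypothesis two:

        import numpy as np
        from scipy.stats import f_oneway, chi2_contingency

        rng = np.random.default_rng(4)

        # Hypothesis one: one-way ANOVA on hypothetical STS-compliance
        # scores for the 50 states at each educational level.
        elementary = rng.normal(6.0, 2.0, 50)
        middle = rng.normal(6.2, 2.0, 50)
        high = rng.normal(5.9, 2.0, 50)
        F, p = f_oneway(elementary, middle, high)
        print(f"ANOVA: F = {F:.2f}, p = {p:.3f}")  # large p -> fail to reject

        # Hypothesis two: chi-square test for one STS criterion,
        # comparing compliant vs non-compliant states at each level.
        table = np.array([[38, 30, 25],    # compliant
                          [12, 20, 25]])   # not compliant
        chi2, p2, dof, _ = chi2_contingency(table)
        print(f"chi-square: chi2 = {chi2:.2f}, dof = {dof}, p = {p2:.3f}")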

  1. A Dynamic Circuit Hypothesis for the Pathogenesis of Blepharospasm

    PubMed Central

    Peterson, David A.; Sejnowski, Terrence J.

    2017-01-01

    Blepharospasm (sometimes called “benign essential blepharospasm,” BEB) is one of the most common focal dystonias. It involves involuntary eyelid spasms, eye closure, and increased blinking. Despite the success of botulinum toxin injections and, in some cases, pharmacologic or surgical interventions, BEB treatments are not completely efficacious and only symptomatic. We could develop principled strategies for preventing and reversing the disease if we knew the pathogenesis of primary BEB. The objective of this study was to develop a conceptual framework and dynamic circuit hypothesis for the pathogenesis of BEB. The framework extends our overarching theory for the multifactorial pathogenesis of focal dystonias (Peterson et al., 2010) to incorporate a two-hit rodent model specifically of BEB (Schicatano et al., 1997). We incorporate in the framework three features critical to cranial motor control: (1) the joint influence of motor cortical regions and direct descending projections from one of the basal ganglia output nuclei, the substantia nigra pars reticulata, on brainstem motor nuclei, (2) nested loops composed of the trigeminal blink reflex arc and the long sensorimotor loop from trigeminal nucleus through thalamus to somatosensory cortex back through basal ganglia to the same brainstem nuclei modulating the reflex arc, and (3) abnormalities in the basal ganglia dopamine system that provide a sensorimotor learning substrate which, when combined with patterns of increased blinking, leads to abnormal sensorimotor mappings manifest as BEB. The framework explains experimental data on the trigeminal reflex blink excitability (TRBE) from Schicatano et al. and makes predictions that can be tested in new experimental animal models based on emerging genetics in dystonia, including the recently characterized striatal-specific D1R dopamine transduction alterations caused by the GNAL mutation. More broadly, the model will provide a guide for future efforts to mechanistically link multiple factors in the pathogenesis of BEB and facilitate simulations of how exogenous manipulations of the pathogenic factors could ultimately be used to prevent and reverse the disorder. PMID:28326032

  2. Conjoint representation of texture ensemble and location in the parahippocampal place area.

    PubMed

    Park, Jeongho; Park, Soojin

    2017-04-01

    Texture provides crucial information about the category or identity of a scene. Nonetheless, not much is known about how the texture information in a scene is represented in the brain. Previous studies have shown that the parahippocampal place area (PPA), a scene-selective part of visual cortex, responds to simple patches of texture ensemble. However, in natural scenes textures exist in spatial context within a scene. Here we tested two hypotheses that make different predictions on how textures within a scene context are represented in the PPA. The Texture-Only hypothesis suggests that the PPA represents texture ensemble (i.e., the kind of texture) as is, irrespective of its location in the scene. On the other hand, the Texture and Location hypothesis suggests that the PPA represents texture and its location within a scene (e.g., ceiling or wall) conjointly. We tested these two hypotheses across two experiments, using different but complementary methods. In experiment 1, by using multivoxel pattern analysis (MVPA) and representational similarity analysis, we found that the representational similarity of the PPA activation patterns was significantly explained by the Texture-Only hypothesis but not by the Texture and Location hypothesis. In experiment 2, using a repetition suppression paradigm, we found no repetition suppression for scenes that had the same texture ensemble but differed in location (supporting the Texture and Location hypothesis). On the basis of these results, we propose a framework that reconciles contrasting results from MVPA and repetition suppression and draw conclusions about how texture is represented in the PPA. NEW & NOTEWORTHY This study investigates how the parahippocampal place area (PPA) represents texture information within a scene context. We claim that texture is represented in the PPA at multiple levels: the texture ensemble information at the across-voxel level and the conjoint information of texture and its location at the within-voxel level. The study proposes a working hypothesis that reconciles contrasting results from multivoxel pattern analysis and repetition suppression, suggesting that the methods are complementary to each other but not necessarily interchangeable.

  3. Fast Poisson noise removal by biorthogonal Haar domain hypothesis testing

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Fadili, M. J.; Starck, J.-L.; Digel, S. W.

    2008-07-01

    Methods based on hypothesis tests (HTs) in the Haar domain are widely used to denoise Poisson count data. Facing large datasets or real-time applications, Haar-based denoisers have to use the decimated transform to meet limited-memory or computation-time constraints. Unfortunately, for regular underlying intensities, decimation yields discontinuous estimates and strong “staircase” artifacts. In this paper, we propose to combine the HT framework with the decimated biorthogonal Haar (Bi-Haar) transform instead of the classical Haar. The Bi-Haar filter bank is normalized such that the p-values of Bi-Haar coefficients (p_BH) provide a good approximation to those of Haar (p_H) for high-intensity settings or large scales; for low-intensity settings and small scales, we show that p_BH is essentially upper-bounded by p_H. Thus, we may apply the Haar-based HTs to Bi-Haar coefficients to control a prefixed false positive rate. By doing so, we benefit from the regular Bi-Haar filter bank to gain a smooth estimate while always maintaining a low computational complexity. A Fisher-approximation-based threshold implementing the HTs is also established. The efficiency of this method is illustrated on an example of hyperspectral-source-flux estimation.
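
    The hypothesis-testing step is easiest to see with the classical (orthogonal) Haar transform at a single scale, which the Bi-Haar construction refines: conditional on the sum n of two neighboring counts, the first count is Binomial(n, 1/2) under the null of equal intensity, so each detail coefficient receives an exact p-value and non-significant details are zeroed before inverting the transform. The sketch below is this classical-Haar baseline on simulated counts, not the paper's Bi-Haar filter bank:

        import numpy as np
        from scipy.stats import binomtest

        rng = np.random.default_rng(5)

        # Simulated Poisson counts: piecewise-constant intensity with a jump.
        intensity = np.concatenate([np.full(64, 5.0), np.full(64, 15.0)])
        counts = rng.poisson(intensity).astype(float)

        def denoise_haar_ht(x, alpha=0.01):
            # One level of the unnormalized Haar transform on pairs:
            # approximation = x1 + x2, detail = x1 - x2.
            x1, x2 = x[0::2], x[1::2]
            total, detail = x1 + x2, x1 - x2
            # Under H0 (equal intensity in the pair), x1 | total is
            # Binomial(n, 1/2), giving an exact p-value per detail.
            keep = np.zeros(detail.shape, dtype=bool)
            for i, (n, k) in enumerate(zip(total.astype(int), x1.astype(int))):
                if n > 0:
                    keep[i] = binomtest(k, n, 0.5).pvalue < alpha
            detail = np.where(keep, detail, 0.0)
            # Invert: x1 = (total + detail)/2, x2 = (total - detail)/2.
            out = np.empty_like(x)
            out[0::2] = (total + detail) / 2
            out[1::2] = (total - detail) / 2
            return out

        estimate = denoise_haar_ht(counts)
        print("RMSE noisy   :", np.sqrt(np.mean((counts - intensity) ** 2)))
        print("RMSE denoised:", np.sqrt(np.mean((estimate - intensity) ** 2)))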

  4. Effects of Phasor Measurement Uncertainty on Power Line Outage Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chen; Wang, Jianhui; Zhu, Hao

    2014-12-01

    Phasor measurement unit (PMU) technology provides an effective tool to enhance the wide-area monitoring systems (WAMSs) in power grids. Although extensive studies have been conducted to develop several PMU applications in power systems (e.g., state estimation, oscillation detection and control, voltage stability analysis, and line outage detection), the uncertainty aspects of PMUs have not been adequately investigated. This paper focuses on quantifying the impact of PMU uncertainty on power line outage detection and identification, in which a limited number of PMUs installed at a subset of buses are utilized to detect and identify the line outage events. Specifically, the line outage detection problem is formulated as a multi-hypothesis test, and a general Bayesian criterion is used for the detection procedure, in which the PMU uncertainty is analytically characterized. We further apply the minimum detection error criterion for the multi-hypothesis test and derive the expected detection error probability in terms of PMU uncertainty. The framework proposed provides fundamental guidance for quantifying the effects of PMU uncertainty on power line outage detection. Case studies are provided to validate our analysis and show how PMU uncertainty influences power line outage detection.
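
    The minimum-error Bayesian detection step can be sketched directly: each candidate line outage predicts a signature of voltage-angle changes at the monitored buses, PMU uncertainty enters as Gaussian measurement noise, and the detector picks the hypothesis with the largest posterior. The signatures, noise level, and dimensions below are all hypothetical:

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(6)

        # Hypothetical signatures: expected voltage-angle changes at the
        # monitored PMU buses for each candidate single-line outage.
        n_pmus, n_lines = 8, 12
        signatures = rng.normal(0.0, 1.0, size=(n_lines, n_pmus))
        priors = np.full(n_lines, 1.0 / n_lines)   # equally likely outages

        # PMU uncertainty modeled as i.i.d. Gaussian measurement noise.
        sigma = 0.3
        true_line = 7
        measurement = signatures[true_line] + rng.normal(0.0, sigma, n_pmus)

        # Minimum-error Bayesian detection: choose the hypothesis with
        # the largest posterior, i.e. prior times Gaussian likelihood.
        cov = sigma ** 2 * np.eye(n_pmus)
        log_post = np.array([
            np.log(priors[h])
            + multivariate_normal.logpdf(measurement, signatures[h], cov)
            for h in range(n_lines)
        ])
        print("detected outage on line", int(np.argmax(log_post)))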

  5. Phylogenetic evidence from freshwater crayfishes that cave adaptation is not an evolutionary dead‐end

    PubMed Central

    Stern, David B.; Breinholt, Jesse; Pedraza‐Lara, Carlos; López‐Mejía, Marilú; Owen, Christopher L.; Bracken‐Grissom, Heather; Fetzner, James W.; Crandall, Keith A.

    2017-01-01

    Caves are perceived as isolated, extreme habitats with a uniquely specialized biota, which long ago led to the idea that caves are “evolutionary dead‐ends.” This implies that cave‐adapted taxa may be doomed for extinction before they can diversify or transition to a more stable state. However, this hypothesis has not been explicitly tested in a phylogenetic framework with multiple independently evolved cave‐dwelling groups. Here, we use the freshwater crayfish, a group with dozens of cave‐dwelling species in multiple lineages, as a system to test this hypothesis. We consider historical patterns of lineage diversification and habitat transition as well as current patterns of geographic range size. We find that while cave‐dwelling lineages have small relative range sizes and rarely transition back to the surface, they exhibit remarkably similar diversification patterns to those of other habitat types and appear to be able to maintain a diversity of lineages through time. This suggests that cave adaptation is not a “dead‐end” for freshwater crayfish, which has positive implications for our understanding of biodiversity and conservation in cave habitats. PMID:28804900

  6. Sources of organisational resiliency during the Thailand floods of 2011: a test of the bonding and bridging hypotheses.

    PubMed

    Andrew, Simon; Arlikatti, Sudha; Siebeneck, Laura; Pongponrat, Kannapa; Jaikampan, Kraiwuth

    2016-01-01

    Based on the Institutional Collective Action framework, this research tests the impact of two competing hypotheses--bonding and bridging--on enhancing organisational resiliency. The bonding hypothesis posits that organisational resiliency can be achieved if an organisation works closely with others, whereas the bridging hypothesis argues that such a structure places considerable stress on an organisation and advocates for an organisation to position itself as a central actor to gain access to novel resources from a diverse set of entities to achieve resiliency. The paper analyses data gathered from semi-structured interviews with 44 public, private, and non-profit organisations serving communities affected by the Great Floods of 2011 in the Thai capital, Bangkok (urban), and in Pathum Thani (suburban) and Ayutthaya (rural) provinces. The findings suggest that: organisational resiliency was associated with the bridging effect; organisations in the rural province were more resilient than those in the suburban and urban centres; and private and non-governmental organisations generally were more resilient than public sector organisations. The findings highlight the importance of fostering multi-sector partnerships to enhance organisational resiliency for disaster response.

  7. A generalized Levene's scale test for variance heterogeneity in the presence of sample correlation and group uncertainty.

    PubMed

    Soave, David; Sun, Lei

    2017-09-01

    We generalize Levene's test for variance (scale) heterogeneity between k groups to more complex data, where there is sample correlation and group membership uncertainty. Following a two-stage regression framework, we show that least absolute deviation regression must be used in the stage 1 analysis to ensure a correct asymptotic χ²_{k-1}/(k-1) distribution of the generalized scale (gS) test statistic. We then show that the proposed gS test is independent of the generalized location test, under the joint null hypothesis of no mean and no variance heterogeneity. Consequently, we generalize the recently proposed joint location-scale (gJLS) test, valuable in settings where there is an interaction effect but one interacting variable is not available. We evaluate the proposed method via an extensive simulation study and two genetic association application studies.
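
    The classical two-stage construction that the paper generalizes is short to write out: stage 1 computes absolute deviations from each group's median (the least-absolute-deviation fit when group membership is the only covariate), and stage 2 tests for location differences among those deviations. The sketch below also cross-checks against scipy's packaged Levene test with median centering; it does not implement the paper's extensions to correlated samples or group uncertainty:

        import numpy as np
        from scipy.stats import f_oneway, levene

        rng = np.random.default_rng(7)

        # Hypothetical trait values for k = 3 groups, unequal variances.
        groups = [rng.normal(0.0, s, 100) for s in (1.0, 1.0, 1.6)]

        # Stage 1: absolute deviations from each group's median (the
        # LAD fit when group membership is the only covariate).
        deviations = [np.abs(g - np.median(g)) for g in groups]

        # Stage 2: test for location differences among the deviations;
        # a mean difference there signals variance heterogeneity.
        F, p = f_oneway(*deviations)
        print(f"two-stage scale test: F = {F:.2f}, p = {p:.4f}")

        # The same test, packaged: Levene's test with median centering.
        W, p_scipy = levene(*groups, center="median")
        print(f"scipy levene:         W = {W:.2f}, p = {p_scipy:.4f}")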

  8. Testing goodness of fit in regression: a general approach for specified alternatives.

    PubMed

    Solari, Aldo; le Cessie, Saskia; Goeman, Jelle J

    2012-12-10

    When fitting generalized linear models or the Cox proportional hazards model, it is important to have tools to test for lack of fit. Because lack of fit comes in all shapes and sizes, distinguishing among different types of lack of fit is of practical importance. We argue that an adequate diagnosis of lack of fit requires a specified alternative model. Such specification identifies the type of lack of fit the test is directed against, so that if we reject the null hypothesis, we know the direction of the departure from the model. The goodness-of-fit approach of this paper allows us to treat different types of lack of fit within a unified general framework and to consider many existing tests as special cases. Connections with penalized likelihood and random effects are discussed, and the application of the proposed approach is illustrated with medical examples. Tailored functions for goodness-of-fit testing have been implemented in the R package globaltest.

  9. Pervasive Sound Sensing: A Weakly Supervised Training Approach.

    PubMed

    Kelly, Daniel; Caulfield, Brian

    2016-01-01

    Modern smartphones present an ideal device for pervasive sensing of human behavior. Microphones have the potential to reveal key information about a person's behavior. However, they have been utilized to a significantly lesser extent than other smartphone sensors in the context of human behavior sensing. We postulate that, in order for microphones to be useful in behavior sensing applications, the analysis techniques must be flexible and allow easy modification of the types of sounds to be sensed. A simplification of the training data collection process could allow a more flexible sound classification framework. We hypothesize that detailed training, a prerequisite for the majority of sound sensing techniques, is not necessary and that a significantly less detailed and time-consuming data collection process can be carried out, allowing even a nonexpert to conduct the collection, labeling, and training process. To test this hypothesis, we implement a diverse density-based multiple instance learning framework to identify a target sound, and a bag trimming algorithm, which, using the target sound, automatically segments weakly labeled sound clips to construct an accurate training set. Experiments reveal that our hypothesis is a valid one, and results show that classifiers trained using the automatically segmented training sets were able to accurately classify unseen sound samples with accuracies comparable to supervised classifiers, achieving an average F-measure of 0.969 and 0.87 for two weakly supervised datasets.

  10. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
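
    The trick for applying K-M to left-censored concentrations is to flip the data: subtracting every value from a constant larger than the maximum turns "below detection limit" into right censoring, after which the ordinary product-limit estimator applies. A self-contained sketch with invented data (not the S-language routines described above):

        import numpy as np

        # Hypothetical concentrations; True in `censored` means "below
        # the detection limit", with `value` holding that limit.
        value = np.array([0.5, 0.5, 1.2, 2.0, 1.0, 3.5, 0.8, 1.0, 4.2, 2.7])
        censored = np.array([True, True, False, False, True,
                             False, True, False, False, False])

        # Flip trick: subtracting from a constant larger than every value
        # turns left-censored data into right-censored "survival" data.
        M = value.max() + 1.0
        t = M - value

        def kaplan_meier(times, events):
            # Standard right-censored product-limit estimator; at tied
            # times, events are processed before censored observations.
            order = np.lexsort((~events, times))
            times, events = times[order], events[order]
            at_risk = len(times)
            s, curve = 1.0, []
            for ti, ev in zip(times, events):
                if ev:
                    s *= 1.0 - 1.0 / at_risk
                    curve.append((ti, s))
                at_risk -= 1
            return curve

        # An uncensored concentration is an "event" on the flipped scale,
        # and S(t) = P(flipped time > t) = P(concentration < M - t).
        for ti, s in kaplan_meier(t, ~censored):
            print(f"P(concentration < {M - ti:.1f}) = {s:.3f}")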

  11. From computers to cultivation: reconceptualizing evolutionary psychology.

    PubMed

    Barrett, Louise; Pollet, Thomas V; Stulp, Gert

    2014-01-01

    Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behavior, and the rigor with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on "cognitive integration" or the "extended mind hypothesis" in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human "mind-making" within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach.

  12. A two-hypothesis approach to establishing a life detection/biohazard protocol for planetary samples

    NASA Astrophysics Data System (ADS)

    Conley, Catharine; Steele, Andrew

    2016-07-01

    The COSPAR policy on performing a biohazard assessment on samples brought from Mars to Earth is framed in the context of a concern for false-positive results. However, as noted during the 2012 Workshop for Life Detection in Samples from Mars (ref. Kminek et al., 2014), a more significant concern for planetary samples brought to Earth is false-negative results, because an undetected biohazard could increase risk to the Earth. This is the reason that stringent contamination control must be a high priority for all Category V Restricted Earth Return missions. A useful conceptual framework for addressing these concerns involves two complementary 'null' hypotheses: testing both of them, together, would allow statistical and community confidence to be developed regarding one or the other conclusion. As noted above, false negatives are of primary concern for the safety of the Earth, so the 'Earth Safety null hypothesis' -- which must be disproved to assure low risk to the Earth from samples introduced by Category V Restricted Earth Return missions -- is 'There is native life in these samples.' False positives are of primary concern for astrobiology, so the 'Astrobiology null hypothesis' -- which must be disproved in order to demonstrate the existence of extraterrestrial life -- is 'There is no life in these samples.' The presence of Earth contamination would render both of these hypotheses more difficult to disprove. Both hypotheses can be tested following a strict science protocol: analyse, interpret, test the hypotheses, and repeat. The science measurements are then undertaken in an iterative fashion that responds to discovery, with both hypotheses testable from interpretation of the scientific data. This is a robust, community-involved activity that ensures maximum science return with minimal sample use.

  13. The "U" Curve Hypothesis: A Framework for Making Sense of Learning to Teach in Diverse Settings

    ERIC Educational Resources Information Center

    Birrell, James R.; Tinney, Mari Vawn

    2008-01-01

    Experiences in this research study started in 1991 before many teacher educators were aware of the "U" curve hypothesis or predictable stages of culture shock and the recognizable stages used on the path to gaining intercultural competence. This study of student teachers is used here as an illustration of what happens when teachers are…

  14. Functional morphology of the bovid astragalus in relation to habitat: controlling phylogenetic signal in ecomorphology.

    PubMed

    Barr, W Andrew

    2014-11-01

    Bovid astragali are one of the most commonly preserved bones in the fossil record. Accordingly, astragali are an important target for studies seeking to predict the habitat preferences of fossil bovids based on bony anatomy. However, previous work has not tested functional hypotheses linking astragalar morphology with habitat while controlling for body size and phylogenetic signal. This article presents a functional framework relating the morphology of the bovid astragalus to habitat-specific locomotor ecology and tests four hypotheses emanating from this framework. Highly cursorial bovids living in structurally open habitats are hypothesized to differ from their less cursorial closed-habitat dwelling relatives in having (1) relatively short astragali to maintain rotational speed throughout the camming motion of the rotating astragalus, (2) a greater range of angular excursion at the hock, (3) relatively larger joint surface areas, and (4) a more pronounced "spline-and-groove" morphology promoting lateral joint stability. A diverse sample of 181 astragali from 50 extant species was scanned using a Next Engine laser scanner. Species were assigned to one of four habitat categories based on the published ecological literature. A series of 11 linear measurements and three joint surface areas were measured on each astragalus. A geometric mean body size proxy was used to size-correct the measurement data. Phylogenetic generalized least squares (PGLS) was used to test for differences between habitat categories while controlling for body size differences and phylogenetic signal. Statistically significant PGLS results support Hypotheses 1 and 2 (which are not mutually exclusive) as well as Hypothesis 3. No support was found for Hypothesis 4. These findings confirm that the morphology of the bovid astragalus is related to habitat-specific locomotor ecology, and that this relationship is statistically significant after controlling for body size and phylogeny. Thus, this study validates the use of this bone as an ecomorphological indicator. © 2014 Wiley Periodicals, Inc.
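
    A minimal numpy sketch of the geometric-mean size correction described above, using a hypothetical measurement matrix (rows are specimens, columns the 11 linear measurements); the PGLS test itself would then be run on the resulting Mosimann-style shape ratios:

        import numpy as np

        def size_correct(X):
            # Divide each measurement by the specimen's geometric mean,
            # yielding size-free shape ratios.
            gm = np.exp(np.log(X).mean(axis=1, keepdims=True))
            return X / gm

        rng = np.random.default_rng(0)
        X = rng.lognormal(mean=2.0, sigma=0.3, size=(181, 11))  # hypothetical data
        shape_ratios = size_correct(X)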

  15. The roles of categorical and coordinate spatial relations in recognizing buildings.

    PubMed

    Palermo, Liana; Piccardi, Laura; Nori, Raffaella; Giusberti, Fiorella; Guariglia, Cecilia

    2012-11-01

    Categorical spatial information is considered more useful for recognizing objects, and coordinate spatial information for guiding actions--for example, during navigation or grasping. In contrast with this assumption, we hypothesized that buildings, unlike other categories of objects, require both categorical and coordinate spatial information in order to be recognized. This hypothesis arose from evidence that right-brain-damaged patients have deficits in both coordinate judgments and recognition of buildings and from the fact that buildings are very useful for guiding navigation in urban environments. To test this hypothesis, we assessed 210 healthy college students while they performed four different tasks that required categorical and coordinate judgments and the recognition of common objects and buildings. Our results showed that both categorical and coordinate spatial representations are necessary to recognize a building, whereas only categorical representations are necessary to recognize an object. We discuss our data in view of a recent neural framework for visuospatial processing, suggesting that recognizing buildings may specifically activate the parieto-medial-temporal pathway.

  16. Unbiased estimation in seamless phase II/III trials with unequal treatment effect variances and hypothesis-driven selection rules.

    PubMed

    Robertson, David S; Prevost, A Toby; Bowden, Jack

    2016-09-30

    Seamless phase II/III clinical trials offer an efficient way to select an experimental treatment and perform confirmatory analysis within a single trial. However, combining the data from both stages in the final analysis can induce bias into the estimates of treatment effects. Methods for bias adjustment developed thus far have made restrictive assumptions about the design and selection rules followed. In order to address these shortcomings, we apply recent methodological advances to derive the uniformly minimum variance conditionally unbiased estimator for two-stage seamless phase II/III trials. Our framework allows for the precision of the treatment arm estimates to take arbitrary values, can be utilised for all treatments that are taken forward to phase III and is applicable when the decision to select or drop treatment arms is driven by a multiplicity-adjusted hypothesis testing procedure. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  17. Impact assessment: Eroding benefits through streamlining?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bond, Alan, E-mail: alan.bond@uea.ac.uk; School of Geo and Spatial Sciences, North-West University; Pope, Jenny, E-mail: jenny@integral-sustainability.net

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences on the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of the UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are 'consistency and fairness', 'early warning', 'environment and development', and 'public involvement'. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken which has had direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment.

  18. Age-Related Impairment on a Forced-Choice Version of the Mnemonic Similarity Task

    PubMed Central

    Huffman, Derek J.; Stark, Craig E. L.

    2018-01-01

    Previous studies from our lab have indicated that healthy older adults are impaired in their ability to mnemonically discriminate between previously viewed objects and similar lure objects in the Mnemonic Similarity Task (MST). These studies have used either old/similar/new or old/new test formats. The forced-choice test format (e.g., “Did you see object A or object A’ during the encoding phase?”) relies on different assumptions than the old/new test format (e.g., “Did you see this object during the encoding phase?”); hence, converging evidence from these approaches would bolster the conclusion that healthy aging is accompanied by impaired performance on the MST. Consistent with our hypothesis, healthy older adults exhibited impaired performance on a forced-choice test format that required discriminating between a target and a similar lure. We also tested the hypothesis that age-related impairments on the MST could be modeled within a global matching computational framework. We found that decreasing the probability of successful feature encoding in the models caused changes that were similar to the empirical data in healthy older adults. Collectively, our behavioral results extend to the forced-choice test format the finding that healthy aging is accompanied by an impaired ability to discriminate between targets and similar lures, and our modeling results suggest that a diminished probability of encoding stimulus features is a candidate mechanism for memory changes in healthy aging. We also discuss the ability of global matching models to account for findings in other studies that have used variants on mnemonic similarity tasks. PMID:28004951

  19. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses the method of testing a nonlinear hypothesis using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test the nonlinear hypothesis using the iterative NLLS estimator. An alternative method for testing a nonlinear hypothesis using an iterative NLLS estimator based on nonlinear studentized residuals has also been proposed. In this research article an innovative method of testing a nonlinear hypothesis using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained the methods of testing nonlinear hypotheses. This paper uses asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustration. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
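
    A hedged sketch of a Wald-type test of a nonlinear restriction g(theta) = 0 after an iterative NLLS fit, in the spirit of the methods surveyed above; the model, data, and restriction here are hypothetical illustrations, not taken from the paper:

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import chi2

        def model(x, a, b):
            return a * np.exp(b * x)

        rng = np.random.default_rng(1)
        x = np.linspace(0, 2, 100)
        y = model(x, 1.5, 0.8) + rng.normal(scale=0.1, size=x.size)

        theta, cov = curve_fit(model, x, y, p0=[1.0, 1.0])  # iterative NLLS estimate

        def g(t):  # nonlinear restriction, e.g. H0: a * b = 1.2
            return np.array([t[0] * t[1] - 1.2])

        eps = 1e-6  # numerical Jacobian of g at theta (delta method)
        G = np.array([(g(theta + eps * e) - g(theta)) / eps
                      for e in np.eye(len(theta))]).T
        W = float(g(theta) @ np.linalg.inv(G @ cov @ G.T) @ g(theta))
        p_value = chi2.sf(W, df=1)  # asymptotic chi-square reference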

  20. Device-independent tests of quantum channels

    NASA Astrophysics Data System (ADS)

    Dall'Arno, Michele; Brandsen, Sarah; Buscemi, Francesco

    2017-03-01

    We develop a device-independent framework for testing quantum channels. That is, we falsify a hypothesis about a quantum channel based only on an observed set of input-output correlations. Formally, the problem consists of characterizing the set of input-output correlations compatible with any arbitrary given quantum channel. For binary (i.e. two input symbols, two output symbols) correlations, we show that extremal correlations are always achieved by orthogonal encodings and measurements, irrespective of whether or not the channel preserves commutativity. We further provide a full, closed-form characterization of the sets of binary correlations in the case of: (i) any dihedrally covariant qubit channel (such as any Pauli and amplitude-damping channels) and (ii) any universally-covariant commutativity-preserving channel in an arbitrary dimension (such as any erasure, depolarizing, universal cloning and universal transposition channels).

  1. Regression Models For Multivariate Count Data

    PubMed Central

    Zhang, Yiwen; Zhou, Hua; Zhou, Jin; Sun, Wei

    2016-01-01

    Data with multivariate count responses frequently occur in modern applications. The commonly used multinomial-logit model is limiting due to its restrictive mean-variance structure. For instance, analyzing count data from the recent RNA-seq technology by the multinomial-logit model leads to serious errors in hypothesis testing. The ubiquity of over-dispersion and complicated correlation structures among multivariate counts calls for more flexible regression models. In this article, we study some generalized linear models that incorporate various correlation structures among the counts. Current literature lacks a treatment of these models, partly due to the fact that they do not belong to the natural exponential family. We study the estimation, testing, and variable selection for these models in a unifying framework. The regression models are compared on both synthetic and real RNA-seq data. PMID:28348500
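
    As one concrete instance of the over-dispersed alternatives this literature considers, a minimal sketch of the Dirichlet-multinomial log-likelihood (parameter names are illustrative; the model lies outside the natural exponential family, which is what motivates the unified treatment described above):

        import numpy as np
        from scipy.special import gammaln

        def dirmult_loglik(alpha, Y):
            """Log-likelihood of counts Y (n x d) under a Dirichlet-multinomial
            with concentration parameter alpha (length d)."""
            Y = np.asarray(Y, dtype=float)
            n = Y.sum(axis=1)
            a0 = alpha.sum()
            return np.sum(
                gammaln(n + 1) - gammaln(Y + 1).sum(axis=1)
                + gammaln(a0) - gammaln(n + a0)
                + (gammaln(Y + alpha) - gammaln(alpha)).sum(axis=1)
            )

        Y = np.array([[3, 5, 2], [10, 0, 4]])  # hypothetical RNA-seq-like counts
        print(dirmult_loglik(np.array([1.0, 1.0, 1.0]), Y))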

  2. Regression Models For Multivariate Count Data.

    PubMed

    Zhang, Yiwen; Zhou, Hua; Zhou, Jin; Sun, Wei

    2017-01-01

    Data with multivariate count responses frequently occur in modern applications. The commonly used multinomial-logit model is limiting due to its restrictive mean-variance structure. For instance, analyzing count data from the recent RNA-seq technology by the multinomial-logit model leads to serious errors in hypothesis testing. The ubiquity of over-dispersion and complicated correlation structures among multivariate counts calls for more flexible regression models. In this article, we study some generalized linear models that incorporate various correlation structures among the counts. Current literature lacks a treatment of these models, partly due to the fact that they do not belong to the natural exponential family. We study the estimation, testing, and variable selection for these models in a unifying framework. The regression models are compared on both synthetic and real RNA-seq data.

  3. Device-independent tests of quantum channels.

    PubMed

    Dall'Arno, Michele; Brandsen, Sarah; Buscemi, Francesco

    2017-03-01

    We develop a device-independent framework for testing quantum channels. That is, we falsify a hypothesis about a quantum channel based only on an observed set of input-output correlations. Formally, the problem consists of characterizing the set of input-output correlations compatible with any arbitrary given quantum channel. For binary (i.e. two input symbols, two output symbols) correlations, we show that extremal correlations are always achieved by orthogonal encodings and measurements, irrespective of whether or not the channel preserves commutativity. We further provide a full, closed-form characterization of the sets of binary correlations in the case of: (i) any dihedrally covariant qubit channel (such as any Pauli and amplitude-damping channels) and (ii) any universally-covariant commutativity-preserving channel in an arbitrary dimension (such as any erasure, depolarizing, universal cloning and universal transposition channels).

  4. Continuous improvement in the Industrial and Management Systems Engineering programme at Kuwait University

    NASA Astrophysics Data System (ADS)

    Aldowaisan, Tariq; Allahverdi, Ali

    2016-07-01

    This paper describes the process employed by the Industrial and Management Systems Engineering programme at Kuwait University to continuously improve the programme. Using a continuous improvement framework, the paper demonstrates how various qualitative and quantitative analysis methods, such as hypothesis testing and control charts, have been applied to the results of four assessment tools and other data sources to improve performance. Important improvements include reconsidering two student outcomes that proved difficult to implement in courses. In addition, through benchmarking and the engagement of alumni and employers, key decisions were made to improve the curriculum and enhance employability.

  5. Latest Results From the QuakeFinder Statistical Analysis Framework

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; MacLean, L. S.; Schneider, D.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. There have been many reports of anomalous variations in the earth's magnetic field preceding earthquakes; in particular, several authors have drawn attention to apparent anomalous pulsations seen preceding earthquakes. Studies in long-term monitoring of seismic activity are often limited by the availability of event data, and it is particularly difficult to acquire a large dataset for rigorous statistical analyses of the magnetic field near earthquake epicenters because large events are relatively rare. Since QF has recorded hundreds of earthquakes in more than 70 TB of data, we developed an automated approach for assessing the statistical significance of precursory behavior within an algorithmic framework. Previously QF reported on the development of an Algorithmic Framework for data processing and hypothesis testing. The particular instance of the algorithm we discuss identifies and counts magnetic variations from time series data and ranks each station-day according to the aggregate number of pulses in a time window preceding the day in question. If the hypothesis is true that magnetic field activity increases over some time interval preceding earthquakes, this should reveal itself by the station-days on which earthquakes occur receiving higher ranks than they would if the ranking scheme were random. This can be analysed using the Receiver Operating Characteristic (ROC) test. In this presentation we give a status report of our latest results, largely focused on reproducibility of results, robust statistics in the presence of missing data, and exploring optimization landscapes in our parameter space.
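
    A sketch of the ROC evaluation described above, under assumed inputs: one pulse-count-based rank score per station-day and a binary label marking whether an earthquake is attributed to that station-day. If pulse activity really rises before earthquakes, the area under the ROC curve should exceed 0.5:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(42)
        pulse_rank = rng.random(10_000)        # hypothetical rank scores
        quake_day = rng.random(10_000) < 0.01  # hypothetical labels
        # Random scores give AUC near 0.5; a real precursory signal pushes it higher.
        print("AUC:", roc_auc_score(quake_day, pulse_rank))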

  6. Role of Livelihood Capital in Reducing Climatic Vulnerability: Insights of Australian Wheat from 1990-2010.

    PubMed

    Huai, Jianjun

    2016-01-01

    In many agricultural countries, development of rural livelihood through increasing capital is a major regional policy to adapt to climate change. However, the role of livelihood capital in reducing climatic vulnerability is uncertain. This study assesses vulnerability and identifies the effects of common capital indicators on it, using Australian wheat as an example. We calculate exposure (a climate index) and sensitivity (a wheat failure index) to measure vulnerability and classify the resilient and sensitive cases, and express adaptive capacity through financial, human, natural, physical, and social capital indicators for 12 regions in the Australian wheat-sheep production zone from 1991-2010. We identify relationships between 12 indicators of five types of capital and vulnerability with t-tests and six logistic models considering the capital indicator itself, its first-order lag and its square as dependent variables to test the hypothesis that a high level of each capital metric results in low vulnerability. Through differing adaptive capacities between resilient and sensitive groups, we found that only four of the 12 (e.g., the access to finance, cash income level, total crop gross revenues, and family share of farm income) relate to vulnerability, which challenges the hypothesis that increasing capital reduces vulnerability. We conclude that further empirical reexaminations are required to test the relationships between capital measures and vulnerability under the sustainable livelihood framework (SLF).

  7. Role of Livelihood Capital in Reducing Climatic Vulnerability: Insights of Australian Wheat from 1990–2010

    PubMed Central

    Huai, Jianjun

    2016-01-01

    In many agricultural countries, development of rural livelihood through increasing capital is a major regional policy to adapt to climate change. However, the role of livelihood capital in reducing climatic vulnerability is uncertain. This study assesses vulnerability and identifies the effects of common capital indicators on it, using Australian wheat as an example. We calculate exposure (a climate index) and sensitivity (a wheat failure index) to measure vulnerability and classify the resilient and sensitive cases, and express adaptive capacity through financial, human, natural, physical, and social capital indicators for 12 regions in the Australian wheat–sheep production zone from 1991–2010. We identify relationships between 12 indicators of five types of capital and vulnerability with t-tests and six logistic models considering the capital indicator itself, its first-order lag and its square as dependent variables to test the hypothesis that a high level of each capital metric results in low vulnerability. Through differing adaptive capacities between resilient and sensitive groups, we found that only four of the 12 (e.g., the access to finance, cash income level, total crop gross revenues, and family share of farm income) relate to vulnerability, which challenges the hypothesis that increasing capital reduces vulnerability. We conclude that further empirical reexaminations are required to test the relationships between capital measures and vulnerability under the sustainable livelihood framework (SLF). PMID:27022910

  8. Measurement of latent cognitive abilities involved in concept identification learning.

    PubMed

    Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Nock, Matthew K; Naifeh, James A; Heeringa, Steven; Ursano, Robert J; Stein, Murray B

    2015-01-01

    We used cognitive and psychometric modeling techniques to evaluate the construct validity and measurement precision of latent cognitive abilities measured by a test of concept identification learning: the Penn Conditional Exclusion Test (PCET). Item response theory parameters were embedded within classic associative- and hypothesis-based Markov learning models and were fitted to 35,553 Army soldiers' PCET data from the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). Data were consistent with a hypothesis-testing model with multiple latent abilities: abstraction and set shifting. Latent abstraction ability was positively correlated with number of concepts learned, and latent set-shifting ability was negatively correlated with number of perseverative errors, supporting the construct validity of the two parameters. Abstraction was most precisely assessed for participants with abilities ranging from 1.5 standard deviations below the mean to the mean itself. Measurement of set shifting was acceptably precise only for participants making a high number of perseverative errors. The PCET precisely measures latent abstraction ability in the Army STARRS sample, especially within the range of mildly impaired to average ability. This precision pattern is ideal for a test developed to measure cognitive impairment as opposed to cognitive strength. The PCET also measures latent set-shifting ability, but reliable assessment is limited to the impaired range of ability, reflecting that perseverative errors are rare among cognitively healthy adults. Integrating cognitive and psychometric models can provide information about construct validity and measurement precision within a single analytical framework.

  9. Child welfare organizations: Do specialization and service integration impact placement decisions?

    PubMed

    Smith, Carrie; Fluke, John; Fallon, Barbara; Mishna, Faye; Decker Pierce, Barbara

    2018-02-01

    The objective of this study was to contribute to the understanding of the child welfare organization by testing the hypothesis that the characteristics of organizations influence decisions made by child protection staff for vulnerable children. The influence of two aspects of organizational structure on the decision to place a child in out-of-home care were examined: service integration and worker specialization. A theoretical framework that integrated the Decision-Making Ecology Framework (Baumann et al., 2011) and Yoo et al. (2007) conceptual framework of organizational constructs as predictors of service effectiveness was tested. Secondary data analysis of the Ontario Incidence Study of Reported Child Abuse and Neglect - 2013 (OIS-2013) was conducted. A subsample of 4949 investigations from 16 agencies was included in this study. Given the nested structure of the data, multi-level modelling was used to test the relative contribution of case and organizational factors to the decision to place. Despite the reported differences among child welfare organizations and research that has demonstrated variance in the placement decision as a result of organizational factors, the structure of the organization (i.e., worker specialization and service integration) showed no predictive power in the final models. The lack of variance may be explained by the relatively low frequency of placements during the investigation phase of service, the hierarchical impact of the factors of the DME and the limited information available regarding the structure of child welfare organizations in Ontario. Suggestions for future research are provided. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Hypothesis testing in hydrology: Theory and practice

    NASA Astrophysics Data System (ADS)

    Kirchner, James; Pfister, Laurent

    2017-04-01

    Well-posed hypothesis tests have spurred major advances in hydrological theory. However, a random sample of recent research papers suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias - the tendency to value and trust confirmations more than refutations - among both researchers and reviewers. Hypothesis testing is not the only recipe for scientific progress, however: exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.

  11. Microplastic as a Vector for Chemicals in the Aquatic Environment: Critical Review and Model-Supported Reinterpretation of Empirical Studies.

    PubMed

    Koelmans, Albert A; Bakir, Adil; Burton, G Allen; Janssen, Colin R

    2016-04-05

    The hypothesis that 'microplastic will transfer hazardous hydrophobic organic chemicals (HOC) to marine animals' has been central to the perceived hazard and risk of plastic in the marine environment. The hypothesis is often cited and has gained momentum, elevating it to paradigm status. We provide a critical evaluation of the scientific literature regarding this hypothesis. Using new calculations based on published studies, we explain the sometimes contrasting views and unify them in one interpretive framework. One explanation for the contrasting views among studies is that they test different hypotheses. When reframed in the context of the above hypothesis, the available data become consistent. We show that HOC microplastic-water partitioning can be assumed to be at equilibrium for most microplastic residing in the oceans. We calculate the fraction of total HOC sorbed by plastics to be small compared to that sorbed by other media in the ocean. We further demonstrate consistency among (a) measured HOC transfer from microplastic to organisms in the laboratory, (b) measured HOC desorption rates for polymers in artificial gut fluids, (c) simulations by plastic-inclusive bioaccumulation models, and (d) HOC desorption rates for polymers inferred from first principles. We conclude that overall the flux of HOCs bioaccumulated from natural prey overwhelms the flux from ingested microplastic for most habitats, which implies that microplastic ingestion is not likely to increase the exposure to, and thus risks of, HOCs in the marine environment.
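
    A back-of-the-envelope sketch of the "fraction of total HOC sorbed by plastic" comparison, with entirely hypothetical partition coefficients and compartment masses: at equilibrium the HOC mass in each sorbing compartment scales with K_i * m_i (and with the volume for water), so plastic's share is tiny whenever its mass is small relative to natural organic carbon:

        K_plastic = 10 ** 6.0  # L/kg, assumed plastic-water partition coefficient
        K_oc = 10 ** 5.5       # L/kg, assumed organic-carbon-water coefficient
        m_plastic = 1e-9       # kg plastic per litre of seawater (assumed)
        m_oc = 1e-6            # kg organic carbon per litre (assumed)
        V_w = 1.0              # litre of water

        pools = {"plastic": K_plastic * m_plastic,
                 "organic carbon": K_oc * m_oc,
                 "water": V_w}
        total = sum(pools.values())
        for medium, pool in pools.items():
            print(f"{medium:>15}: {pool / total:.2%} of total HOC")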

  12. The use of educational simulations in the teaching of science concepts and their importance from the perspective of pre-service teacher candidates

    NASA Astrophysics Data System (ADS)

    Crespo Ramos, Edwin O.

    This research was aimed at establishing the differences, if any, between traditional direct teaching and constructive teaching through the use of computer simulations, and their effect on pre-service teachers. It was also intended to gain feedback from the users of these simulations as providers of constructive teaching and learning experiences. The experimental framework used a quantitative method with a descriptive focus. The research was guided by two hypotheses and five inquiries. The data were obtained from a group composed of twenty-nine students from a private metropolitan university in Puerto Rico, elementary school pre-service teachers. They were divided into two sub-groups: experimental and control. Two means were used to collect data: tests and surveys. Quantitative data were analyzed with the paired-samples t-test and the non-parametric Wilcoxon test. The results of the pre and post tests do not provide enough evidence to conclude that using the simulations as learning tools was more effective than traditional teaching. However, the quantitative results obtained were not enough to reject or dismiss the hypothesis Ho1. On the other hand, an overall positive attitude towards these simulations was obtained from the surveys. The importance of including hands-on activities in daily lesson planning was proven and well recognized among practice teachers. After participating and working with these simulations, the practice teachers expressed being convinced that they would definitely use them as teaching tools in the classroom. Due to these results, hypothesis Ho2 was rejected. Evidence also proved that practice teachers need further professional development to improve their skills in the application of these simulations in the classroom environment.

  13. Phase Transitions in Living Neural Networks

    NASA Astrophysics Data System (ADS)

    Williams-Garcia, Rashid Vladimir

    Our nervous systems are composed of intricate webs of interconnected neurons interacting in complex ways. These complex interactions result in a wide range of collective behaviors with implications for features of brain function, e.g., information processing. Under certain conditions, such interactions can drive neural network dynamics towards critical phase transitions, where power-law scaling is conjectured to allow optimal behavior. Recent experimental evidence is consistent with this idea and it seems plausible that healthy neural networks would tend towards optimality. This hypothesis, however, is based on two problematic assumptions, which I describe and for which I present alternatives in this thesis. First, critical transitions may vanish due to the influence of an environment, e.g., a sensory stimulus, and so living neural networks may be incapable of achieving "critical" optimality. I develop a framework known as quasicriticality, in which a relative optimality can be achieved depending on the strength of the environmental influence. Second, the power-law scaling supporting this hypothesis is based on statistical analysis of cascades of activity known as neuronal avalanches, which conflate causal and non-causal activity, thus confounding important dynamical information. In this thesis, I present a new method to unveil causal links, known as causal webs, between neuronal activations, thus allowing for experimental tests of the quasicriticality hypothesis and other practical applications.

  14. Estimating breeding proportions and testing hypotheses about costs of reproduction with capture-recapture data

    USGS Publications Warehouse

    Nichols, James D.; Hines, James E.; Pollock, Kenneth H.; Hinz, Robert L.; Link, William A.

    1994-01-01

    The proportion of animals in a population that breeds is an important determinant of population growth rate. Usual estimates of this quantity from field sampling data assume that the probability of appearing in the capture or count statistic is the same for animals that do and do not breed. A similar assumption is required by most existing methods used to test ecologically interesting hypotheses about reproductive costs using field sampling data. However, in many field sampling situations breeding and nonbreeding animals are likely to exhibit different probabilities of being seen or caught. In this paper, we propose the use of multistate capture-recapture models for these estimation and testing problems. This methodology permits a formal test of the hypothesis of equal capture/sighting probabilities for breeding and nonbreeding individuals. Two estimators of breeding proportion (and associated standard errors) are presented, one for the case of equal capture probabilities and one for the case of unequal capture probabilities. The multistate modeling framework also yields formal tests of hypotheses about reproductive costs to future reproduction or survival or both fitness components. The general methodology is illustrated using capture-recapture data on female meadow voles, Microtus pennsylvanicus. Resulting estimates of the proportion of reproductively active females showed strong seasonal variation, as expected, with low breeding proportions in midwinter. We found no evidence of reproductive costs extracted in subsequent survival or reproduction. We believe that this methodological framework has wide application to problems in animal ecology concerning breeding proportions and phenotypic reproductive costs.

  15. Testing the null hypothesis: the forgotten legacy of Karl Popper?

    PubMed

    Wilkinson, Mick

    2013-01-01

    Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well-documented solution provided by Popper's falsification theory, the majority of publications are still written as if the research hypothesis were being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification, such that it is always the null hypothesis that is tested. The write-up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.

  16. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing encompasses a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions that are lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.

  17. Age-related impairment on a forced-choice version of the Mnemonic Similarity Task.

    PubMed

    Huffman, Derek J; Stark, Craig E L

    2017-02-01

    Previous studies from our lab have indicated that healthy older adults are impaired in their ability to mnemonically discriminate between previously viewed objects and similar lure objects in the Mnemonic Similarity Task (MST). These studies have used either old/similar/new or old/new test formats. The forced-choice test format (e.g., "Did you see object A or object A' during the encoding phase?") relies on different assumptions than the old/new test format (e.g., "Did you see this object during the encoding phase?"); hence, converging evidence from these approaches would bolster the conclusion that healthy aging is accompanied by impaired performance on the MST. Consistent with our hypothesis, healthy older adults exhibited impaired performance on a forced-choice test format that required discriminating between a target and a similar lure. We also tested the hypothesis that age-related impairments on the MST could be modeled within a global matching computational framework. We found that decreasing the probability of successful feature encoding in the models caused changes that were similar to the empirical data in healthy older adults. Collectively, our behavioral results using the forced-choice format extend the finding that healthy aging is accompanied by an impaired ability to discriminate between targets and similar lures, and our modeling results suggest that a diminished probability of encoding stimulus features is a candidate mechanism for memory changes in healthy aging. We also discuss the ability of global matching models to account for findings in other studies that have used variants on mnemonic similarity tasks. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.

    ERIC Educational Resources Information Center

    Liu, Tung; Stone, Courtenay C.

    1999-01-01

    Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…

  19. Modeling error in assessment of mammographic image features for improved computer-aided mammography training: initial experience

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Tourassi, Georgia D.

    2011-03-01

    In this study we investigate the hypothesis that there exist patterns in the erroneous assessment of BI-RADS image features among radiology trainees when performing diagnostic interpretation of mammograms. We also investigate whether these error-making patterns can be captured by individual user models. To test our hypothesis we propose a user modeling algorithm that uses the previous readings of a trainee to identify whether certain BI-RADS feature values (e.g. the "spiculated" value for the "margin" feature) are associated with a higher than usual likelihood that the feature will be assessed incorrectly. In our experiments we used readings of 3 radiology residents and 7 breast imaging experts for 33 breast masses for the following BI-RADS features: parenchyma density, mass margin, mass shape and mass density. The expert readings were considered the gold standard. Rule-based individual user models were developed and tested using a leave-one-out cross-validation scheme. Our experimental evaluation showed that the individual user models are accurate in identifying cases for which errors are more likely to be made. The user models captured regularities in error making for all 3 residents. This finding supports our hypothesis about the existence of individual error-making patterns in the assessment of mammographic image features using the BI-RADS lexicon. Explicit user models identifying the weaknesses of each resident could be of great use when developing and adapting a personalized training plan to meet the resident's individual needs. Such an approach fits well within the framework of adaptive computer-aided educational systems in mammography that we have proposed before.
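
    A minimal sketch of the rule-based idea described above, with hypothetical column names: from a trainee's past readings, estimate the disagreement rate with the expert gold standard for each (feature, value) pair, and flag pairs whose rate exceeds the trainee's overall rate:

        import pandas as pd

        readings = pd.DataFrame({  # hypothetical past readings of one trainee
            "feature": ["margin", "margin", "shape", "margin", "shape"],
            "value": ["spiculated", "circumscribed", "round", "spiculated", "oval"],
            "error": [1, 0, 0, 1, 0],  # 1 = disagreed with the expert consensus
        })
        baseline = readings["error"].mean()
        by_pair = readings.groupby(["feature", "value"])["error"].mean()
        risky = by_pair[by_pair > baseline]  # values likely to be assessed incorrectly
        print(risky)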

  20. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguch, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations over design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Further, it concludes with a discussion of some future issues.

  1. Continuum theory of fibrous tissue damage mechanics using bond kinetics: application to cartilage tissue engineering.

    PubMed

    Nims, Robert J; Durney, Krista M; Cigan, Alexander D; Dusséaux, Antoine; Hung, Clark T; Ateshian, Gerard A

    2016-02-06

    This study presents a damage mechanics framework that employs observable state variables to describe damage in isotropic or anisotropic fibrous tissues. In this mixture theory framework, damage is tracked by the mass fraction of bonds that have broken. Anisotropic damage is subsumed in the assumption that multiple bond species may coexist in a material, each having its own damage behaviour. This approach recovers the classical damage mechanics formulation for isotropic materials, but does not appeal to a tensorial damage measure for anisotropic materials. In contrast with the classical approach, the use of observable state variables for damage allows direct comparison of model predictions to experimental damage measures, such as biochemical assays or Raman spectroscopy. Investigations of damage in discrete fibre distributions demonstrate that the resilience to damage increases with the number of fibre bundles; idealizing fibrous tissues using continuous fibre distribution models precludes the modelling of damage. This damage framework was used to test and validate the hypothesis that growth of cartilage constructs can lead to damage of the synthesized collagen matrix due to excessive swelling caused by synthesized glycosaminoglycans. Therefore, alternative strategies must be implemented in tissue engineering studies to prevent collagen damage during the growth process.
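
    For orientation, the classical isotropic damage relation that the abstract says this framework recovers can be written as follows (notation assumed here, not taken from the paper), with the damage variable D identified with the mass fraction of broken bonds and (1 - D) scaling the stress carried by the intact elastic response:

        \[
          D = \frac{m_{\text{broken bonds}}}{m_{\text{total bonds}}},
          \qquad
          \boldsymbol{\sigma} = (1 - D)\,\mathbb{C} : \boldsymbol{\varepsilon}.
        \]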

  2. Continuum theory of fibrous tissue damage mechanics using bond kinetics: application to cartilage tissue engineering

    PubMed Central

    Nims, Robert J.; Durney, Krista M.; Cigan, Alexander D.; Hung, Clark T.; Ateshian, Gerard A.

    2016-01-01

    This study presents a damage mechanics framework that employs observable state variables to describe damage in isotropic or anisotropic fibrous tissues. In this mixture theory framework, damage is tracked by the mass fraction of bonds that have broken. Anisotropic damage is subsumed in the assumption that multiple bond species may coexist in a material, each having its own damage behaviour. This approach recovers the classical damage mechanics formulation for isotropic materials, but does not appeal to a tensorial damage measure for anisotropic materials. In contrast with the classical approach, the use of observable state variables for damage allows direct comparison of model predictions to experimental damage measures, such as biochemical assays or Raman spectroscopy. Investigations of damage in discrete fibre distributions demonstrate that the resilience to damage increases with the number of fibre bundles; idealizing fibrous tissues using continuous fibre distribution models precludes the modelling of damage. This damage framework was used to test and validate the hypothesis that growth of cartilage constructs can lead to damage of the synthesized collagen matrix due to excessive swelling caused by synthesized glycosaminoglycans. Therefore, alternative strategies must be implemented in tissue engineering studies to prevent collagen damage during the growth process. PMID:26855751

  3. Mixture models for detecting differentially expressed genes in microarrays.

    PubMed

    Jones, Liat Ben-Tovim; Bean, Richard; McLachlan, Geoffrey J; Zhu, Justin Xi

    2006-10-01

    An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
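
    A hedged sketch of one common variant of this mixture idea (a beta-uniform mixture for p-values, not necessarily the authors' exact model): null p-values are Uniform(0, 1) with prior weight pi0, alternatives follow Beta(a, 1) with a < 1, and the local FDR at p is pi0 divided by the mixture density:

        import numpy as np
        from scipy.optimize import minimize_scalar

        def fit_bum(p, pi0_grid=np.linspace(0.05, 0.99, 95)):
            """Fit the beta-uniform mixture by maximum likelihood over a pi0 grid,
            profiling out the Beta shape parameter a for each candidate pi0."""
            best = None
            for pi0 in pi0_grid:
                nll = lambda a: -np.sum(np.log(pi0 + (1 - pi0) * a * p ** (a - 1)))
                res = minimize_scalar(nll, bounds=(1e-3, 1.0), method="bounded")
                if best is None or res.fun < best[2]:
                    best = (pi0, res.x, res.fun)
            return best[0], best[1]

        rng = np.random.default_rng(7)
        p = np.concatenate([rng.random(900), rng.beta(0.2, 1.0, 100)])  # synthetic
        pi0, a = fit_bum(p)
        local_fdr = pi0 / (pi0 + (1 - pi0) * a * p ** (a - 1))  # per-test local FDR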

  4. P value and the theory of hypothesis testing: an explanation for new researchers.

    PubMed

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

    In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
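
    A toy sketch contrasting the two theories, using a one-sample z-test with known sigma (all numbers hypothetical): Fisher's p value grades the strength of evidence against the null, while the Neyman-Pearson procedure fixes the Type I error level in advance and returns only a reject / do-not-reject decision:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        x = rng.normal(loc=0.3, scale=1.0, size=50)  # data; H0: mu = 0, sigma = 1 known
        z = x.mean() / (1.0 / np.sqrt(len(x)))       # test statistic under H0

        p_value = 2 * norm.sf(abs(z))                # Fisher: graded evidence against H0
        alpha = 0.05                                 # Neyman-Pearson: fixed Type I error
        reject = abs(z) > norm.ppf(1 - alpha / 2)    # decision via the critical region
        print(f"z = {z:.2f}, p = {p_value:.4f}, reject at alpha = 0.05: {reject}")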

  5. A theoretical framework for the associations between identity and psychopathology.

    PubMed

    Klimstra, Theo A; Denissen, Jaap J A

    2017-11-01

    Identity research largely emerged from clinical observations. Decades of empirical work advanced the field in refining existing approaches and adding new approaches. Furthermore, the existence of linkages of identity with psychopathology is now well established. Unfortunately, both the directionality of effects between identity aspects and psychopathology symptoms, and the mechanisms underlying associations are unclear. In the present paper, we present a new framework to inspire hypothesis-driven empirical research to overcome this limitation. The framework has a basic resemblance to theoretical models for the study of personality and psychopathology, so we provide examples of how these might apply to the study of identity. Next, we explain that unique features of identity may come into play in individuals suffering from psychopathology that are mostly related to the content of one's identity. These include pros and cons of identifying with one's diagnostic label. Finally, inspired by Hermans' dialogical self theory and principles derived from Piaget's, Swann's and Kelly's work, we delineate a framework with identity at the core of an individual multidimensional space. In this space, psychopathology symptoms have a known distance (representing relevance) to one's identity, and individual multidimensional spaces are connected to those of other individuals in one's social network. We discuss methodological (quantitative and qualitative, idiographic and nomothetic) and statistical procedures (multilevel models and network models) to test the framework. Resulting evidence can boost the field of identity research in demonstrating its high practical relevance for the emergence and conservation of psychopathology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Debates—Hypothesis testing in hydrology: Theory and practice

    NASA Astrophysics Data System (ADS)

    Pfister, Laurent; Kirchner, James W.

    2017-03-01

    The basic structure of the scientific method—at least in its idealized form—is widely championed as a recipe for scientific progress, but the day-to-day practice may be different. Here, we explore the spectrum of current practice in hypothesis formulation and testing in hydrology, based on a random sample of recent research papers. This analysis suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias—the tendency to value and trust confirmations more than refutations—among both researchers and reviewers. Nonetheless, as several examples illustrate, hypothesis tests have played an essential role in spurring major advances in hydrological theory. Hypothesis testing is not the only recipe for scientific progress, however. Exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.

  7. Bayesian inference for psychology. Part II: Example applications with JASP.

    PubMed

    Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D

    2018-02-01

    Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP ( http://www.jasp-stats.org ), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
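    For readers without R, the default Bayesian t-test that JASP inherits from the BayesFactor package can be approximated directly. Below is a minimal sketch of the JZS (Jeffreys-Zellner-Siow) Bayes factor of Rouder et al. (2009) for a one-sample t statistic, written as an independent reimplementation for illustration; it is not JASP or BayesFactor source code:

      # Minimal sketch of the JZS Bayes factor for a one-sample t-test
      # (Rouder et al., 2009): Cauchy(0, r) prior on effect size, written as a
      # normal mixture over g with an inverse-gamma(1/2, r^2/2) prior.
      import numpy as np
      from scipy import integrate

      def jzs_bf10(t, n, r=0.707):
          """Bayes factor BF10 for a one-sample t statistic from n observations."""
          v = n - 1  # degrees of freedom
          def integrand(g):  # marginal likelihood under H1, up to a shared constant
              return ((1 + n * g) ** -0.5
                      * (1 + t ** 2 / ((1 + n * g) * v)) ** (-(v + 1) / 2)
                      * r * (2 * np.pi) ** -0.5 * g ** -1.5
                      * np.exp(-r ** 2 / (2 * g)))
          m1, _ = integrate.quad(integrand, 0, np.inf)
          m0 = (1 + t ** 2 / v) ** (-(v + 1) / 2)  # likelihood under H0
          return m1 / m0

      print(jzs_bf10(t=2.5, n=30))  # evidence for H1 relative to H0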

  8. Teaching Hypothesis Testing by Debunking a Demonstration of Telepathy.

    ERIC Educational Resources Information Center

    Bates, John A.

    1991-01-01

    Discusses a lesson designed to demonstrate hypothesis testing to introductory college psychology students. Explains that a psychology instructor demonstrated apparent psychic abilities to students. Reports that students attempted to explain the instructor's demonstrations through hypothesis testing and revision. Provides instructions on performing…

  9. Shape analysis of moss (Bryophyta) sporophytes: Insights into land plant evolution.

    PubMed

    Rose, Jeffrey P; Kriebel, Ricardo; Sytsma, Kenneth J

    2016-04-01

    The alternation of generations life cycle represents a key feature of land-plant evolution and has resulted in a diverse array of sporophyte forms and modifications in all groups of land plants. We test the hypothesis that evolution of sporangium (capsule) shape of the mosses, the second most diverse land-plant lineage, has been driven by differing physiological demands of life in diverse habitats. This study provides an important conceptual framework for analyzing the evolution of a single, homologous character on a continuous scale across a deep expanse of time and across all branches of the tree of life. We reconstruct ancestral sporangium shape and ancestral habitat on the largest phylogeny of mosses to date, and use phylogenetic generalized least squares regression to test the association between habitat and sporangium shape. In addition, we examine the association between shifts in sporangium shape and species diversification. We demonstrate that sporangium shape is convergent, under natural selection, and associated with habitat type, and that many shifts in speciation rate are associated with shifts in sporangium shape. Our results suggest that natural selection in different microhabitats has produced the diversity of sporangium shape found in mosses, and that many shifts in speciation rate are accompanied by changes in sporangium shape across their 480-million-year history. Our framework provides a way to examine whether diversification shifts in other land plants are also associated with massive changes in sporophyte form, among other morphological traits. © 2016 Botanical Society of America.
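    The phylogenetic generalized least squares regression used here is typically run with R packages such as ape or caper. As an illustrative sketch of the underlying computation only (the four species, their scores, and the covariance matrix below are invented), the same regression can be expressed with statsmodels' GLS by supplying an assumed phylogenetic covariance matrix:

      # Illustrative PGLS sketch: regress a sporangium shape score on a habitat
      # score, with residual covariance V taken from an assumed phylogeny
      # (shared branch lengths under Brownian motion). Invented data.
      import numpy as np
      import statsmodels.api as sm

      shape   = np.array([1.2, 0.8, 1.9, 2.1])   # shape score per species
      habitat = np.array([0.0, 0.0, 1.0, 1.0])   # e.g. sheltered (0) vs exposed (1)
      V = np.array([[1.0, 0.5, 0.0, 0.0],
                    [0.5, 1.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 0.7],
                    [0.0, 0.0, 0.7, 1.0]])       # phylogenetic covariance

      fit = sm.GLS(shape, sm.add_constant(habitat), sigma=V).fit()
      print(fit.params, fit.pvalues)             # GLS whitens by V, as PGLS does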

  10. Trends in hypothesis testing and related variables in nursing research: a retrospective exploratory study.

    PubMed

    Lash, Ayhan Aytekin; Plonczynski, Donna J; Sehdev, Amikar

    2011-01-01

    To compare the inclusion and the influences of selected variables on hypothesis testing during the 1980s and 1990s. In spite of the emphasis on conducting inquiry consistent with the tenets of logical positivism, there have been no studies investigating the frequency and patterns of hypothesis testing in nursing research. The sample was obtained from Nursing Research, the research journal with the highest circulation during the study period. All quantitative studies published during the two decades, including briefs and historical studies, were included in the analyses. A retrospective design was used to select the sample: five years each from the 1980s and 1990s were randomly selected from the journal. Of the 582 studies, 517 met inclusion criteria. Findings suggest that there was a decline in the use of hypothesis testing in the last decades of the 20th century. Further research is needed to identify the factors that influence the conduct of research with hypothesis testing. Hypothesis testing in nursing research showed a steady decline from the 1980s to the 1990s. Research purposes of explanation and prediction/control increased the likelihood of hypothesis testing. Hypothesis testing strengthens the quality of quantitative studies, increases the generality of findings, and provides dependable knowledge. This is particularly true for quantitative studies that aim to explore, explain, and predict/control phenomena and/or test theories. The findings also have implications for doctoral programmes, the research preparation of nurse-investigators, and theory testing.

  11. Economy and job contract as contexts of sickness absence practices: revisiting locality and habitus.

    PubMed

    Virtanen, P; Vahtera, J; Nakari, R; Pentti, J; Kivimäki, M

    2004-04-01

    This study revisits two Finnish local governments, Raisio and Nokia, that in an earlier study showed different sickness absence rates in the early 1990s. The locality difference was interpreted sociologically, within a framework inspired by Bourdieu's theory of social field, habitus and practice. The same framework is applied in the present study, starting out from the hypothesis that a constant historical and cultural locality context tends to reproduce prevailing sickness absence practices. The hypothesis was tested by extending the context beyond the locality to the macroeconomic fluctuations that occurred during the 1990s and to the type of employment contract. In both localities a 30% rise was observed in levels of sickness absence from 1991-1993 to 1997-2000. At the beginning of the 1990s the absence rate among permanent employees was 1.86 times higher in Nokia than in Raisio; at the end of the decade the corresponding rate ratio was 1.88. The absence rates were significantly lower among fixed-term employees than permanent employees, but the locality difference was seen in their case, too. Both results support the hypothesis. In spite of major changes taking place in the national economy, the differences between the two towns' sickness absence rates persisted, which in this particular case probably reflects the persisting working-class character of Nokia and middle-class character of Raisio. The theory also applies to the difference between permanent and fixed-term employees: the peripheral power position of the latter in work-related social fields leads to the observed practices, i.e. to the relatively low absence rate. The results of our revisit give reason to recapitulate and elaborate upon our theoretical interpretation with a view to deepening our understanding of the social origins of sickness absence practices in the post-industrial workplace, which is characterised by increasing atypical employment and growing job insecurity.

  12. A shift from significance test to hypothesis test through power analysis in medical research.

    PubMed

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis-test approach, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, the two approaches address the same objective and reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis-test procedure.
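    The power calculation behind this shift is now a one-line computation in most statistical software. As a hedged illustration (the effect size, alpha, and power below are arbitrary textbook values, not taken from the paper), a required-sample-size calculation in Python:

      # Sketch: sample size per arm for a two-sample t-test, with standardized
      # effect size 0.5, alpha = 0.05 (type I error), power = 0.8 (1 - type II).
      from statsmodels.stats.power import TTestIndPower

      n_per_arm = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
      print(round(n_per_arm))  # about 64 participants per arm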

  13. Three dimensions of the amyloid hypothesis: time, space and 'wingmen'.

    PubMed

    Musiek, Erik S; Holtzman, David M

    2015-06-01

    The amyloid hypothesis, which has been the predominant framework for research in Alzheimer's disease (AD), has been the source of considerable controversy. The amyloid hypothesis postulates that amyloid-β peptide (Aβ) is the causative agent in AD. It is strongly supported by data from rare autosomal dominant forms of AD. However, the evidence that Aβ causes or contributes to age-associated sporadic AD is more complex and less clear, prompting criticism of the hypothesis. We provide an overview of the major arguments for and against the amyloid hypothesis. We conclude that Aβ likely is the key initiator of a complex pathogenic cascade that causes AD. However, we argue that Aβ acts primarily as a trigger of other downstream processes, particularly tau aggregation, which mediate neurodegeneration. Aβ appears to be necessary, but not sufficient, to cause AD. Its major pathogenic effects may occur very early in the disease process.

  14. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error so that model error is included as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1; Selby et al.), whereas the improved standard error 'fails to reject' H0.
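    Fisher's and Tippett's tests, named above, are classical rules for pooling p-values from independent tests into one combined test. A minimal sketch (the p-values are invented placeholders, not values from the report):

      # Sketch: combining single-phenomenology screening p-values into one
      # multi-phenomenology test. Fisher: -2 * sum(log p) ~ chi-square(2k);
      # Tippett: the minimum p-value, referred to its Beta(1, k) null.
      from scipy.stats import combine_pvalues

      p_values = [0.04, 0.20, 0.11]  # placeholders for per-discriminant tests
      for method in ("fisher", "tippett"):
          stat, p_combined = combine_pvalues(p_values, method=method)
          print(method, stat, p_combined)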

  15. Exploring Evolving Media Discourse Through Event Cueing.

    PubMed

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests to see if the level of framing is different before or after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.
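    In its simplest form, the intervention model described above asks whether the level of a framing time series differs before and after a candidate event date. A toy sketch of such a level-shift test (the series and effect sizes are simulated; the authors' intervention models are more elaborate than plain OLS):

      # Toy intervention-analysis sketch: does the mean level of a framing score
      # shift at a candidate event date?
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      y = np.concatenate([rng.normal(0.3, 0.1, 50),   # framing level before event
                          rng.normal(0.5, 0.1, 50)])  # level after event
      step = np.repeat([0, 1], 50)                    # 0 = before, 1 = after

      fit = sm.OLS(y, sm.add_constant(step)).fit()
      print(fit.params[1], fit.pvalues[1])            # estimated shift, p-value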

  16. Toward evidence-based medical statistics: a Bayesian analysis of double-blind placebo-controlled antidepressant trials in the treatment of anxiety disorders.

    PubMed

    Monden, Rei; de Vos, Stijn; Morey, Richard; Wagenmakers, Eric-Jan; de Jonge, Peter; Roest, Annelieke M

    2016-12-01

    The Food and Drug Administration (FDA) uses a p < 0.05 null-hypothesis significance testing framework to evaluate "substantial evidence" for drug efficacy. This framework only allows dichotomous conclusions and does not quantify the strength of evidence supporting efficacy. The efficacy of FDA-approved antidepressants for the treatment of anxiety disorders was re-evaluated in a Bayesian framework that quantifies the strength of the evidence. Data from 58 double-blind placebo-controlled trials were retrieved from the FDA for the second-generation antidepressants for the treatment of anxiety disorders. Bayes factors (BFs) were calculated for all treatment arms compared to placebo and were compared with the corresponding p-values and the FDA conclusion categories. BFs ranged from 0.07 to 131,400, indicating a range of no support of evidence to strong evidence for the efficacy. Results also indicate a varying strength of evidence between the trials with p < 0.05. In sum, there were large differences in BFs across trials. Among trials providing "substantial evidence" according to the FDA, only 27 out of 59 dose groups obtained strong support for efficacy according to the typically used cutoff of BF ≥ 20. The Bayesian framework can provide valuable information on the strength of the evidence for drug efficacy. Copyright © 2016 John Wiley & Sons, Ltd.

  17. The Measurand Framework: Scaling Exploratory Data Analysis

    NASA Astrophysics Data System (ADS)

    Schneider, D.; MacLean, L. S.; Kappler, K. N.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's time varying magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. In order to analyze this sizable dataset, QF has developed an analytical framework to support processing the time series input data and hypothesis testing to evaluate the statistical significance of potential precursory signals. The framework was developed with a need to support legacy, in-house processing but with an eye towards big-data processing with Apache Spark and other modern big data technologies. In this presentation, we describe our framework, which supports rapid experimentation and iteration of candidate signal processing techniques via modular data transformation stages, tracking of provenance, and automatic re-computation of downstream data when upstream data is updated. Furthermore, we discuss how the processing modules can be ported to big data platforms like Apache Spark and demonstrate a migration path from local, in-house processing to cloud-friendly processing.
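    As a loose sketch of the modular design described above (the class and method names are invented for illustration; this is not QuakeFinder's framework code), each transformation stage can cache its output and recompute automatically when its upstream stage changes:

      # Illustrative sketch: pipeline stages that cache output and recompute
      # downstream results when upstream data are updated (provenance by version).
      class Stage:
          def __init__(self, name, func, upstream=None):
              self.name, self.func, self.upstream = name, func, upstream
              self._cache = None   # (upstream version, output)
              self.version = 0     # bumped whenever this stage's output changes

          def run(self):
              inputs = self.upstream.run() if self.upstream else None
              up_version = self.upstream.version if self.upstream else 0
              if self._cache is None or self._cache[0] != up_version:
                  self._cache = (up_version, self.func(inputs))
                  self.version += 1
              return self._cache[1]

      raw     = Stage("raw", lambda _: [1.0, 2.0, 3.0])
      detrend = Stage("detrend", lambda x: [v - sum(x) / len(x) for v in x], raw)
      print(detrend.run())  # [-1.0, 0.0, 1.0]; rerunning reuses the cache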

  18. Aesthetic and Emotional Effects of Meter and Rhyme in Poetry

    PubMed Central

    Obermeier, Christian; Menninghaus, Winfried; von Koppenfels, Martin; Raettig, Tim; Schmidt-Kassow, Maren; Otterbein, Sascha; Kotz, Sonja A.

    2013-01-01

    Metrical patterning and rhyme are frequently employed in poetry but also in infant-directed speech, play, rites, and festive events. Drawing on four-line stanzas from nineteenth- and twentieth-century German poetry that feature end rhyme and regular meter, the present study tested the hypothesis that meter and rhyme have an impact on aesthetic liking, emotional involvement, and affective valence attributions. Hypotheses that postulate such effects have been advocated ever since ancient rhetoric and poetics, yet they have barely been empirically tested. More recently, in the field of cognitive poetics, these traditional assumptions have been readopted into a general cognitive framework. In the present experiment, we tested the influence of meter and rhyme as well as their interaction with lexicality on the aesthetic and emotional perception of poetry. Participants listened to stanzas that were systematically modified with regard to meter and rhyme and rated them. Both rhyme and regular meter led to enhanced aesthetic appreciation, higher intensity in processing, and more positively perceived and felt emotions, with the latter finding being mediated by lexicality. Together these findings clearly show that both features significantly contribute to the aesthetic and emotional perception of poetry and thus confirm assumptions about their impact put forward by cognitive poetics. The present results are explained within the theoretical framework of cognitive fluency, which links structural features of poetry with aesthetic and emotional appraisal. PMID:23386837

  19. The effects of burnout and supervisory social support on the relationship between work-family conflict and intention to leave: a study of Australian cancer workers.

    PubMed

    Thanacoody, P Rani; Bartram, Timothy; Casimir, Gian

    2009-01-01

    The aim of this paper is to examine the effects of burnout and supervisory social support on the relationship between work-family conflict and intention to leave among cancer workers in an Australian health care setting. Data collected from 114 cancer workers at a public hospital were used to test a model of the consequences of work-family conflict. Whether the strength of the indirect effect of work-family conflict on intention to leave via burnout depends on supervisor support was tested by conducting a moderated mediation analysis. Path analytic tests of moderated mediation supported the hypothesis that burnout mediates the relationship between work-family conflict (i.e., work-in-family and family-in-work conflict) and intention to leave the organisation, and that the mediation framework is stronger in the presence of higher social supervisory support. Implications are drawn for theory, research and practice. This study applies the innovative statistical technique of moderated mediation analysis to demonstrate that burnout mediates the relationship between work-family conflict and intention to leave the organisation and that the mediation framework is stronger in the presence of lower social supervisory support. In the context of the continued shortage of many clinician groups, these results shed further light on the appropriate course of action for hospital management.
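    Moderated mediation analyses of this kind are typically run with path-analysis software (e.g., Hayes' PROCESS macro). The sketch below conveys only the core idea, bootstrapping the indirect effect (a-path slope times b-path slope) at low and high moderator levels; the data are simulated and the b-path is deliberately simplified (not adjusted for the predictor), so this is not the authors' analysis:

      # Rough sketch: indirect effect of conflict on intention-to-leave via
      # burnout (a*b), bootstrapped separately at low vs. high supervisor support.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 300
      conflict = rng.normal(size=n)
      support = rng.binomial(1, 0.5, size=n)                  # 0 = low, 1 = high
      burnout = (0.6 - 0.4 * support) * conflict + rng.normal(size=n)
      leave = 0.5 * burnout + 0.1 * conflict + rng.normal(size=n)

      def indirect(idx):
          a = np.polyfit(conflict[idx], burnout[idx], 1)[0]   # a-path slope
          b = np.polyfit(burnout[idx], leave[idx], 1)[0]      # b-path (simplified)
          return a * b

      for level in (0, 1):
          idx = np.where(support == level)[0]
          boots = [indirect(rng.choice(idx, idx.size)) for _ in range(2000)]
          print(f"support={level}: 95% CI", np.percentile(boots, [2.5, 97.5]))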

  20. Thinking in Pictures as a cognitive account of autism.

    PubMed

    Kunda, Maithilee; Goel, Ashok K

    2011-09-01

    We analyze the hypothesis that some individuals on the autism spectrum may use visual mental representations and processes to perform certain tasks that typically developing individuals perform verbally. We present a framework for interpreting empirical evidence related to this "Thinking in Pictures" hypothesis and then provide comprehensive reviews of data from several different cognitive tasks, including the n-back task, serial recall, dual task studies, Raven's Progressive Matrices, semantic processing, false belief tasks, visual search, spatial recall, and visual recall. We also discuss the relationships between the Thinking in Pictures hypothesis and other cognitive theories of autism including Mindblindness, Executive Dysfunction, Weak Central Coherence, and Enhanced Perceptual Functioning.

  1. AhR-mediated gene expression in the developing mouse telencephalon.

    PubMed

    Gohlke, Julia M; Stockton, Pat S; Sieber, Stella; Foley, Julie; Portier, Christopher J

    2009-11-01

    We hypothesize that TCDD-induced developmental neurotoxicity is modulated through an AhR-dependent interaction with key regulatory neuronal differentiation pathways during telencephalon development. To test this hypothesis we examined global gene expression in both dorsal and ventral telencephalon tissues in E13.5 AhR-/- and wildtype mice exposed to TCDD or vehicle. Consistent with previous biochemical, pathological and behavioral studies, our results suggest TCDD-initiated changes in gene expression in the developing telencephalon are primarily AhR-dependent, as no statistically significant gene expression changes are evident after TCDD exposure in AhR-/- mice. Based on a gene regulatory network for neuronal specification in the developing telencephalon, the present analysis suggests differentiation of GABAergic neurons in the ventral telencephalon is compromised in TCDD-exposed and AhR-/- mice. In addition, our analysis suggests Sox11 may be directly regulated by AhR, based on gene expression and comparative genomics analyses. In conclusion, this analysis supports the hypothesis that AhR has a specific role in the normal development of the telencephalon and provides a mechanistic framework for the neurodevelopmental toxicity of chemicals that perturb AhR signaling.

  2. A layered abduction model of perception: Integrating bottom-up and top-down processing in a multi-sense agent

    NASA Technical Reports Server (NTRS)

    Josephson, John R.

    1989-01-01

    A layered-abduction model of perception is presented which unifies bottom-up and top-down processing in a single logical and information-processing framework. The process of interpreting the input from each sense is broken down into discrete layers of interpretation, where at each layer a best explanation hypothesis is formed of the data presented by the layer or layers below, with the help of information available laterally and from above. The formation of this hypothesis is treated as a problem of abductive inference, similar to diagnosis and theory formation. Thus this model brings a knowledge-based problem-solving approach to the analysis of perception, treating perception as a kind of compiled cognition. The bottom-up passing of information from layer to layer defines channels of information flow, which separate and converge in a specific way for any specific sense modality. Multi-modal perception occurs where channels converge from more than one sense. This model has not yet been implemented, though it is based on systems which have been successful in medical and mechanical diagnosis and medical test interpretation.

  3. Do we represent intentional action as recursively embedded? The answer must be empirical. A comment on Vicari and Adenzato (2014).

    PubMed

    Martins, Mauricio D; Fitch, W Tecumseh

    2015-12-15

    The relationship between linguistic syntax and action planning is of considerable interest in cognitive science because many researchers suggest that "motor syntax" shares certain key traits with language. In a recent manuscript in this journal, Vicari and Adenzato (henceforth VA) critiqued Hauser, Chomsky and Fitch's 2002 (henceforth HCF's) hypothesis that recursion is language-specific, and that its usage in other domains is parasitic on language resources. VA's main argument is that HCF's hypothesis is falsified by the fact that recursion typifies the structure of intentional action, and recursion in the domain of action is independent of language. Here, we argue that VA's argument is incomplete, and that their formalism can be contrasted with alternative frameworks that are equally consistent with existing data. Therefore their conclusions are premature without further empirical testing and support. In particular, to accept VA's argument it would be necessary to demonstrate both that humans in fact represent self-embedding in the structure of intentional action, and that language is not used to construct these representations. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. ON THE SUBJECT OF HYPOTHESIS TESTING

    PubMed Central

    Ugoni, Antony

    1993-01-01

    In this paper, the definition of a statistical hypothesis is discussed, along with the considerations that need to be addressed when testing a hypothesis. In particular, the p-value, significance level, and power of a test are reviewed. Finally, the often-quoted confidence interval is given a brief introduction. PMID:17989768
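    These quantities are easy to make concrete. A small illustrative computation (the data are arbitrary numbers, not from the paper) of a one-sample p-value and the often-quoted 95% confidence interval:

      # Illustration: one-sample t-test p-value and 95% confidence interval.
      import numpy as np
      from scipy import stats

      x = np.array([5.1, 4.9, 5.6, 5.2, 4.8, 5.4, 5.3, 5.0])
      t, p = stats.ttest_1samp(x, popmean=5.0)          # test H0: mean = 5.0
      ci = stats.t.interval(0.95, df=len(x) - 1,
                            loc=x.mean(), scale=stats.sem(x))
      print(p, ci)  # p-value and an interval covering the true mean 95% of the time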

  5. Some consequences of using the Horsfall-Barratt scale for hypothesis testing

    USDA-ARS?s Scientific Manuscript database

    Comparing treatment effects by hypothesis testing is a common practice in plant pathology. Nearest percent estimates (NPEs) of disease severity were compared to Horsfall-Barratt (H-B) scale data to explore whether there was an effect of assessment method on hypothesis testing. A simulation model ba...

  6. Hypothesis Testing in Task-Based Interaction

    ERIC Educational Resources Information Center

    Choi, Yujeong; Kilpatrick, Cynthia

    2014-01-01

    Whereas studies show that comprehensible output facilitates L2 learning, hypothesis testing has received little attention in Second Language Acquisition (SLA). Following Shehadeh (2003), we focus on hypothesis testing episodes (HTEs) in which learners initiate repair of their own speech in interaction. In the context of a one-way information gap…

  7. Classroom-Based Strategies to Incorporate Hypothesis Testing in Functional Behavior Assessments

    ERIC Educational Resources Information Center

    Lloyd, Blair P.; Weaver, Emily S.; Staubitz, Johanna L.

    2017-01-01

    When results of descriptive functional behavior assessments are unclear, hypothesis testing can help school teams understand how the classroom environment affects a student's challenging behavior. This article describes two hypothesis testing strategies that can be used in classroom settings: structural analysis and functional analysis. For each…

  8. Hypothesis Testing in the Real World

    ERIC Educational Resources Information Center

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  9. Perceptual context effects of speech and nonspeech sounds: the role of auditory categories.

    PubMed

    Aravamudhan, Radhika; Lotto, Andrew J; Hawks, John W

    2008-09-01

    Williams [(1986). "Role of dynamic information in the perception of coarticulated vowels," Ph.D. thesis, University of Connecticut, Standford, CT] demonstrated that nonspeech contexts had no influence on pitch judgments of nonspeech targets, whereas context effects were obtained when instructed to perceive the sounds as speech. On the other hand, Holt et al. [(2000). "Neighboring spectral content influences vowel identification," J. Acoust. Soc. Am. 108, 710-722] showed that nonspeech contexts were sufficient to elicit context effects in speech targets. The current study was to test a hypothesis that could explain the varying effectiveness of nonspeech contexts: Context effects are obtained only when there are well-established perceptual categories for the target stimuli. Experiment 1 examined context effects in speech and nonspeech signals using four series of stimuli: steady-state vowels that perceptually spanned from /inverted ohm/-/I/ in isolation and in the context of /w/ (with no steady-state portion) and two nonspeech sine-wave series that mimicked the acoustics of the speech series. In agreement with previous work context effects were obtained for speech contexts and targets but not for nonspeech analogs. Experiment 2 tested predictions of the hypothesis by testing for nonspeech context effects after the listeners had been trained to categorize the sounds. Following training, context-dependent categorization was obtained for nonspeech stimuli in the training group. These results are presented within a general perceptual-cognitive framework for speech perception research.

  10. Perceptual context effects of speech and nonspeech sounds: The role of auditory categories

    PubMed Central

    Aravamudhan, Radhika; Lotto, Andrew J.; Hawks, John W.

    2008-01-01

    Williams [(1986). “Role of dynamic information in the perception of coarticulated vowels,” Ph.D. thesis, University of Connecticut, Standford, CT] demonstrated that nonspeech contexts had no influence on pitch judgments of nonspeech targets, whereas context effects were obtained when instructed to perceive the sounds as speech. On the other hand, Holt et al. [(2000). “Neighboring spectral content influences vowel identification,” J. Acoust. Soc. Am. 108, 710–722] showed that nonspeech contexts were sufficient to elicit context effects in speech targets. The current study was to test a hypothesis that could explain the varying effectiveness of nonspeech contexts: Context effects are obtained only when there are well-established perceptual categories for the target stimuli. Experiment 1 examined context effects in speech and nonspeech signals using four series of stimuli: steady-state vowels that perceptually spanned from ∕ʊ∕-∕ɪ∕ in isolation and in the context of ∕w∕ (with no steady-state portion) and two nonspeech sine-wave series that mimicked the acoustics of the speech series. In agreement with previous work context effects were obtained for speech contexts and targets but not for nonspeech analogs. Experiment 2 tested predictions of the hypothesis by testing for nonspeech context effects after the listeners had been trained to categorize the sounds. Following training, context-dependent categorization was obtained for nonspeech stimuli in the training group. These results are presented within a general perceptual-cognitive framework for speech perception research. PMID:19045660

  11. Roles of Abductive Reasoning and Prior Belief in Children's Generation of Hypotheses about Pendulum Motion

    ERIC Educational Resources Information Center

    Kwon, Yong-Ju; Jeong, Jin-Su; Park, Yun-Bok

    2006-01-01

    The purpose of the present study was to test the hypothesis that students' abductive reasoning skills play an important role in the generation of hypotheses on pendulum motion tasks. To test the hypothesis, a hypothesis-generating test on pendulum motion and a prior-belief test about pendulum motion were developed and administered to a sample of…

  12. Towards a Holistic Framework for the Evaluation of Emergency Plans in Indoor Environments

    PubMed Central

    Serrano, Emilio; Poveda, Geovanny; Garijo, Mercedes

    2014-01-01

    One of the most promising fields for ambient intelligence is the implementation of intelligent emergency plans. Because the use of drills and living labs cannot reproduce social behaviors, such as panic attacks, that strongly affect these plans, the use of agent-based social simulation provides an approach to evaluate these plans more thoroughly. (1) The hypothesis presented in this paper is that there has been little interest in describing the key modules that these simulators must include, such as formally represented knowledge and a realistic simulated sensor model, and especially in providing researchers with tools to reuse, extend and interconnect modules from different works. This lack of interest hinders researchers from achieving a holistic framework for evaluating emergency plans and forces them to reconsider and to implement the same components from scratch over and over. In addition to supporting this hypothesis by considering over 150 simulators, this paper: (2) defines the main modules identified and proposes the use of semantic web technologies as a cornerstone for the aforementioned holistic framework; (3) provides a basic methodology to achieve the framework; (4) identifies the main challenges; and (5) presents an open and free software tool to hint at the potential of such a holistic view of emergency plan evaluation in indoor environments. PMID:24662453

  13. The brain, self and society: a social-neuroscience model of predictive processing.

    PubMed

    Kelly, Michael P; Kriznik, Natasha M; Kinmonth, Ann Louise; Fletcher, Paul C

    2018-05-10

    This paper presents a hypothesis about how social interactions shape and influence predictive processing in the brain. The paper integrates concepts from neuroscience and sociology where a gulf presently exists between the ways that each describe the same phenomenon - how the social world is engaged with by thinking humans. We combine the concepts of predictive processing models (also called predictive coding models in the neuroscience literature) with ideal types, typifications and social practice - concepts from the sociological literature. This generates a unified hypothetical framework integrating the social world and hypothesised brain processes. The hypothesis combines aspects of neuroscience and psychology with social theory to show how social behaviors may be "mapped" onto brain processes. It outlines a conceptual framework that connects the two disciplines and that may enable creative dialogue and potential future research.

  14. Does plant apparency matter? Thirty years of data provide limited support but reveal clear patterns of the effects of plant chemistry on herbivores.

    PubMed

    Smilanich, Angela M; Fincher, R Malia; Dyer, Lee A

    2016-05-01

    According to the plant-apparency hypothesis, apparent plants allocate resources to quantitative defenses that negatively affect generalist and specialist herbivores, while unapparent plants invest more in qualitative defenses that negatively affect nonadapted generalists. Although this hypothesis has provided a useful framework for understanding the evolution of plant chemical defense, there are many inconsistencies surrounding associated predictions, and it has been heavily criticized and deemed obsolete. We used a hierarchical Bayesian meta-analysis model to test whether defenses from apparent and unapparent plants differ in their effects on herbivores. We collected a total of 225 effect sizes from 158 published papers in which the effects of plant chemistry on herbivore performance were reported. As predicted by the plant-apparency hypothesis, we found a prevalence of quantitative defenses in woody plants and qualitative defenses in herbaceous plants. However, the detrimental impacts of qualitative defenses were more effective against specialists than generalists, and the effects of chemical defenses did not significantly differ between specialists and generalists for woody or herbaceous plants. A striking pattern that emerged from our data was a pervasiveness of beneficial effects of secondary metabolites on herbivore performance, especially generalists. This pattern provides evidence that herbivores are evolving effective counteradaptations to putative plant defenses. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
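    The hierarchical Bayesian meta-analysis used by the authors pools effect sizes while modeling between-study variation. As a simpler classical stand-in that conveys the same pooling idea, here is a DerSimonian-Laird random-effects sketch (the effect sizes and variances are invented, not the 225 collected ones, and this is not the authors' Bayesian model):

      # DerSimonian-Laird random-effects pooling: a classical analogue of
      # hierarchical Bayesian meta-analysis. Invented effect sizes and variances.
      import numpy as np

      y = np.array([0.30, -0.10, 0.55, 0.20, 0.05])  # per-study effect sizes
      v = np.array([0.04, 0.09, 0.05, 0.02, 0.07])   # per-study sampling variances

      w = 1 / v                                      # fixed-effect weights
      y_fixed = np.sum(w * y) / np.sum(w)
      Q = np.sum(w * (y - y_fixed) ** 2)             # heterogeneity statistic
      c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
      tau2 = max(0.0, (Q - (len(y) - 1)) / c)        # between-study variance

      w_re = 1 / (v + tau2)                          # random-effects weights
      pooled = np.sum(w_re * y) / np.sum(w_re)
      se = np.sqrt(1 / np.sum(w_re))
      print(pooled, (pooled - 1.96 * se, pooled + 1.96 * se))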

  15. Making Knowledge Delivery Failsafe: Adding Step Zero in Hypothesis Testing

    ERIC Educational Resources Information Center

    Pan, Xia; Zhou, Qiang

    2010-01-01

    Knowledge of statistical analysis is increasingly important for professionals in modern business. For example, hypothesis testing is one of the critical topics for quality managers and team workers in Six Sigma training programs. Delivering the knowledge of hypothesis testing effectively can be an important step for the incapable learners or…

  16. SubClonal Hierarchy Inference from Somatic Mutations: Automatic Reconstruction of Cancer Evolutionary Trees from Multi-region Next Generation Sequencing

    PubMed Central

    Niknafs, Noushin; Beleva-Guthrie, Violeta; Naiman, Daniel Q.; Karchin, Rachel

    2015-01-01

    Recent improvements in next-generation sequencing of tumor samples and the ability to identify somatic mutations at low allelic fractions have opened the way for new approaches to model the evolution of individual cancers. The power and utility of these models is increased when tumor samples from multiple sites are sequenced. Temporal ordering of the samples may provide insight into the etiology of both primary and metastatic lesions and rationalizations for tumor recurrence and therapeutic failures. Additional insights may be provided by temporal ordering of evolving subclones—cellular subpopulations with unique mutational profiles. Current methods for subclone hierarchy inference tightly couple the problem of temporal ordering with that of estimating the fraction of cancer cells harboring each mutation. We present a new framework that includes a rigorous statistical hypothesis test and a collection of tools that make it possible to decouple these problems, which we believe will enable substantial progress in the field of subclone hierarchy inference. The methods presented here can be flexibly combined with methods developed by others addressing either of these problems. We provide tools to interpret hypothesis test results, which inform phylogenetic tree construction, and we introduce the first genetic algorithm designed for this purpose. The utility of our framework is systematically demonstrated in simulations. For most tested combinations of tumor purity, sequencing coverage, and tree complexity, good power (≥ 0.8) can be achieved and Type 1 error is well controlled when at least three tumor samples are available from a patient. Using data from three published multi-region tumor sequencing studies of (murine) small cell lung cancer, acute myeloid leukemia, and chronic lymphocytic leukemia, in which the authors reconstructed subclonal phylogenetic trees by manual expert curation, we show how different configurations of our tools can identify either a single tree in agreement with the authors, or a small set of trees, which include the authors’ preferred tree. Our results have implications for improved modeling of tumor evolution and the importance of multi-region tumor sequencing. PMID:26436540

  17. Testing of Hypothesis in Equivalence and Non Inferiority Trials-A Concept.

    PubMed

    Juneja, Atul; Aggarwal, Abha R; Adhikari, Tulsi; Pandey, Arvind

    2016-04-01

    Establishing the appropriate hypothesis is one of the important steps in carrying out statistical tests and analyses, and understanding it is important for interpreting the results of statistical analysis. The current communication attempts to convey the concept of hypothesis testing in non-inferiority and equivalence trials, where the null hypothesis is the reverse of that set up for conventional superiority trials. As in superiority trials, the null hypothesis is set up to be rejected in order to establish the fact the researcher intends to prove. It is important to mention that equivalence or non-inferiority cannot be proved by accepting the null hypothesis of no difference. Hence, establishing the appropriate statistical hypothesis is extremely important for arriving at meaningful conclusions for the set objectives in research.
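    The reversed null described here is commonly operationalized as two one-sided tests (TOST). A minimal sketch with statsmodels (the treatment arms are simulated and the equivalence margin of ±0.5 is assumed, both invented for illustration):

      # TOST sketch for an equivalence trial: H0 is that the mean difference lies
      # OUTSIDE the margin (-0.5, 0.5); rejecting H0 establishes equivalence.
      import numpy as np
      from statsmodels.stats.weightstats import ttost_ind

      rng = np.random.default_rng(7)
      new_drug = rng.normal(10.0, 2.0, 400)   # simulated outcomes, new treatment
      standard = rng.normal(10.1, 2.0, 400)   # simulated outcomes, standard arm

      p, lower, upper = ttost_ind(new_drug, standard, low=-0.5, upp=0.5)
      print(p)  # a small p rejects non-equivalence, i.e. supports equivalence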

  18. Radiatively driven stratosphere-troposphere interactions near the tops of tropical cloud clusters

    NASA Technical Reports Server (NTRS)

    Churchill, Dean D.; Houze, Robert A., Jr.

    1990-01-01

    Results are presented of two numerical simulations of the mechanism involved in the dehydration of air, using the model of Churchill (1988) and Churchill and Houze (1990), which combines water- and ice-physics parameterizations and IR and solar-radiation parameterizations with a convective adjustment scheme in a kinematic, nondynamic framework. One simulation, a cirrus cloud simulation, was to test the Danielsen (1982) hypothesis of a dehydration mechanism for the stratosphere; the other was to simulate the mesoscale updraft in order to test an alternative mechanism for 'freeze-drying' the air. The results show that the physical processes simulated in the mesoscale updraft differ from those in the thin-cirrus simulation. While in the thin-cirrus case eddy fluxes occur in response to IR radiative destabilization, and hence no net transfer occurs between troposphere and stratosphere, the mesoscale updraft case has net upward mass transport into the lower stratosphere.

  19. Predictability of machine learning techniques to forecast the trends of market index prices: Hypothesis testing for the Korean stock markets.

    PubMed

    Pyo, Sujin; Lee, Jaewook; Cha, Mincheol; Jang, Huisu

    2017-01-01

    The prediction of the trends of stock and index prices is one of the important issues for market participants. Investors set trading and fiscal strategies based on these trends, and considerable research in various academic fields has sought to forecast financial markets. This study predicts the trends of the Korea Composite Stock Price Index 200 (KOSPI 200) prices using nonparametric machine learning models: an artificial neural network and support vector machines with polynomial and radial basis function kernels. In addition, this study states controversial issues and tests hypotheses about them. Our results are inconsistent with those of the preceding research, which is generally considered to have high prediction performance. Moreover, Google Trends proved not to be an effective factor in predicting the KOSPI 200 index prices in our frameworks. Furthermore, the ensemble methods did not improve the accuracy of the prediction.
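    As an illustrative sketch of the kind of models compared (scikit-learn stand-ins trained on synthetic features; the inputs, labels, and settings below are invented, not the authors' KOSPI 200 data or tuning):

      # Sketch: classifying next-day index direction (up/down) with SVMs using
      # polynomial and RBF kernels. Synthetic features stand in for market inputs.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(42)
      X = rng.normal(size=(500, 5))           # e.g. lagged returns, volume, etc.
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
      for kernel in ("poly", "rbf"):
          clf = SVC(kernel=kernel).fit(X_tr, y_tr)
          print(kernel, clf.score(X_te, y_te))  # out-of-sample hit rate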

  20. Predictability of machine learning techniques to forecast the trends of market index prices: Hypothesis testing for the Korean stock markets

    PubMed Central

    Pyo, Sujin; Lee, Jaewook; Cha, Mincheol

    2017-01-01

    The prediction of the trends of stock and index prices is one of the important issues for market participants. Investors set trading and fiscal strategies based on these trends, and considerable research in various academic fields has sought to forecast financial markets. This study predicts the trends of the Korea Composite Stock Price Index 200 (KOSPI 200) prices using nonparametric machine learning models: an artificial neural network and support vector machines with polynomial and radial basis function kernels. In addition, this study states controversial issues and tests hypotheses about them. Our results are inconsistent with those of the preceding research, which is generally considered to have high prediction performance. Moreover, Google Trends proved not to be an effective factor in predicting the KOSPI 200 index prices in our frameworks. Furthermore, the ensemble methods did not improve the accuracy of the prediction. PMID:29136004

  1. Dynamic motion planning of 3D human locomotion using gradient-based optimization.

    PubMed

    Kim, Hyung Joo; Wang, Qian; Rahmatalla, Salam; Swan, Colby C; Arora, Jasbir S; Abdel-Malek, Karim; Assouline, Jose G

    2008-06-01

    Since humans can walk with an infinite variety of postures and limb movements, there is no unique solution to the modeling problem to predict human gait motions. Accordingly, we test herein the hypothesis that the redundancy of human walking mechanisms makes solving for human joint profiles and force time histories an indeterminate problem best solved by inverse dynamics and optimization methods. A new optimization-based human-modeling framework is thus described for predicting three-dimensional human gait motions on level and inclined planes. The basic unknowns in the framework are the joint motion time histories of a 25-degree-of-freedom human model and its six global degrees of freedom. The joint motion histories are calculated by minimizing an objective function such as deviation of the trunk from upright posture that relates to the human model's performance. A variety of important constraints are imposed on the optimization problem, including (1) satisfaction of dynamic equilibrium equations by requiring the model's zero moment point (ZMP) to lie within the instantaneous geometrical base of support, (2) foot collision avoidance, (3) limits on ground-foot friction, and (4) vanishing yawing moment. Analytical forms of objective and constraint functions are presented and discussed for the proposed human-modeling framework in which the resulting optimization problems are solved using gradient-based mathematical programming techniques. When the framework is applied to the modeling of bipedal locomotion on level and inclined planes, acyclic human walking motions that are smooth and realistic as opposed to less natural robotic motions are obtained. The aspects of the modeling framework requiring further investigation and refinement, as well as potential applications of the framework in biomechanics, are discussed.
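    A toy version of the constrained, gradient-based formulation can make the structure concrete. In the sketch below, two variables stand in for the model's joint coordinates, and both the objective and the stability constraint are invented simplifications of the trunk-deviation cost and the ZMP condition:

      # Toy sketch of the gait-style problem: minimize deviation from an upright
      # posture subject to a ZMP-like stability inequality constraint.
      import numpy as np
      from scipy.optimize import minimize

      def objective(q):                       # stand-in for trunk deviation cost
          return q[0] ** 2 + (q[1] - 0.1) ** 2

      constraints = [
          # Stand-in for "ZMP inside the base of support": q0 + q1 >= 0.3
          {"type": "ineq", "fun": lambda q: q[0] + q[1] - 0.3},
      ]

      res = minimize(objective, x0=np.array([0.5, 0.5]),
                     method="SLSQP", constraints=constraints)
      print(res.x, res.fun)                   # optimum q = (0.1, 0.2), cost 0.02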

  2. Cognitive processing in bipolar disorder conceptualized using the Interactive Cognitive Subsystems (ICS) model.

    PubMed

    Lomax, C L; Barnard, P J; Lam, D

    2009-05-01

    There are few theoretical proposals that attempt to account for the variation in affective processing across different affective states of bipolar disorder (BD). The Interacting Cognitive Subsystems (ICS) framework has been recently extended to account for manic states. Within the framework, positive mood state is hypothesized to tap into an implicational level of processing, which is proposed to be more extreme in states of mania. Thirty individuals with BD and 30 individuals with no history of affective disorder were tested in euthymic mood state and then in induced positive mood state using the Question-Answer task to examine the mode of processing of schemas. The task was designed to test whether individuals would detect discrepancies within the prevailing schemas of the sentences. Although the present study did not support the hypothesis that the groups differ in their ability to detect discrepancies within schemas, we did find that the BD group was significantly more likely than the control group to answer questions that were consistent with the prevailing schemas, both before and after mood induction. These results may reflect a general cognitive bias, that individuals with BD have a tendency to operate at a more abstract level of representation. This may leave an individual prone to affective disturbance, although further research is required to replicate this finding.

  3. Assessing the homogenization of urban land management with an application to US residential lawn care

    PubMed Central

    Polsky, Colin; Grove, J. Morgan; Knudson, Chris; Groffman, Peter M.; Bettez, Neil; Cavender-Bares, Jeannine; Hall, Sharon J.; Heffernan, James B.; Hobbie, Sarah E.; Larson, Kelli L.; Morse, Jennifer L.; Neill, Christopher; Nelson, Kristen C.; Ogden, Laura A.; O’Neil-Dunne, Jarlath; Pataki, Diane E.; Roy Chowdhury, Rinku; Steele, Meredith K.

    2014-01-01

    Changes in land use, land cover, and land management present some of the greatest potential global environmental challenges of the 21st century. Urbanization, one of the principal drivers of these transformations, is commonly thought to be generating land changes that are increasingly similar. An implication of this multiscale homogenization hypothesis is that the ecosystem structure and function and human behaviors associated with urbanization should be more similar in certain kinds of urbanized locations across biogeophysical gradients than across urbanization gradients in places with similar biogeophysical characteristics. This paper introduces an analytical framework for testing this hypothesis, and applies the framework to the case of residential lawn care. This set of land management behaviors are often assumed—not demonstrated—to exhibit homogeneity. Multivariate analyses are conducted on telephone survey responses from a geographically stratified random sample of homeowners (n = 9,480), equally distributed across six US metropolitan areas. Two behaviors are examined: lawn fertilizing and irrigating. Limited support for strong homogenization is found at two scales (i.e., multi- and single-city; 2 of 36 cases), but significant support is found for homogenization at only one scale (22 cases) or at neither scale (12 cases). These results suggest that US lawn care behaviors are more differentiated in practice than in theory. Thus, even if the biophysical outcomes of urbanization are homogenizing, managing the associated sustainability implications may require a multiscale, differentiated approach because the underlying social practices appear relatively varied. The analytical approach introduced here should also be productive for other facets of urban-ecological homogenization. PMID:24616515

  4. The impact of marijuana use on memory in HIV-infected patients: a comprehensive review of the HIV and marijuana literatures

    PubMed Central

    Skalski, Linda M.; Towe, Sheri L.; Sikkema, Kathleen J.; Meade, Christina S.

    2016-01-01

    Background The most robust neurocognitive effect of marijuana use is memory impairment. Memory deficits are also high among persons living with HIV/AIDS, and marijuana is the most commonly used drug in this population. Yet research examining neurocognitive outcomes resulting from co-occurring marijuana and HIV is limited. Objective The primary objectives of this comprehensive review are to: (1) examine the literature on memory functioning in HIV-infected individuals; (2) examine the literature on memory functioning in marijuana users; (3) synthesize findings and propose a theoretical framework to guide future research. Method PubMed was searched for English publications 2000–2013. Twenty-two studies met inclusion criteria in the HIV literature, and 23 studies in the marijuana literature. Results Among HIV-infected individuals, memory deficits with medium to large effect sizes were observed. Marijuana users also demonstrated memory problems, but results were less consistent due to the diversity of samples. Conclusion A compensatory hypothesis, based on the cognitive aging literature, is proposed to provide a framework to explore the interaction between marijuana and HIV. There is some evidence that individuals infected with HIV recruit additional brain regions during memory tasks to compensate for HIV-related declines in neurocognitive functioning. Marijuana use causes impairment in similar brain systems, and thus it is hypothesized that the added neural strain of marijuana can exhaust neural resources, resulting in pronounced memory impairment. It will be important to test this hypothesis empirically, and future research priorities are discussed. PMID:27138170

  5. Approaches to informed consent for hypothesis-testing and hypothesis-generating clinical genomics research.

    PubMed

    Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G

    2012-10-10

    Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.

  6. An Exercise for Illustrating the Logic of Hypothesis Testing

    ERIC Educational Resources Information Center

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  7. Hypothesis Testing, "p" Values, Confidence Intervals, Measures of Effect Size, and Bayesian Methods in Light of Modern Robust Techniques

    ERIC Educational Resources Information Center

    Wilcox, Rand R.; Serang, Sarfaraz

    2017-01-01

    The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…

  8. Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data

    DTIC Science & Technology

    2014-12-01

    Surrogate data consistent with the null hypothesis of linearity can be used to estimate the distribution of a test statistic that discriminates between the null and alternative hypotheses. [Figure: test for nonlinearity; the histogram is generated from the surrogate data, and the statistic of the original time series is shown as a solid line.]
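    The surrogate-data procedure can be illustrated generically: phase-randomized surrogates preserve a series' power spectrum (its linear structure) while destroying nonlinearity, so they supply the null distribution of a nonlinearity-sensitive statistic. A minimal sketch, with an invented test series and statistic rather than the report's data:

      # Surrogate-data test sketch: phase randomization preserves the spectrum
      # but destroys nonlinear structure, giving the null distribution.
      import numpy as np

      def surrogate(x, rng):
          spec = np.fft.rfft(x)
          phases = rng.uniform(0, 2 * np.pi, spec.size)
          phases[0] = phases[-1] = 0           # keep mean and Nyquist bins real
          return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

      def statistic(x):                        # simple third-order cross-moment,
          return np.mean(x[:-1] ** 2 * x[1:])  # sensitive to nonlinearity

      rng = np.random.default_rng(0)
      x = np.zeros(512)
      for t in range(511):                     # toy nonlinear autoregression
          x[t + 1] = 0.3 * x[t] + 0.4 * x[t] ** 2 + rng.normal(0, 0.1)

      null = [statistic(surrogate(x, rng)) for _ in range(200)]
      p = np.mean(np.abs(null) >= abs(statistic(x)))   # surrogate-based p-value
      print(p)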

  9. Technique Feature Analysis or Involvement Load Hypothesis: Estimating Their Predictive Power in Vocabulary Learning.

    PubMed

    Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan

    2018-02-05

    Vocabulary learning has always been a great concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, the involvement load hypothesis and technique feature analysis have been proposed, which attempt to bring concepts like noticing, motivation, and generation into focus. In the current study, 90 high-proficiency EFL students were assigned to three vocabulary tasks (sentence making, composition, and reading comprehension) in order to examine the power of the involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. The results revealed that the involvement load hypothesis was not a good predictor, while technique feature analysis was a good predictor of pretest-to-posttest score change but not of during-task activity. The implications of the results will be discussed in light of preparing vocabulary tasks.

  10. Integrated Testing Strategy (ITS) - Opportunities to better use existing data and guide future testing in toxicology.

    PubMed

    Jaworska, Joanna; Hoffmann, Sebastian

    2010-01-01

    The topic of Integrated Testing Strategies (ITS) has attracted considerable attention, and not only because it is supposed to be a central element of REACH, the ambitious European chemical regulation effort. Although what ITSs are supposed to do seems unambiguous, i.e. speeding up hazard and risk assessment while reducing testing costs, not much has been said, beyond basic conceptual proposals, about the methodologies that would allow execution of these concepts. Although a pressing concern, the topic of ITS has drawn mostly general reviews, broad concepts, and the expression of a clear need for more research on ITS. Published research in the field remains scarce. Solutions for ITS design emerge slowly, most likely due to the methodological challenges of the task, and perhaps also to its complexity and the need for multidisciplinary collaboration. Along with the challenge, ITS offer a unique opportunity to contribute to the toxicology of the 21st century by providing frameworks and tools to actually implement 21st-century toxicology data in chemical management and decision-making processes. Further, ITS have the potential to significantly contribute to a modernization of the science of risk assessment. Therefore, to advance ITS research we propose a methodical approach to their design and will discuss currently available approaches as well as challenges to overcome. To this end, we define a framework for ITS that will inform toxicological decisions in a systematic, transparent, and consistent way. We review conceptual requirements for ITS developed earlier and present a roadmap to an operational framework that should be probabilistic, hypothesis-driven, and adaptive. Furthermore, we define properties an ITS should have in order to meet the identified requirements and differentiate them from evidence synthesis. Making use of an ITS for skin sensitization, we demonstrate how the proposed ITS concepts can be implemented.

  11. Fifty Years of Physics of Living Systems.

    PubMed

    Latash, Mark L

    2016-01-01

    The equilibrium-point hypothesis and its more recent version, the referent configuration hypothesis, represent the physical approach to the neural control of action. This hypothesis can be naturally combined with the idea of hierarchical control of movements and of synergic organization of the abundant systems involved in all actions. Any action starts with defining trajectories of a few referent coordinates for a handful of salient task-specific variables. Further, referent coordinates at hierarchically lower levels emerge down to thresholds of the tonic stretch reflex for the participating muscles. Stability of performance with respect to salient variables is reflected in the structure of inter-trial variance and phenomena of motor equivalence. Three lines of recent research within this framework are reviewed. First, synergic adjustments of the referent coordinate and apparent stiffness have been demonstrated during finger force production supporting the main idea of control with referent coordinates. Second, the notion of unintentional voluntary movements has been introduced reflecting unintentional drifts in referent coordinates. Two types of unintentional movements have been observed with different characteristic times. Third, this framework has been applied to studies of impaired movements in neurological patients. Overall, the physical approach searching for laws of nature underlying biological movement has been highly stimulating and productive.

  12. Combining statistical inference and decisions in ecology

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.

    2016-01-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
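
    To make the role of loss functions concrete, the following minimal sketch (with a made-up beta-binomial example; the prior, data, and action grid are all assumptions) shows how the Bayes action changes with the loss: squared-error loss leads to the posterior mean, absolute loss to the posterior median.

      # Minimal sketch of Bayesian point estimation as a decision problem.
      # The Beta(1, 1) prior, the binomial data, and the action grid are
      # all assumptions chosen for illustration.
      import numpy as np

      rng = np.random.default_rng(1)
      posterior = rng.beta(1 + 14, 1 + 6, size=20_000)  # 14 successes in 20 trials

      # Posterior expected loss for each candidate action (point estimate).
      actions = np.linspace(0, 1, 501)
      squared = [(np.mean((posterior - a) ** 2), a) for a in actions]
      absolute = [(np.mean(np.abs(posterior - a)), a) for a in actions]

      # The Bayes action minimizes posterior expected loss: the posterior mean
      # under squared-error loss, the posterior median under absolute loss.
      print("squared-error action:", min(squared)[1], "vs posterior mean:", posterior.mean())
      print("absolute-loss action:", min(absolute)[1], "vs posterior median:", np.median(posterior))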

  13. Optimal allocation of leaf epidermal area for gas exchange.

    PubMed

    de Boer, Hugo J; Price, Charles A; Wagner-Cremer, Friederike; Dekker, Stefan C; Franks, Peter J; Veneklaas, Erik J

    2016-06-01

    A long-standing research focus in phytology has been to understand how plants allocate leaf epidermal space to stomata in order to achieve an economic balance between the plant's carbon needs and water use. Here, we present a quantitative theoretical framework to predict allometric relationships between morphological stomatal traits in relation to leaf gas exchange and the required allocation of epidermal area to stomata. Our theoretical framework was derived from first principles of diffusion and geometry based on the hypothesis that selection for higher anatomical maximum stomatal conductance (gsmax ) involves a trade-off to minimize the fraction of the epidermis that is allocated to stomata. Predicted allometric relationships between stomatal traits were tested with a comprehensive compilation of published and unpublished data on 1057 species from all major clades. In support of our theoretical framework, stomatal traits of this phylogenetically diverse sample reflect spatially optimal allometry that minimizes investment in the allocation of epidermal area when plants evolve towards higher gsmax . Our results specifically highlight that the stomatal morphology of angiosperms evolved along spatially optimal allometric relationships. We propose that the resulting wide range of viable stomatal trait combinations equips angiosperms with developmental and evolutionary flexibility in leaf gas exchange unrivalled by gymnosperms and pteridophytes. © 2016 The Authors New Phytologist © 2016 New Phytologist Trust.
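
    As an illustration of how such anatomical maximum conductance calculations typically work, the sketch below uses one common diffusion-based formulation of gsmax (after Franks & Beerling, 2009), which may differ in detail from the framework derived in this paper; the physical constants and trait values are assumptions.

      # Sketch: anatomical maximum stomatal conductance (gsmax) from stomatal
      # density and pore geometry, using one common diffusion-based formulation
      # (after Franks & Beerling, 2009). Constants and trait values are assumptions.
      import math

      d = 2.49e-5   # diffusivity of water vapour in air, m^2 s^-1 (~25 degrees C)
      v = 2.45e-2   # molar volume of air, m^3 mol^-1 (~25 degrees C)

      def gsmax(density, amax, depth):
          """density: stomata m^-2; amax: maximum pore area, m^2; depth: pore depth, m."""
          # End correction models the pore as a tube plus a hemispherical vapour shell.
          end_correction = (math.pi / 2) * math.sqrt(amax / math.pi)
          return (d / v) * density * amax / (depth + end_correction)

      # Example: 200 stomata per mm^2, 50 um^2 maximum pore area, 10 um pore depth.
      print(round(gsmax(200e6, 50e-12, 10e-6), 3), "mol m^-2 s^-1")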

  14. Exploring heterogeneous market hypothesis using realized volatility

    NASA Astrophysics Data System (ADS)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

    This study investigates the heterogeneous market hypothesis using high frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicated the presence of long memory behaviour and predictability elements in the financial time series, which supported the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power-variation realized volatilities for forecast evaluation and risk measurement, in order to accommodate the abrupt jumps observed during the credit crisis. Finally, the empirical results are used to determine market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies, and risk management.
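
    A minimal sketch of the heterogeneous autoregressive (HAR) idea, in the spirit of Corsi's HAR-RV model: next-day realized volatility is regressed on daily, weekly (5-day), and monthly (22-day) averages, mirroring short-, medium-, and long-horizon traders. The series below is simulated, and the paper's power-variation measures are not reproduced here.

      # Minimal HAR-RV sketch in the spirit of Corsi's model: next-day realized
      # volatility regressed on daily, weekly (5-day), and monthly (22-day)
      # averages. The series is simulated; parameters are assumptions.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1000
      log_rv = np.zeros(n)
      for t in range(1, n):                       # persistent log-volatility
          log_rv[t] = 0.95 * log_rv[t - 1] + 0.2 * rng.standard_normal()
      rv = np.exp(log_rv)

      rows, y = [], []
      for t in range(21, n - 1):
          rows.append([1.0, rv[t], rv[t - 4:t + 1].mean(), rv[t - 21:t + 1].mean()])
          y.append(rv[t + 1])
      beta, *_ = np.linalg.lstsq(np.array(rows), np.array(y), rcond=None)
      print("HAR coefficients [const, daily, weekly, monthly]:", beta.round(3))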

  15. The role of responsibility and fear of guilt in hypothesis-testing.

    PubMed

    Mancini, Francesco; Gangemi, Amelia

    2006-12-01

    Recent theories argue that both perceived responsibility and fear of guilt increase obsessive-like behaviours. We propose that hypothesis-testing might account for this effect. Both perceived responsibility and fear of guilt would influence subjects' hypothesis-testing by inducing a prudential style. This style implies focusing on and confirming the worst hypothesis, and reiterating the testing process. In our experiment, we manipulated the responsibility and fear of guilt of 236 normal volunteers who performed a deductive task. The results show that perceived responsibility is the main factor that influenced individuals' hypothesis-testing. Fear of guilt, however, has a significant additive effect. Guilt-fearing participants preferred to carry on with the diagnostic process even when faced with initial favourable evidence, whereas participants in the responsibility condition did so only when confronted with unfavourable evidence. Implications for the understanding of obsessive-compulsive disorder (OCD) are discussed.

  16. Paleodistributions and Comparative Molecular Phylogeography of Leafcutter Ants (Atta spp.) Provide New Insight into the Origins of Amazonian Diversity

    PubMed Central

    Solomon, Scott E.; Bacci, Mauricio; Martins, Joaquim; Vinha, Giovanna Gonçalves; Mueller, Ulrich G.

    2008-01-01

    The evolutionary basis for high species diversity in tropical regions of the world remains unresolved. Much research has focused on the biogeography of speciation in the Amazon Basin, which harbors the greatest diversity of terrestrial life. The leading hypotheses on allopatric diversification of Amazonian taxa are the Pleistocene refugia, marine incursion, and riverine barrier hypotheses. Recent advances in the fields of phylogeography and species-distribution modeling permit a modern re-evaluation of these hypotheses. Our approach combines comparative, molecular phylogeographic analyses using mitochondrial DNA sequence data with paleodistribution modeling of species ranges at the last glacial maximum (LGM) to test these hypotheses for three co-distributed species of leafcutter ants (Atta spp.). The cumulative results of all tests reject every prediction of the riverine barrier hypothesis, but are unable to reject several predictions of the Pleistocene refugia and marine incursion hypotheses. Coalescent dating analyses suggest that population structure formed recently (Pleistocene-Pliocene), but are unable to reject the possibility that Miocene events may be responsible for structuring populations in two of the three species examined. The available data therefore suggest that either marine incursions in the Miocene or climate changes during the Pleistocene—or both—have shaped the population structure of the three species examined. Our results also reconceptualize the traditional Pleistocene refugia hypothesis, and offer a novel framework for future research into the area. PMID:18648512

  17. Stratified exact tests for the weak causal null hypothesis in randomized trials with a binary outcome.

    PubMed

    Chiba, Yasutaka

    2017-09-01

    Fisher's exact test is commonly used to compare two groups when the outcome is binary in randomized trials. In the context of causal inference, this test explores the sharp causal null hypothesis (i.e. the causal effect of treatment is the same for all subjects), but not the weak causal null hypothesis (i.e. the causal risks are the same in the two groups). Therefore, in general, rejection of the null hypothesis by Fisher's exact test does not mean that the causal risk difference is not zero. Recently, Chiba (Journal of Biometrics and Biostatistics 2015; 6: 244) developed a new exact test for the weak causal null hypothesis when the outcome is binary in randomized trials; the new test is not based on any large sample theory and does not require any assumption. In this paper, we extend the new test; we create a version of the test applicable to a stratified analysis. The stratified exact test that we propose is general in nature and can be used in several approaches toward the estimation of treatment effects after adjusting for stratification factors. The stratified Fisher's exact test of Jung (Biometrical Journal 2014; 56: 129-140) tests the sharp causal null hypothesis. This test applies a crude estimator of the treatment effect and can be regarded as a special case of our proposed exact test. Our proposed stratified exact test can be straightforwardly extended to analysis of noninferiority trials and to construct the associated confidence interval. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
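
    For orientation, the snippet below shows the classical per-stratum building block, Fisher's exact test, which addresses the sharp causal null; Chiba's stratified exact test for the weak null is not available in standard libraries, so the tables and analysis here are only illustrative assumptions.

      # Classical per-stratum building block: Fisher's exact test, which
      # addresses the *sharp* causal null. Chiba's exact test for the weak
      # null is not in standard libraries; tables here are made up.
      from scipy.stats import fisher_exact

      strata = {
          "stratum 1": [[12, 8], [5, 15]],   # rows: treated/control; cols: event/no event
          "stratum 2": [[9, 11], [7, 13]],
      }
      for name, table in strata.items():
          odds_ratio, p = fisher_exact(table, alternative="two-sided")
          print(f"{name}: odds ratio = {odds_ratio:.2f}, p = {p:.3f}")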

  18. A statistical test to show negligible trend

    Treesearch

    Philip M. Dixon; Joseph H.K. Pechmann

    2005-01-01

    The usual statistical tests of trend are inappropriate for demonstrating the absence of trend. This is because failure to reject the null hypothesis of no trend does not prove that null hypothesis. The appropriate statistical method is based on an equivalence test. The null hypothesis is that the trend is not zero, i.e., outside an a priori specified equivalence region...
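
    A minimal sketch of the equivalence-test logic for a negligible trend (two one-sided tests, TOST): the roles of the hypotheses are reversed, and negligibility is concluded only if both one-sided tests reject. The simulated data and the margin delta are assumptions; the paper's exact formulation may differ.

      # Sketch of the equivalence (two one-sided tests, TOST) logic for a
      # negligible trend: conclude |slope| < delta only if both one-sided
      # tests reject. Data and the margin delta are assumptions.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      t_idx = np.arange(30.0)
      y = 10 + 0.01 * t_idx + rng.normal(0, 1, t_idx.size)   # near-flat series

      res = stats.linregress(t_idx, y)
      delta = 0.05                    # a priori margin for a "negligible" slope
      df = t_idx.size - 2
      p_lower = stats.t.sf((res.slope + delta) / res.stderr, df)   # H0: slope <= -delta
      p_upper = stats.t.cdf((res.slope - delta) / res.stderr, df)  # H0: slope >= +delta
      p_tost = max(p_lower, p_upper)
      verdict = "negligible" if p_tost < 0.05 else "not shown negligible"
      print(f"slope = {res.slope:.4f}, TOST p = {p_tost:.4f} -> trend {verdict}")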

  19. A Tutorial in Bayesian Potential Outcomes Mediation Analysis.

    PubMed

    Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P

    2018-01-01

    Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.
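
    One way to see the probabilistic interpretation of the indirect effect is the Monte Carlo sketch below: posterior draws for the a-path (X to M) and b-path (M to Y given X) are multiplied to give a posterior for the indirect effect ab. The normal posteriors and all numbers are assumptions, not the tutorial's actual analysis.

      # Monte Carlo sketch of the probabilistic reading of an indirect effect:
      # approximate the posteriors of the a-path (X -> M) and b-path (M -> Y
      # given X) as normals centred on their estimates (a simplifying
      # assumption), then multiply draws. All numbers are hypothetical.
      import numpy as np

      rng = np.random.default_rng(7)
      a_draws = rng.normal(0.40, 0.10, 50_000)   # X -> M estimate and SE (assumed)
      b_draws = rng.normal(0.30, 0.08, 50_000)   # M -> Y | X estimate and SE (assumed)
      indirect = a_draws * b_draws

      lo, hi = np.percentile(indirect, [2.5, 97.5])
      print(f"posterior mean indirect effect: {indirect.mean():.3f}")
      print(f"95% credible interval: ({lo:.3f}, {hi:.3f}); P(ab > 0) = {(indirect > 0).mean():.3f}")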

  20. Hypothesis for cognitive effects of transcranial direct current stimulation: Externally- and internally-directed cognition.

    PubMed

    Greenwood, Pamela M; Blumberg, Eric J; Scheldrup, Melissa R

    2018-03-01

    A comprehensive explanation is lacking for the broad array of cognitive effects modulated by transcranial direct current stimulation (tDCS). We advanced the testable hypothesis that tDCS to the default mode network (DMN) increases processing of goals and stored information at the expense of external events. We further hypothesized that tDCS to the dorsal attention network (DAN) increases processing of external events at the expense of goals and stored information. A literature search (PsychINFO) identified 42 empirical studies and 3 meta-analyses examining effects of prefrontal and/or parietal tDCS on tasks that selectively required external and/or internal processing. Most, though not all, of the studies that met our search criteria supported our hypothesis. Three meta-analyses supported our hypothesis. The hypothesis we advanced provides a framework for the design and interpretation of results in light of the role of large-scale intrinsic networks that govern attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Life on the line: Job demands, perceived co-worker support for safety, and hazardous work events.

    PubMed

    Turner, Nick; Chmiel, Nik; Hershcovis, M Sandy; Walls, Melanie

    2010-10-01

    The present study of 334 United Kingdom trackside workers tested an interaction hypothesis. We hypothesized, drawing on the job demands-resources framework, that perceived support for safety (from senior managers, supervisors, and coworkers) as job resources would weaken the relationship between higher job demands and more frequent hazardous work events. Consistent with social impact theory, we predicted that perceived coworker support for safety would be particularly influential when trackside workers faced higher job demands. Moderated multiple regression showed that, of all three sources of perceived support for safety, perceived coworker support for safety was most important for keeping employees safe in the face of high job demands. © 2010 APA, all rights reserved.
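
    A moderated multiple regression of this kind boils down to testing a product term, as in the sketch below; the variable names and data are hypothetical, not the study's dataset.

      # Sketch of moderated multiple regression: a demands-by-support product
      # term predicting hazardous work events. Variable names and data are
      # hypothetical, not the study's dataset.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(11)
      n = 334
      df = pd.DataFrame({"demands": rng.normal(size=n),
                         "coworker_support": rng.normal(size=n)})
      # Simulate a buffering effect: support weakens the demands-events link.
      df["hazard_events"] = (0.5 * df["demands"]
                             - 0.3 * df["demands"] * df["coworker_support"]
                             + rng.normal(size=n))

      model = smf.ols("hazard_events ~ demands * coworker_support", data=df).fit()
      print(model.params)   # the interaction term carries the moderation effect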

  2. Effects of anxiety on task switching: evidence from the mixed antisaccade task.

    PubMed

    Ansari, Tahereh L; Derakshan, Nazanin; Richards, Anne

    2008-09-01

    According to the attentional control theory of anxiety (Eysenck, Derakshan, Santos, & Calvo, 2007), anxiety impairs performance on cognitive tasks that involve the shifting function of working memory. This hypothesis was tested using a mixed antisaccade paradigm, in which participants performed single-task and mixed-task versions of the paradigm. The single task involved the completion of separate blocks of anti- and prosaccade trials, whereas in the mixed task, participants completed anti- and prosaccade trials in a random order within blocks. Analysis of switch costs showed that high-anxious individuals did not exhibit the commonly reported paradoxical improvement in saccade latency, whereas low-anxious individuals did. The findings are discussed within the framework of attentional control theory.

  3. Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.

    PubMed

    Vetter, Thomas R; Mascha, Edward J

    2018-01-01

    Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including the so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal; rejecting it supports the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing independent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) intergroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction, and post hoc comparisons, as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate. The Wilcoxon-Mann-Whitney test is instead preferred. When making paired comparisons on data that are ordinal, or continuous but nonnormally distributed, the Wilcoxon signed-rank test can be used. In analyzing their data, researchers should consider the continued merits of these simple yet equally valid unadjusted bivariate statistical tests. However, the appropriate use of an unadjusted bivariate test still requires a solid understanding of its utility, assumptions (requirements), and limitations. This understanding will mitigate the risk of misleading findings, interpretations, and conclusions.
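
    As a quick reference, the snippet below maps the designs discussed above to common scipy calls, using toy data; the choice among them still depends on the assumptions (independence, pairing, distribution) described in the text.

      # Quick map from the designs above to common scipy calls, on toy data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      g1, g2 = rng.normal(0, 1, 40), rng.normal(0.5, 1, 40)   # 2 independent groups
      before = rng.normal(0, 1, 30)
      after = before + rng.normal(0.3, 0.5, 30)                # paired observations

      print(stats.ttest_ind(g1, g2))           # unpaired t test
      print(stats.ttest_rel(before, after))    # paired t test
      print(stats.mannwhitneyu(g1, g2))        # 2 groups, ordinal/nonnormal outcome
      print(stats.wilcoxon(before, after))     # paired, ordinal/nonnormal outcome
      table = np.array([[20, 10], [12, 18]])   # 2 unpaired categorical variables
      print(stats.chi2_contingency(table)[:2]) # Pearson chi-square statistic, p-value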

  4. Longitudinal Dimensionality of Adolescent Psychopathology: Testing the Differentiation Hypothesis

    ERIC Educational Resources Information Center

    Sterba, Sonya K.; Copeland, William; Egger, Helen L.; Costello, E. Jane; Erkanli, Alaattin; Angold, Adrian

    2010-01-01

    Background: The differentiation hypothesis posits that the underlying liability distribution for psychopathology is of low dimensionality in young children, inflating diagnostic comorbidity rates, but increases in dimensionality with age as latent syndromes become less correlated. This hypothesis has not been adequately tested with longitudinal…

  5. A large scale test of the gaming-enhancement hypothesis.

    PubMed

    Przybylski, Andrew K; Wang, John C

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
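
    One simple way to obtain an approximate Bayes factor of the sort used to weigh a null against an alternative is the BIC approximation BF01 = exp((BIC1 - BIC0)/2) (Wagenmakers, 2007); the sketch below applies it to data simulated under the null. This illustrates the general approach, not the authors' specific analysis.

      # A rough default Bayes factor via the BIC approximation
      # BF01 ~ exp((BIC1 - BIC0) / 2) (Wagenmakers, 2007). Illustration only,
      # not the authors' analysis; data are simulated under the null.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 1847
      gaming = rng.normal(size=n)
      reasoning = rng.normal(size=n)     # unrelated to gaming by construction

      m0 = sm.OLS(reasoning, np.ones(n)).fit()               # null: intercept only
      m1 = sm.OLS(reasoning, sm.add_constant(gaming)).fit()  # alternative: + gaming
      bf01 = np.exp((m1.bic - m0.bic) / 2)
      print(f"BF01 ~ {bf01:.1f} in favour of the null")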

  6. Analyzing learning during Peer Instruction dialogues: A resource activation framework

    NASA Astrophysics Data System (ADS)

    Wood, Anna K.; Galloway, Ross K.; Hardy, Judy; Sinclair, Christine M.

    2014-12-01

    Peer Instruction (PI) is an evidence-based pedagogy commonly used in undergraduate physics instruction. When asked questions designed to test conceptual understanding, it has been observed that the proportion of students choosing the correct answer increases following peer discussion; however, relatively little is known about what takes place during these discussions or how they are beneficial to the processes of learning physics [M. C. James and S. Willoughby, Am. J. Phys. 79, 123 (2011)]. In this paper a framework for analyzing PI discussions developed through the lens of the "resources model" [D. Hammer, Am. J. Phys. 64, 1316 (1996); D. Hammer et al., Information Age Publishing (2005)] is proposed. A central hypothesis for this framework is that the dialogue with peers plays a crucial role in activating appropriate cognitive resources, enabling the students to see the problem differently, and therefore to answer the questions correctly. This framework is used to gain greater insights into the PI discussions of first year undergraduate physics students at the University of Edinburgh, UK, which were recorded using Livescribe Smartpens. Analysis of the dialogues revealed three different types of resource activation corresponding to increasing cognitive grain size. These were activation of knowledge elements, activation of linkages between knowledge elements, and activation of control structures (epistemic games and epistemological frames). Three case studies are examined to illustrate the role that peer dialogue plays in the activation of these cognitive resources in a PI session. The implications for pedagogical practice are discussed.

  7. Bayesian networks for evaluation of evidence from forensic entomology.

    PubMed

    Andersson, M Gunnar; Sundström, Anders; Lindström, Anders

    2013-09-01

    In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistic approach to evaluating forensic results from a CBRN incident is discussed.
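
    The core computation in this framework is the value of evidence, the likelihood ratio V = P(E | Hp) / P(E | Hd), which updates the prior odds on the prosecution hypothesis; the probabilities below are hypothetical stand-ins for Bayesian-network outputs.

      # Core of the Bayesian evaluation of evidence: the value of evidence
      # V = P(E | Hp) / P(E | Hd) multiplies the prior odds on the prosecution
      # hypothesis. The probabilities are hypothetical stand-ins for outputs
      # of a Bayesian network.
      p_e_given_hp = 0.30    # probability of the entomological findings under Hp
      p_e_given_hd = 0.02    # probability of the findings under Hd

      value_of_evidence = p_e_given_hp / p_e_given_hd
      prior_odds = 1 / 9     # assumed prior odds on Hp
      posterior_odds = prior_odds * value_of_evidence
      posterior_p = posterior_odds / (1 + posterior_odds)
      print(f"V = {value_of_evidence:.1f}, posterior P(Hp | E) = {posterior_p:.2f}")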

  8. Null but not void: considerations for hypothesis testing.

    PubMed

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  9. Effect of climate-related mass extinctions on escalation in molluscs

    NASA Astrophysics Data System (ADS)

    Hansen, Thor A.; Kelley, Patricia H.; Melland, Vicky D.; Graham, Scott E.

    1999-12-01

    We test the hypothesis that escalated species (e.g., those with antipredatory adaptations such as heavy armor) are more vulnerable to extinctions caused by changes in climate. If this hypothesis is valid, recovery faunas after climate-related extinctions should include significantly fewer species with escalated shell characteristics, and escalated species should undergo greater rates of extinction than nonescalated species. This hypothesis is tested for the Cretaceous-Paleocene, Eocene-Oligocene, middle Miocene, and Pliocene-Pleistocene mass extinctions. Gastropod and bivalve molluscs from the U.S. coastal plain were evaluated for 10 shell characters that confer resistance to predators. Of 40 tests, one supported the hypothesis; highly ornamented gastropods underwent greater levels of Pliocene-Pleistocene extinction than did nonescalated species. All remaining tests were nonsignificant. The hypothesis that escalated species are more vulnerable to climate-related mass extinctions is not supported.

  10. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining

    PubMed Central

    Truccolo, Wilson

    2017-01-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics (“order parameters”) inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. PMID:28336305
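
    A minimal discrete-time sketch of the kind of model described: a self-exciting (Hawkes-like) conditional intensity with an exponential link, simulated as Bernoulli spike counts. The filter shape, baseline rate, and bin size are assumptions for illustration.

      # Minimal discrete-time point-process GLM with a self-exciting
      # (Hawkes-like) intensity and exponential link:
      # lambda[t] = exp(b0 + sum_k w[k] * spikes[t - k]). Parameters are
      # assumptions for illustration.
      import numpy as np

      rng = np.random.default_rng(2)
      T, dt = 5000, 0.001                               # 5 s in 1 ms bins
      b0 = np.log(20.0)                                 # ~20 Hz baseline
      w = 0.8 * np.exp(-np.arange(1, 51) / 10.0)        # 50 ms excitatory filter

      spikes = np.zeros(T, dtype=int)
      for t in range(T):
          hist = spikes[max(0, t - 50):t][::-1]         # most recent bin first
          lam = np.exp(b0 + w[:hist.size] @ hist)       # conditional intensity (Hz)
          spikes[t] = rng.random() < min(lam * dt, 1.0) # Bernoulli approximation
      print("simulated rate:", spikes.mean() / dt, "Hz")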

  11. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining.

    PubMed

    Truccolo, Wilson

    2016-11-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. Published by Elsevier Ltd.

  12. Is this scaling nonlinear?

    PubMed Central

    2016-01-01

    One of the most celebrated findings in complex systems in the last decade is that different indexes y (e.g. patents) scale nonlinearly with the population x of the cities in which they appear, i.e. y ∼ x^β with β ≠ 1. More recently, the generality of this finding has been questioned in studies that used new databases and different definitions of city boundaries. In this paper, we investigate the existence of nonlinear scaling using a probabilistic framework in which fluctuations are accounted for explicitly. In particular, we show that this framework allows us not only to (i) estimate β and its confidence interval, but also to (ii) quantify the evidence in favour of β ≠ 1 and (iii) test the hypothesis that the observations are compatible with nonlinear scaling. We employ this framework to compare five different models to 15 different datasets, and we find that the answers to points (i)–(iii) depend crucially on the fluctuations contained in the data, on how they are modelled, and on the fact that city sizes are heavy-tailed distributed. PMID:27493764
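
    The conventional baseline that the paper's probabilistic framework improves upon is a log-log regression estimate of β with a test of β = 1; the sketch below shows that baseline on simulated data (true β = 1.15), without the explicit fluctuation modelling the paper advocates.

      # Baseline log-log regression estimate of the scaling exponent beta,
      # with a test of beta = 1; data simulated with true beta = 1.15. The
      # paper's fluctuation-aware probabilistic models go beyond this.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      x = 10 ** rng.uniform(4, 7, 300)                  # city populations
      y = x ** 1.15 * np.exp(rng.normal(0, 0.4, 300))   # lognormal fluctuations

      res = stats.linregress(np.log(x), np.log(y))
      t_stat = (res.slope - 1) / res.stderr             # H0: beta = 1 (linear scaling)
      p = 2 * stats.t.sf(abs(t_stat), x.size - 2)
      print(f"beta = {res.slope:.3f} +/- {1.96 * res.stderr:.3f}, p(beta = 1) = {p:.2g}")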

  13. Conservation in the face of climate change: The roles of alternative models, monitoring, and adaptation in confronting and reducing uncertainty

    USGS Publications Warehouse

    Conroy, M.J.; Runge, M.C.; Nichols, J.D.; Stodola, K.W.; Cooper, R.J.

    2011-01-01

    The broad physical and biological principles behind climate change and its potential large scale ecological impacts on biota are fairly well understood, although likely responses of biotic communities at fine spatio-temporal scales are not, limiting the ability of conservation programs to respond effectively to climate change outside the range of human experience. Much of the climate debate has focused on attempts to resolve key uncertainties in a hypothesis-testing framework. However, conservation decisions cannot await resolution of these scientific issues and instead must proceed in the face of uncertainty. We suggest that conservation should proceed in an adaptive management framework, in which decisions are guided by predictions under multiple, plausible hypotheses about climate impacts. Under this plan, monitoring is used to evaluate the response of the system to climate drivers, and management actions (perhaps experimental) are used to confront testable predictions with data, in turn providing feedback for future decision making. We illustrate these principles with the problem of mitigating the effects of climate change on terrestrial bird communities in the southern Appalachian Mountains, USA. © 2010 Elsevier Ltd.

  14. f(T) gravity and energy distribution in Landau-Lifshitz prescription

    NASA Astrophysics Data System (ADS)

    Ganiou, M. G.; Houndjo, M. J. S.; Tossa, J.

    We investigate in this paper the Landau-Lifshitz energy distribution in the framework of f(T) theory, viewed as a modified version of Teleparallel theory. Building on some important Teleparallel results on the localization of energy, our investigation generalizes the Landau-Lifshitz prescription for computing the energy-momentum complex to the framework of f(T) gravity, as has been done in modified versions of General Relativity. We first compute the energy density for three plane-symmetric metrics in vacuum. We find for the second metric that the energy density vanishes independently of the f(T) model. We also find that the Teleparallel Landau-Lifshitz energy-momentum complex formulations for these metrics differ from those obtained in General Relativity for the same metrics. Second, the calculations are performed for the cosmic string spacetime metric. The resulting energy distribution depends on the mass M and the radius r of the cosmic string, and it is strongly affected by the parameters of the quadratic and cubic f(T) models considered. Our investigation with this metric yields interesting results that could be tested against astrophysical observations.

  15. Investigating the Cosmic Web with Topological Data Analysis

    NASA Astrophysics Data System (ADS)

    Cisewski-Kehe, Jessi; Wu, Mike; Fasy, Brittany; Hellwing, Wojciech; Lovell, Mark; Rinaldo, Alessandro; Wasserman, Larry

    2018-01-01

    Data exhibiting complicated spatial structures are common in many areas of science (e.g. cosmology, biology), but can be difficult to analyze. Persistent homology is a popular approach within the area of Topological Data Analysis that offers a new way to represent, visualize, and interpret complex data by extracting topological features, which can be used to infer properties of the underlying structures. In particular, TDA may be useful for analyzing the large-scale structure (LSS) of the Universe, which is an intricate and spatially complex web of matter. In order to understand the physics of the Universe, theoretical and computational cosmologists develop large-scale simulations that allow for visualizing and analyzing the LSS under varying physical assumptions. Each point in the 3D data set represents a galaxy or a cluster of galaxies, and topological summaries ("persistence diagrams") can be obtained summarizing the different ordered holes in the data (e.g. connected components, loops, voids). The topological summaries are interesting and informative descriptors of the Universe on their own, but hypothesis tests using the topological summaries would provide a way to make more rigorous comparisons of LSS under different theoretical models. For example, the received cosmological model has cold dark matter (CDM); however, while the case for CDM is strong, there are some observational inconsistencies with this theory. Another possibility is warm dark matter (WDM). It is of interest to see whether a CDM Universe and a WDM Universe produce LSS that is topologically distinct. We present several possible test statistics for two-sample hypothesis tests using the topological summaries, carry out a simulation study to investigate the suitability of the proposed test statistics using simulated data from a variation of the Voronoi foam model, and finally apply the proposed inference framework to WDM vs. CDM cosmological simulation data.
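
    One generic form such a two-sample test can take is a permutation test on a scalar topological summary (for example, total persistence, the sum of feature lifetimes); the sketch below assumes the persistence diagrams have already been computed upstream and uses stand-in summary values.

      # Generic two-sample permutation test on a scalar topological summary
      # (e.g. total persistence, the sum of feature lifetimes). The summary
      # values below are stand-ins; computing persistence diagrams from the
      # simulations is assumed to happen upstream.
      import numpy as np

      rng = np.random.default_rng(13)
      cdm = rng.gamma(5.0, 1.0, 30)    # per-simulation summaries, CDM-like runs
      wdm = rng.gamma(5.5, 1.0, 30)    # per-simulation summaries, WDM-like runs

      observed = abs(cdm.mean() - wdm.mean())
      pooled = np.concatenate([cdm, wdm])
      exceed = 0
      for _ in range(10_000):
          rng.shuffle(pooled)
          exceed += abs(pooled[:30].mean() - pooled[30:].mean()) >= observed
      print("permutation p-value:", exceed / 10_000)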

  16. On Restructurable Control System Theory

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1983-01-01

    The state of stochastic system and control theory as it impacts restructurable control issues is addressed. The multivariable characteristics of the control problem are addressed. The failure detection/identification problem is discussed as a multi-hypothesis testing problem. Control strategy reconfiguration, static multivariable controls, static failure hypothesis testing, dynamic multivariable controls, fault-tolerant control theory, dynamic hypothesis testing, generalized likelihood ratio (GLR) methods, and adaptive control are discussed.

  17. Understanding the dynamic effects of returning patients toward emergency department density

    NASA Astrophysics Data System (ADS)

    Ahmad, Norazura; Zulkepli, Jafri; Ramli, Razamin; Ghani, Noraida Abdul; Teo, Aik Howe

    2017-11-01

    This paper presents the development of a dynamic hypothesis for the effect of returning patients on emergency department (ED) density. A logical tree from the Theory of Constraints, known as the Current Reality Tree, was used to identify the key variables. Then, a hypothetical framework portraying the interrelated variables and their influencing relationships was developed using causal loop diagrams (CLD). The conceptual framework was designed as the basis for the development of a system dynamics model.

  18. Statistical testing and power analysis for brain-wide association study.

    PubMed

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, the multiple correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
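
    For contrast with the random-field approach, the sketch below runs the standard Benjamini-Hochberg FDR procedure over simulated connexel p-values; it ignores the spatial smoothness that the proposed GRF-based framework exploits, which is precisely the paper's point.

      # For contrast with the random-field approach: plain Benjamini-Hochberg
      # FDR over simulated connexel p-values. This ignores the spatial
      # structure that the GRF-based framework exploits.
      import numpy as np

      rng = np.random.default_rng(4)
      m = 100_000
      p = rng.uniform(size=m)
      p[:500] = rng.uniform(0, 1e-6, 500)       # a handful of strong true signals

      q = 0.05
      p_sorted = np.sort(p)
      passed = p_sorted <= q * np.arange(1, m + 1) / m
      k = passed.nonzero()[0].max() + 1 if passed.any() else 0
      print(f"{k} connexels significant at FDR q = {q}")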

  19. Revised standards for statistical evidence.

    PubMed

    Johnson, Valen E

    2013-11-26

    Recent advances in Bayesian hypothesis testing have led to the development of uniformly most powerful Bayesian tests, which represent an objective, default class of Bayesian hypothesis tests that have the same rejection regions as classical significance tests. Based on the correspondence between these two classes of tests, it is possible to equate the size of classical hypothesis tests with evidence thresholds in Bayesian tests, and to equate P values with Bayes factors. An examination of these connections suggests that recent concerns over the lack of reproducibility of scientific studies can be attributed largely to the conduct of significance tests at unjustifiably high levels of significance. To correct this problem, evidence thresholds required for the declaration of a significant finding should be increased to 25-50:1, and to 100-200:1 for the declaration of a highly significant finding. In terms of classical hypothesis tests, these evidence standards mandate the conduct of tests at the 0.005 or 0.001 level of significance.
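
    For a one-sided normal (z) test, the correspondence described can be sketched as mapping an evidence threshold γ to a significance level α = 1 − Φ(√(2 ln γ)); the snippet below reproduces the approximate 0.005 and 0.001 levels. Treat the formula as an illustrative simplification of the paper's results, not its full derivation.

      # Sketch of the threshold correspondence for a one-sided z-test:
      # an evidence threshold gamma maps to alpha = 1 - Phi(sqrt(2 ln gamma)).
      # Treat this as an illustrative simplification of the paper's results.
      from math import log, sqrt
      from scipy.stats import norm

      for gamma in (25, 50, 100, 200):
          alpha = norm.sf(sqrt(2 * log(gamma)))
          print(f"evidence threshold {gamma}:1 -> alpha ~ {alpha:.4f}")
      # 25-50:1 lands near 0.005; 100-200:1 near 0.001, as in the abstract.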

  20. Multiple hypothesis tracking for cluttered biological image sequences.

    PubMed

    Chenouard, Nicolas; Bloch, Isabelle; Olivo-Marin, Jean-Christophe

    2013-11-01

    In this paper, we present a method for simultaneously tracking thousands of targets in biological image sequences, which is of major importance in modern biology. The complexity and inherent randomness of the problem lead us to propose a unified probabilistic framework for tracking biological particles in microscope images. The framework includes realistic models of particle motion and existence and of fluorescence image features. For the track extraction process per se, the very cluttered conditions motivate the adoption of a multiframe approach that enforces tracking decision robustness to poor imaging conditions and to random target movements. We tackle the large-scale nature of the problem by adapting the multiple hypothesis tracking algorithm to the proposed framework, resulting in a method with a favorable tradeoff between the model complexity and the computational cost of the tracking procedure. When compared to the state-of-the-art tracking techniques for bioimaging, the proposed algorithm is shown to be the only method providing high-quality results despite the critically poor imaging conditions and the dense target presence. We thus demonstrate the benefits of advanced Bayesian tracking techniques for the accurate computational modeling of dynamical biological processes, which is promising for further developments in this domain.

  1. Task-Specific Response Strategy Selection on the Basis of Recent Training Experience

    PubMed Central

    Fulvio, Jacqueline M.; Green, C. Shawn; Schrater, Paul R.

    2014-01-01

    The goal of training is to produce learning for a range of activities that are typically more general than the training task itself. Despite a century of research, predicting the scope of learning from the content of training has proven extremely difficult, with the same task producing narrowly focused learning strategies in some cases and broadly scoped learning strategies in others. Here we test the hypothesis that human subjects will prefer a decision strategy that maximizes performance and reduces uncertainty given the demands of the training task and that the strategy chosen will then predict the extent to which learning is transferable. To test this hypothesis, we trained subjects on a moving dot extrapolation task that makes distinct predictions for two types of learning strategy: a narrow model-free strategy that learns an input-output mapping for training stimuli, and a general model-based strategy that utilizes humans' default predictive model for a class of trajectories. When the number of distinct training trajectories is low, we predict better performance for the mapping strategy, but as the number increases, a predictive model is increasingly favored. Consonant with predictions, subject extrapolations for test trajectories were consistent with using a mapping strategy when trained on a small number of training trajectories and a predictive model when trained on a larger number. The general framework developed here can thus be useful both in interpreting previous patterns of task-specific versus task-general learning, as well as in building future training paradigms with certain desired outcomes. PMID:24391490

  2. Semantic congruence reverses effects of sleep restriction on associative encoding.

    PubMed

    Alberca-Reina, Esther; Cantero, Jose L; Atienza, Mercedes

    2014-04-01

    Encoding and memory consolidation are influenced by factors such as sleep and congruency of newly learned information with prior knowledge (i.e., schema). However, only a few studies have examined the contribution of sleep to enhancement of schema-dependent memory. Based on previous studies showing that total sleep deprivation specifically impairs hippocampal encoding, and that coherent schemas reduce the hippocampal consolidation period after learning, we predicted that sleep loss in the pre-training night would mainly affect schema-unrelated information, whereas sleep restriction in the post-training night would have similar effects on schema-related and unrelated information. Here, we tested this hypothesis by presenting participants with face-face associations that could be semantically related or unrelated under different sleep conditions: normal sleep before and after training, and acute sleep restriction either before or after training. Memory was tested one day after training, just after introducing an interference task, and two days later, without any interference. Significant results were evident in the second retesting session. In particular, sleep restriction before training enhanced memory for semantically congruent events at the expense of memory for unrelated events, supporting the specific role of sleep in hippocampal memory encoding. Unexpectedly, sleep restriction after training enhanced memory for both related and unrelated events. Although this finding may suggest poorer encoding during the interference task, this hypothesis should be tested specifically in future experiments. Altogether, the present results support a framework in which encoding processes seem to be more vulnerable to sleep loss than consolidation processes. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Adaptation of all-ceramic fixed partial dentures.

    PubMed

    Borba, Márcia; Cesar, Paulo F; Griggs, Jason A; Della Bona, Álvaro

    2011-11-01

    To measure the marginal and internal fit of three-unit fixed partial dentures (FPDs) using the micro-CT technique, testing the null hypothesis that there is no difference in the adaptation between the ceramic systems studied. Stainless steel models of prepared abutments were fabricated to design the FPDs. Ten FPDs were produced from each framework ceramic (YZ - Vita In-Ceram YZ and IZ - Vita In-Ceram Zirconia) using CEREC inLab according to the manufacturer's instructions. All FPDs were veneered using the recommended porcelain. Each FPD was seated on the original model and scanned using micro-CT. Files were processed using NRecon and CTAn software. Adobe Photoshop and Image J software were used to analyze the cross-section images. Five measuring locations were used as follows: MG - marginal gap; CA - chamfer area; AW - axial wall; AOT - axio-occlusal transition area; OA - occlusal area. The horizontal marginal discrepancy (HMD) was evaluated in another set of images. Results were statistically analyzed using ANOVA and Tukey tests (α=0.05). The mean values for MG, CA, AW, OA and HMD were significantly different for all tested groups (p<0.05). IZ exhibited greater mean values than YZ for all measuring locations except for AW and AOT. OA showed the greatest mean gap values for both ceramic systems. MG and AW mean gap values were low for both systems. The ceramic systems evaluated showed different levels of marginal and internal fit, rejecting the study hypothesis. Yet, both ceramic systems showed clinically acceptable marginal and internal fit. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  4. Adaptation of all-ceramic fixed partial dentures

    PubMed Central

    Borba, Márcia; Cesar, Paulo F.; Griggs, Jason A.; Della Bona, Álvaro

    2011-01-01

    Objectives To measure the marginal and internal fit of three-unit fixed partial dentures (FPDs) using the micro-CT technique, testing the null hypothesis that there is no difference in the adaptation between the ceramic systems studied. Methods Stainless steel models of prepared abutments were fabricated to design the FPDs. Ten FPDs were produced from each framework ceramic (YZ - Vita In-Ceram YZ and IZ - Vita In-Ceram Zirconia) using CEREC inLab according to the manufacturer's instructions. All FPDs were veneered using the recommended porcelain. Each FPD was seated on the original model and scanned using micro-CT. Files were processed using NRecon and CTAn software. Adobe Photoshop and Image J software were used to analyze the cross-section images. Five measuring locations were used as follows: MG – marginal gap; CA - chamfer area; AW - axial wall; AOT - axio-occlusal transition area; OA - occlusal area. The horizontal marginal discrepancy (HMD) was evaluated in another set of images. Results were statistically analyzed using ANOVA and Tukey tests (α=0.05). Results The mean values for MG, CA, AW, OA and HMD were significantly different for all tested groups (p<0.05). IZ exhibited greater mean values than YZ for all measuring locations except for AW and AOT. OA showed the greatest mean gap values for both ceramic systems. MG and AW mean gap values were low for both systems. Significance The ceramic systems evaluated showed different levels of marginal and internal fit, rejecting the study hypothesis. Yet, both ceramic systems showed clinically acceptable marginal and internal fit. PMID:21920595

  5. Classification of HCV and HIV-1 Sequences with the Branching Index

    PubMed Central

    Hraber, Peter; Kuiken, Carla; Waugh, Mark; Geer, Shaun; Bruno, William J.; Leitner, Thomas

    2009-01-01

    Classification of viral sequences should be fast, objective, accurate, and reproducible. Most methods that classify sequences use either pairwise distances or phylogenetic relations, but cannot discern when a sequence is unclassifiable. The branching index (BI) combines distance and phylogeny methods to compute a ratio that quantifies how closely a query sequence clusters with a subtype clade. In the hypothesis-testing framework of statistical inference, the BI is compared with a threshold to test whether sufficient evidence exists for the query sequence to be classified among known sequences. If above the threshold, the null hypothesis of no support for the subtype relation is rejected and the sequence is taken as belonging to the subtype clade with which it clusters on the tree. This study evaluates statistical properties of the branching index for subtype classification in HCV and HIV-1. Pairs of BI values with known positive and negative test results were computed from 10,000 random fragments of reference alignments. Sampled fragments were of sufficient length to contain phylogenetic signal that groups reference sequences together properly into subtype clades. For HCV, a threshold BI of 0.71 yields 95.1% agreement with reference subtypes, with equal false positive and false negative rates. For HIV-1, a threshold of 0.66 yields 93.5% agreement. Higher thresholds can be used where lower false positive rates are required. In synthetic recombinants, regions without breakpoints are recognized accurately; regions with breakpoints do not uniquely represent any known subtype. Web-based services for viral subtype classification with the branching index are available online. PMID:18753218

  6. The Gumbel hypothesis test for left censored observations using regional earthquake records as an example

    NASA Astrophysics Data System (ADS)

    Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.

    2011-01-01

    Annual maximum (AM) time series are incomplete (i.e., censored) in years when no events above the assumed censoring threshold (i.e., the magnitude of completeness) are recorded. We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
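
    For a complete (uncensored) sample, the Gumbel PPCC is simply the correlation between the ordered data and Gumbel quantiles at plotting positions, as sketched below with Gringorten positions (an assumption; the paper's censored variant and its Monte Carlo critical values are not reproduced here).

      # Gumbel PPCC for a complete (uncensored) sample: the correlation
      # between ordered data and Gumbel quantiles at Gringorten plotting
      # positions. The censored variant and its critical values come from
      # the paper's Monte Carlo tables; data here are simulated.
      import numpy as np

      rng = np.random.default_rng(6)
      x = np.sort(rng.gumbel(loc=6.0, scale=0.5, size=46))   # toy AM magnitudes

      n = x.size
      pp = (np.arange(1, n + 1) - 0.44) / (n + 0.12)   # Gringorten positions
      q = -np.log(-np.log(pp))                         # standard Gumbel quantiles

      ppcc = np.corrcoef(x, q)[0, 1]
      print(f"Gumbel PPCC = {ppcc:.4f}")   # compare against a tabulated critical value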

  7. Biostatistics Series Module 2: Overview of Hypothesis Testing.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. While this may be of utility in highlighting different aspects of the problem, merely reapplying different tests to the same issue in the hope of finding a P < 0.05 is a wrong use of statistics. Finally, it is becoming the norm that an estimate of the size of any effect, expressed with its 95% confidence interval, is required for meaningful interpretation of results. A large study is likely to have a small (and therefore "statistically significant") P value, but a "real" estimate of the effect would be provided by the 95% confidence interval. If the intervals overlap between two interventions, then the difference between them is not so clear-cut even if P < 0.05. The two approaches are now considered complementary to one another.

  8. Biostatistics Series Module 2: Overview of Hypothesis Testing

    PubMed Central

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. While this may be of utility in highlighting different aspects of the problem, merely reapplying different tests to the same issue in the hope of finding a P < 0.05 is a wrong use of statistics. Finally, it is becoming the norm that an estimate of the size of any effect, expressed with its 95% confidence interval, is required for meaningful interpretation of results. A large study is likely to have a small (and therefore “statistically significant”) P value, but a “real” estimate of the effect would be provided by the 95% confidence interval. If the intervals overlap between two interventions, then the difference between them is not so clear-cut even if P < 0.05. The two approaches are now considered complementary to one another. PMID:27057011

  9. Statistical Validation of Surrogate Endpoints: Another Look at the Prentice Criterion and Other Criteria.

    PubMed

    Saraf, Sanatan; Mathew, Thomas; Roy, Anindya

    2015-01-01

    For the statistical validation of surrogate endpoints, an alternative formulation is proposed for testing Prentice's fourth criterion, under a bivariate normal model. In such a setup, the criterion involves inference concerning an appropriate regression parameter, and the criterion holds if the regression parameter is zero. Testing such a null hypothesis has been criticized in the literature since it can only be used to reject a poor surrogate, and not to validate a good surrogate. In order to circumvent this, an equivalence hypothesis is formulated for the regression parameter, namely the hypothesis that the parameter is equivalent to zero. Such an equivalence hypothesis is formulated as an alternative hypothesis, so that the surrogate endpoint is statistically validated when the null hypothesis is rejected. Confidence intervals for the regression parameter and tests for the equivalence hypothesis are proposed using bootstrap methods and small sample asymptotics, and their performances are numerically evaluated and recommendations are made. The choice of the equivalence margin is a regulatory issue that needs to be addressed. The proposed equivalence testing formulation is also adopted for other parameters that have been proposed in the literature on surrogate endpoint validation, namely, the relative effect and proportion explained.
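
    A minimal sketch of the equivalence-testing idea described above (my illustration; the simulated data, the margin delta, and the bootstrap percentile interval are all hypothetical choices, not the authors' procedure): the treatment coefficient in a regression of the true endpoint on treatment and surrogate is declared equivalent to zero when its bootstrap confidence interval lies entirely inside (-delta, +delta).

        import numpy as np

        rng = np.random.default_rng(1)
        n = 500
        z = rng.integers(0, 2, size=n).astype(float)  # treatment indicator
        s = z + rng.normal(size=n)                    # surrogate responds to treatment
        t = 2.0 * s + rng.normal(size=n)              # true endpoint driven by surrogate only

        def treatment_coef(z, s, t):
            # OLS coefficient on treatment, adjusting for the surrogate
            X = np.column_stack([np.ones_like(z), z, s])
            beta, *_ = np.linalg.lstsq(X, t, rcond=None)
            return beta[1]

        idx = rng.integers(0, n, size=(2000, n))      # bootstrap resamples
        boot = np.array([treatment_coef(z[i], s[i], t[i]) for i in idx])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        delta = 0.25                                  # hypothetical equivalence margin
        print(f"95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
        print("surrogate validated" if -delta < lo and hi < delta else "not validated")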

  10. Test of association: which one is the most appropriate for my study?

    PubMed

    Gonzalez-Chica, David Alejandro; Bastos, João Luiz; Duquia, Rodrigo Pereira; Bonamigo, Renan Rangel; Martínez-Mesa, Jeovany

    2015-01-01

    Hypothesis tests are statistical tools widely used for assessing whether or not there is an association between two or more variables. These tests provide the probability of a type I error (the p-value), which is used to reject or fail to reject the null study hypothesis. This article aims to provide a practical guide to help researchers carefully select the most appropriate procedure to answer their research question. We discuss the logic of hypothesis testing and present the prerequisites of each procedure based on practical examples.
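
    As a toy illustration of that selection logic (my own sketch, not from the article), the parametric/nonparametric fork for comparing a numerical variable between two groups might look like this:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        group_a = rng.lognormal(mean=0.0, sigma=0.8, size=35)  # skewed, non-normal data
        group_b = rng.lognormal(mean=0.4, sigma=0.8, size=35)

        # Shapiro-Wilk screens each group for approximate normality
        normal = all(stats.shapiro(g).pvalue > 0.05 for g in (group_a, group_b))
        if normal:
            res, name = stats.ttest_ind(group_a, group_b), "t-test"          # parametric
        else:
            res, name = stats.mannwhitneyu(group_a, group_b), "Mann-Whitney U"  # nonparametric
        print(f"{name}: statistic = {res.statistic:.2f}, p = {res.pvalue:.4f}")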

  11. Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.

    PubMed

    Chalmers, R Philip

    2018-06-01

    This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.

  12. Integrated consensus-based frameworks for unmanned vehicle routing and targeting assignment

    NASA Astrophysics Data System (ADS)

    Barnawi, Waleed T.

    Unmanned aerial vehicles (UAVs) are increasingly deployed in complex and dynamic environments to perform multiple tasks cooperatively with other UAVs that contribute to overarching mission effectiveness. Studies by the Department of Defense (DoD) indicate future operations may include anti-access/area-denial (A2AD) environments which limit human teleoperator decision-making and control. This research addresses the problem of decentralized vehicle re-routing and task reassignments through consensus-based UAV decision-making. An Integrated Consensus-Based Framework (ICF) is formulated as a solution to the combined single task assignment problem and vehicle routing problem. The multiple assignment and vehicle routing problem is solved with the Integrated Consensus-Based Bundle Framework (ICBF). The frameworks are hierarchically decomposed into two levels. The bottom layer utilizes the renowned Dijkstra's Algorithm. The top layer addresses task assignment with two methods. The single assignment approach is called the Caravan Auction (CarA) Algorithm. This technique extends the Consensus-Based Auction Algorithm (CBAA) to provide awareness for task completion by agents and adopt abandoned tasks. The multiple assignment approach, called the Caravan Auction Bundle (CarAB) Algorithm, extends the Consensus-Based Bundle Algorithm (CBBA) by providing awareness for lost resources, prioritizing remaining tasks, and adopting abandoned tasks. Research questions are investigated regarding the novelty and performance of the proposed frameworks; conclusions regarding these questions are reached through hypothesis testing, with Monte Carlo simulations providing the supporting evidence. The approach provided in this research addresses current and future military operations for unmanned aerial vehicles; however, the general framework is adaptable to any unmanned vehicle. Civil applications involving missions with limited human observability, such as exploration and fire surveillance, could also benefit from independent UAV task assignment.

  13. Towards a framework for testing general relativity with extreme-mass-ratio-inspiral observations

    NASA Astrophysics Data System (ADS)

    Chua, A. J. K.; Hee, S.; Handley, W. J.; Higson, E.; Moore, C. J.; Gair, J. R.; Hobson, M. P.; Lasenby, A. N.

    2018-07-01

    Extreme-mass-ratio-inspiral observations from future space-based gravitational-wave detectors such as LISA will enable strong-field tests of general relativity with unprecedented precision, but at prohibitive computational cost if existing statistical techniques are used. In one such test that is currently employed for LIGO black hole binary mergers, generic deviations from relativity are represented by N deformation parameters in a generalized waveform model; the Bayesian evidence for each of its 2^N combinatorial submodels is then combined into a posterior odds ratio for modified gravity over relativity in a null-hypothesis test. We adapt and apply this test to a generalized model for extreme-mass-ratio inspirals constructed on deformed black hole spacetimes, and focus our investigation on how computational efficiency can be increased through an evidence-free method of model selection. This method is akin to the algorithm known as product-space Markov chain Monte Carlo, but uses nested sampling and improved error estimates from a rethreading technique. We perform benchmarking and robustness checks for the method, and find order-of-magnitude computational gains over regular nested sampling in the case of synthetic data generated from the null model.
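
    The combinatorial bookkeeping in this kind of test can be sketched in a few lines (my illustration; the log-evidence values are made up, and equal prior odds across submodels are an assumption):

        import itertools
        import numpy as np

        N = 3  # deformation parameters, giving 2**N submodels
        combos = [c for r in range(N + 1) for c in itertools.combinations(range(N), r)]
        log_Z = {c: -0.5 * len(c) for c in combos}  # hypothetical log-evidences ln Z
        log_Z[()] = 0.0                             # null (general relativity) submodel

        alternatives = [v for k, v in log_Z.items() if k != ()]
        # posterior odds of "some deformation" versus the null, equal prior odds assumed
        log_odds = np.logaddexp.reduce(alternatives) - np.log(len(alternatives)) - log_Z[()]
        print(f"{2**N} submodels; ln posterior odds (modified gravity vs GR) = {log_odds:.3f}")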

  14. Towards a framework for testing general relativity with extreme-mass-ratio-inspiral observations

    NASA Astrophysics Data System (ADS)

    Chua, A. J. K.; Hee, S.; Handley, W. J.; Higson, E.; Moore, C. J.; Gair, J. R.; Hobson, M. P.; Lasenby, A. N.

    2018-04-01

    Extreme-mass-ratio-inspiral observations from future space-based gravitational-wave detectors such as LISA will enable strong-field tests of general relativity with unprecedented precision, but at prohibitive computational cost if existing statistical techniques are used. In one such test that is currently employed for LIGO black-hole binary mergers, generic deviations from relativity are represented by N deformation parameters in a generalised waveform model; the Bayesian evidence for each of its 2^N combinatorial submodels is then combined into a posterior odds ratio for modified gravity over relativity in a null-hypothesis test. We adapt and apply this test to a generalised model for extreme-mass-ratio inspirals constructed on deformed black-hole spacetimes, and focus our investigation on how computational efficiency can be increased through an evidence-free method of model selection. This method is akin to the algorithm known as product-space Markov chain Monte Carlo, but uses nested sampling and improved error estimates from a rethreading technique. We perform benchmarking and robustness checks for the method, and find order-of-magnitude computational gains over regular nested sampling in the case of synthetic data generated from the null model.

  15. Goal-directed learning and obsessive–compulsive disorder

    PubMed Central

    Gillan, Claire M.; Robbins, Trevor W.

    2014-01-01

    Obsessive–compulsive disorder (OCD) has become a paradigmatic case of goal-directed dysfunction in psychiatry. In this article, we review the neurobiological evidence, historical and recent, that originally led to this supposition and continues to support a habit hypothesis of OCD. We will then discuss a number of recent studies that have directly tested this hypothesis, using behavioural experiments in patient populations. Based on this research evidence, which suggests that rather than goal-directed avoidance behaviours, compulsions in OCD may derive from manifestations of excessive habit formation, we present the details of a novel account of the functional relationship between these habits and the full symptom profile of the disorder. Borrowing from a cognitive dissonance framework, we propose that the irrational threat beliefs (obsessions) characteristic of OCD may be a consequence, rather than an instigator, of compulsive behaviour in these patients. This lays the foundation for a potential shift in both clinical and neuropsychological conceptualization of OCD and related disorders. This model may also prove relevant to other putative disorders of compulsivity, such as substance dependence, where the experience of ‘wanting’ drugs may be better understood as post hoc rationalizations of otherwise goal-insensitive, stimulus-driven behaviour. PMID:25267818

  16. An equilibrium-point model for fast, single-joint movement: II. Similarity of single-joint isometric and isotonic descending commands.

    PubMed

    Latash, M L; Gottlieb, G L

    1991-09-01

    The model for isotonic movements introduced in the preceding article in this issue is used to account for isometric contractions. Isotonic movements and isometric contractions are analyzed as consequences of one motor program acting under different peripheral conditions. Differences in isotonic and isometric EMG patterns are analyzed theoretically. Computer simulation of the EMG patterns was performed both with and without the inclusion of possible effects of reciprocal inhibition. A series of experiments was performed to test the model. The subjects made fast isotonic movements that were unexpectedly blocked at the very beginning in some of the trials. The observed differences in the EMG patterns between blocked and unblocked trials corresponded to the model's predictions. The results suggest that these differences are due to the action of a tonic stretch reflex rather than to preprogrammed reactions. The experimental and simulation findings, and also the data from the literature, are discussed in the framework of the model and the dual-strategy hypothesis. They support the hypothesis that the motor control system uses one of a few standardized subprograms, specifying a small number of parameters to match a specific task.

  17. A Bayesian phylogenetic approach to estimating the stability of linguistic features and the genetic biasing of tone.

    PubMed

    Dediu, Dan

    2011-02-07

    Language is a hallmark of our species and understanding linguistic diversity is an area of major interest. Genetic factors influencing the cultural transmission of language provide a powerful and elegant explanation for aspects of the present day linguistic diversity and a window into the emergence and evolution of language. In particular, it has recently been proposed that linguistic tone (the usage of voice pitch to convey lexical and grammatical meaning) is biased by two genes involved in brain growth and development, ASPM and Microcephalin. This hypothesis predicts that tone is a stable characteristic of language because of its 'genetic anchoring'. The present paper tests this prediction using a Bayesian phylogenetic framework applied to a large set of linguistic features and language families, using multiple software implementations, data codings, stability estimations, linguistic classifications and outgroup choices. The results of these different methods and datasets show a large agreement, suggesting that this approach produces reliable estimates of the stability of linguistic data. Moreover, linguistic tone is found to be stable across methods and datasets, providing suggestive support for the hypothesis of genetic influences on its distribution.

  18. Assisting community management of groundwater: Irrigator attitudes in two watersheds in Rajasthan and Gujarat, India

    NASA Astrophysics Data System (ADS)

    Varua, M. E.; Ward, J.; Maheshwari, B.; Oza, S.; Purohit, R.; Hakimuddin; Chinnasamy, P.

    2016-06-01

    The absence of either state regulations or markets to coordinate the operation of individual wells has focussed attention on community level institutions as the primary loci for sustainable groundwater management in Rajasthan and Gujarat, India. The reported research relied on theoretical propositions that livelihood strategies, groundwater management and the propensity to cooperate are associated with the attitudinal orientations of well owners in the Meghraj and Dharta watersheds, located in Gujarat and Rajasthan respectively. The research tested the hypothesis that attitudes to groundwater management and farming practices, household income and trust levels of assisting agencies were not consistent across the watersheds, implying that a targeted approach, in contrast to default uniform programs, would assist communities in crafting rules to manage groundwater across multiple hydro-geological settings. Hierarchical cluster analysis of attitudes held by survey respondents revealed four statistically significant discrete clusters, supporting acceptance of the hypothesis. Further analyses revealed significant differences in farming practices, household wealth and willingness to adapt across the four groundwater management clusters. In conclusion, the need to account for attitudinal diversity is highlighted, and a framework to guide the specific design of processes to assist communities in crafting coordinating instruments to sustainably manage local aquifers is described.

  19. Bibliometric Evidence for a Hierarchy of the Sciences.

    PubMed

    Fanelli, Daniele; Glänzel, Wolfgang

    2013-01-01

    The hypothesis of a Hierarchy of the Sciences, first formulated in the 19th century, predicts that, moving from simple and general phenomena (e.g. particle dynamics) to complex and particular (e.g. human behaviour), researchers lose ability to reach theoretical and methodological consensus. This hypothesis places each field of research along a continuum of complexity and "softness", with profound implications for our understanding of scientific knowledge. Today, however, the idea is still unproven and philosophically overlooked, too often confused with simplistic dichotomies that contrast natural and social sciences, or science and the humanities. Empirical tests of the hypothesis have usually compared few fields and this, combined with other limitations, makes their results contradictory and inconclusive. We verified whether discipline characteristics reflect a hierarchy, a dichotomy or neither, by sampling nearly 29,000 papers published contemporaneously in 12 disciplines and measuring a set of parameters hypothesised to reflect theoretical and methodological consensus. The biological sciences had in most cases intermediate values between the physical and the social, with bio-molecular disciplines appearing harder than zoology, botany or ecology. In multivariable analyses, most of these parameters were independent predictors of the hierarchy, even when mathematics and the humanities were included. These results support a "gradualist" view of scientific knowledge, suggesting that the Hierarchy of the Sciences provides the best rational framework to understand disciplines' diversity. A deeper grasp of the relationship between subject matter's complexity and consensus could have profound implications for how we interpret, publish, popularize and administer scientific research.

  20. Improving data analysis in herpetology: Using Akaike's information criterion (AIC) to assess the strength of biological hypotheses

    USGS Publications Warehouse

    Mazerolle, M.J.

    2006-01-01

    In ecology, researchers frequently use observational studies to explain a given pattern, such as the number of individuals in a habitat patch, with a large number of explanatory (i.e., independent) variables. To elucidate such relationships, ecologists have long relied on hypothesis testing to include or exclude variables in regression models, although the conclusions often depend on the approach used (e.g., forward, backward, stepwise selection). Though better tools surfaced in the mid-1970s, they are still underutilized in certain fields, particularly in herpetology. This is the case for the Akaike information criterion (AIC), which is markedly superior to hypothesis-based approaches for model selection (i.e., variable selection). It is simple to compute and easy to understand, but more importantly, for a given data set, it provides a measure of the strength of evidence for each model that represents a plausible biological hypothesis relative to the entire set of models considered. Using this approach, one can then compute a weighted average of the estimate and standard error for any given variable of interest across all the models considered. This procedure, termed model-averaging or multimodel inference, yields precise and robust estimates. In this paper, I illustrate the use of the AIC in model selection and inference, as well as the interpretation of results analysed in this framework with two real herpetological data sets. The AIC and measures derived from it should be routinely adopted by herpetologists. © Koninklijke Brill NV 2006.
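
    The AIC weighting and model-averaging procedure the record describes reduces to a few lines; this sketch uses invented log-likelihoods and parameter counts purely for illustration:

        import numpy as np

        # hypothetical candidate models: (log-likelihood, n. parameters, estimate, SE)
        models = {
            "habitat":           (-120.3, 3, 0.42, 0.10),
            "habitat+rain":      (-118.9, 4, 0.38, 0.11),
            "habitat+rain+year": (-118.5, 5, 0.35, 0.12),
        }

        aic = {m: -2.0 * ll + 2.0 * k for m, (ll, k, _, _) in models.items()}
        delta = {m: a - min(aic.values()) for m, a in aic.items()}
        w = np.exp([-0.5 * d for d in delta.values()])
        w /= w.sum()                                    # Akaike weights
        estimates = np.array([m[2] for m in models.values()])
        print(f"model-averaged estimate = {np.sum(w * estimates):.3f}")
        for (m, d), wi in zip(delta.items(), w):
            print(f"{m:>18}: dAIC = {d:5.2f}, weight = {wi:.3f}")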

  1. A neural mediator of human anxiety sensitivity.

    PubMed

    Harrison, Ben J; Fullana, Miquel A; Soriano-Mas, Carles; Via, Esther; Pujol, Jesus; Martínez-Zalacaín, Ignacio; Tinoco-Gonzalez, Daniella; Davey, Christopher G; López-Solà, Marina; Pérez Sola, Victor; Menchón, José M; Cardoner, Narcís

    2015-10-01

    Advances in the neuroscientific understanding of bodily autonomic awareness, or interoception, have led to the hypothesis that human trait anxiety sensitivity (AS), the fear of bodily autonomic arousal, is primarily mediated by the anterior insular cortex. Despite broad appeal, few experimental studies have comprehensively addressed this hypothesis. We recruited 55 individuals exhibiting a range of AS and assessed them with functional magnetic resonance imaging (fMRI) during aversive fear conditioning. For each participant, three primary measures of interest were derived: a trait Anxiety Sensitivity Index score; an in-scanner rating of elevated bodily anxiety sensations during fear conditioning; and a corresponding estimate of whole-brain functional activation to the conditioned versus nonconditioned stimuli. Using a voxel-wise mediation analysis framework, we formally tested for 'neural mediators' of the predicted association between trait AS score and in-scanner anxiety sensations during fear conditioning. Contrary to the anterior insular hypothesis, no evidence of significant mediation was observed for this brain region, which was instead linked to perceived anxiety sensations independently from AS. Evidence for significant mediation was obtained for the dorsal anterior cingulate cortex, a finding that we argue is more consistent with the hypothesized role of human cingulofrontal cortex in conscious threat appraisal processes, including threat-overestimation. This study offers an important neurobiological validation of the AS construct and identifies a specific neural substrate that may underlie high AS clinical phenotypes, including but not limited to panic disorder. © 2015 Wiley Periodicals, Inc.

  2. The Importance of Teaching Power in Statistical Hypothesis Testing

    ERIC Educational Resources Information Center

    Olinsky, Alan; Schumacher, Phyllis; Quinn, John

    2012-01-01

    In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…

  3. The Relation between Parental Values and Parenting Behavior: A Test of the Kohn Hypothesis.

    ERIC Educational Resources Information Center

    Luster, Tom; And Others

    1989-01-01

    Used data on 65 mother-infant dyads to test Kohn's hypothesis concerning the relation between values and parenting behavior. Findings support Kohn's hypothesis that parents who value self-direction would emphasize supportive function of parenting and parents who value conformity would emphasize their obligations to impose restraints. (Author/NB)

  4. Cognitive Biases in the Interpretation of Autonomic Arousal: A Test of the Construal Bias Hypothesis

    ERIC Educational Resources Information Center

    Ciani, Keith D.; Easter, Matthew A.; Summers, Jessica J.; Posada, Maria L.

    2009-01-01

    According to Bandura's construal bias hypothesis, derived from social cognitive theory, persons with the same heightened state of autonomic arousal may experience either pleasant or deleterious emotions depending on the strength of perceived self-efficacy. The current study tested this hypothesis by proposing that college students' preexisting…

  5. Is Conscious Stimulus Identification Dependent on Knowledge of the Perceptual Modality? Testing the “Source Misidentification Hypothesis”

    PubMed Central

    Overgaard, Morten; Lindeløv, Jonas; Svejstrup, Stinna; Døssing, Marianne; Hvid, Tanja; Kauffmann, Oliver; Mouridsen, Kim

    2013-01-01

    This paper reports an experiment intended to test a particular hypothesis derived from blindsight research, which we name the “source misidentification hypothesis.” According to this hypothesis, a subject may be correct about a stimulus without being correct about how she had access to this knowledge (whether the stimulus was visual, auditory, or something else). We test this hypothesis in healthy subjects, asking them to report whether a masked stimulus was presented auditorily or visually, what the stimulus was, and how clearly they experienced the stimulus using the Perceptual Awareness Scale (PAS). We suggest that knowledge about perceptual modality may be a necessary precondition in order to issue correct reports of which stimulus was presented. Furthermore, we find that PAS ratings correlate with correctness, and that subjects are at chance level when reporting no conscious experience of the stimulus. To demonstrate that particular levels of reporting accuracy are obtained, we employ a statistical strategy, which operationally tests the hypothesis of non-equality, such that the usual rejection of the null hypothesis admits the conclusion of equivalence. PMID:23508677

  6. A large scale test of the gaming-enhancement hypothesis

    PubMed Central

    Wang, John C.

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence in support of the null hypothesis over the predicted effect ranged from equivocal to very strong. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work. PMID:27896035
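
    A hedged sketch of one common way to cast such a comparison (my example, using the BIC approximation to the Bayes factor for Gaussian regression fits, not the authors' analysis; the data here are simulated with no true effect):

        import numpy as np

        rng = np.random.default_rng(4)
        n = 1847
        gaming = rng.normal(size=n)               # hypothetical gaming-experience scores
        reasoning = rng.normal(size=n)            # reasoning scores, no true association

        def bic(rss, k):
            # BIC for a Gaussian linear model with k free parameters
            return n * np.log(rss / n) + k * np.log(n)

        rss0 = np.sum((reasoning - reasoning.mean()) ** 2)   # null: intercept only
        X = np.column_stack([np.ones(n), gaming])
        beta, *_ = np.linalg.lstsq(X, reasoning, rcond=None)
        rss1 = np.sum((reasoning - X @ beta) ** 2)           # alternative: + gaming term

        bf01 = np.exp((bic(rss1, 3) - bic(rss0, 2)) / 2)     # Bayes factor for the null
        print(f"BF01 = {bf01:.1f} (evidence favouring the null)")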

  7. Density profiles in the Scrape-Off Layer interpreted through filament dynamics

    NASA Astrophysics Data System (ADS)

    Militello, Fulvio

    2017-10-01

    We developed a new theoretical framework to clarify the relation between radial Scrape-Off Layer density profiles and the fluctuations that generate them. The framework provides an interpretation of the experimental features of the profiles and of the turbulence statistics on the basis of simple properties of the filaments, such as their radial motion and their draining towards the divertor. L-mode and inter-ELM filaments are described as a Poisson process in which each event is independent and modelled with a wave function of amplitude and width statistically distributed according to experimental observations and evolving according to fluid equations. We will rigorously show that radially accelerating filaments, less efficient parallel exhaust and also a statistical distribution of their radial velocity can contribute to induce flatter profiles in the far SOL and therefore enhance plasma-wall interactions. A quite general result of our analysis is the resiliency of this non-exponential nature of the profiles and the increase of the relative fluctuation amplitude towards the wall, as experimentally observed. According to the framework, profile broadening at high fueling rates can be caused by interactions with neutrals (e.g. charge exchange) in the divertor or by a significant radial acceleration of the filaments. The framework assumptions were tested with 3D numerical simulations of seeded SOL filaments based on a two fluid model. In particular, filaments interact through the electrostatic field they generate only when they are in close proximity (separation comparable to their width in the drift plane), thus justifying our independence hypothesis. In addition, we will discuss how isolated filament motion responds to variations in the plasma conditions, and specifically divertor conditions. Finally, using the theoretical framework we will reproduce and interpret experimental results obtained on JET, MAST and HL-2A.

  8. Likelihood Ratio Tests of Hypotheses on Multivariate Populations, Volume II, Test of Hypothesis--Statistical Models for the Evaluation and Interpretation of Educational Criteria. Part 4.

    ERIC Educational Resources Information Center

    Saw, J. G.

    This paper deals with some tests of hypothesis frequently encountered in the analysis of multivariate data. The type of hypothesis considered is that which the statistician can answer in the negative or affirmative. The Doolittle method makes it possible to evaluate the determinant of a matrix of high order, to solve a matrix equation, or to…

  9. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    PubMed

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
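
    As a small worked contrast between the two interval flavours discussed here (my toy numbers; the flat prior and the Wald interval are chosen purely for illustration):

        from scipy import stats

        successes, n = 27, 60
        posterior = stats.beta(1.0 + successes, 1.0 + n - successes)  # Beta(1,1) prior
        cred_lo, cred_hi = posterior.ppf([0.025, 0.975])              # 95% credible interval

        p_hat = successes / n
        se = (p_hat * (1.0 - p_hat) / n) ** 0.5
        wald_lo, wald_hi = p_hat - 1.96 * se, p_hat + 1.96 * se       # 95% frequentist CI

        print(f"posterior mean = {posterior.mean():.3f}")
        print(f"95% credible interval = [{cred_lo:.3f}, {cred_hi:.3f}]")
        print(f"95% Wald interval     = [{wald_lo:.3f}, {wald_hi:.3f}]")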

  10. Analytical Framework for Identifying and Differentiating Recent Hitchhiking and Severe Bottleneck Effects from Multi-Locus DNA Sequence Data

    DOE PAGES

    Sargsyan, Ori

    2012-05-25

    Hitchhiking and severe bottleneck effects have an impact on the dynamics of genetic diversity of a population by inducing homogenization at a single locus and at the genome-wide scale, respectively. As a result, identification and differentiation of the signatures of such events from DNA sequence data at a single locus is challenging. This study develops an analytical framework for identifying and differentiating recent homogenization events at multiple neutral loci in low recombination regions. The dynamics of genetic diversity at a locus after a recent homogenization event is modeled according to the infinite-sites mutation model and the Wright-Fisher model of reproduction with constant population size. In this setting, I derive analytical expressions for the distribution, mean, and variance of the number of polymorphic sites in a random sample of DNA sequences from a locus affected by a recent homogenization event. Based on this framework, three likelihood-ratio based tests are presented for identifying and differentiating recent homogenization events at multiple loci. Lastly, I apply the framework to two data sets. First, I consider human DNA sequences from four non-coding loci on different chromosomes for inferring the evolutionary history of modern human populations. The results suggest, in particular, that recent homogenization events at the loci are identifiable when the effective human population size is 50000 or greater in contrast to 10000, and the estimates of the recent homogenization events agree with the “Out of Africa” hypothesis. Second, I use HIV DNA sequences from HIV-1-infected patients to infer the times of HIV seroconversions. The estimates are contrasted with other estimates derived as the mid-time point between the last HIV-negative and first HIV-positive screening tests. Finally, the results show that significant discrepancies can exist between the estimates.

  11. The relationship between motor skills and psychosocial factors in young children: A test of the elaborated environmental stress hypothesis.

    PubMed

    Mancini, Vincent O; Rigoli, Daniela; Roberts, Lynne D; Heritage, Brody; Piek, Jan P

    2017-09-08

    The elaborated environmental stress hypothesis (EESH) provides a framework that describes how motor skills may indirectly cause internalizing problems through various mediating psychosocial factors. While there is evidence to support this framework, little is known about how the proposed relationships may vary across different stages of development. This study aimed to investigate whether peer problems and perceived self-competence mediated the relationship between motor skills and internalizing problems in pre-primary children, and at 18-month follow up. A community sample of 197 pre-primary school children (M = 5.40 years, SD = 0.30 years; 102 males, 95 females) participated at Time 1, with 107 completing the Time 2 follow-up. Standardized instruments were used to measure motor skills and verbal IQ. Perceived self-competence was measured using a self-report measure. Participant peer problems and internalizing problems were measured using teacher report. Age, gender, and verbal IQ were included as covariates. Mediation analysis using PROCESS showed that the relationship between motor skills and internalizing problems was mediated by peer problems at Time 1. At Time 2, the relationship was mediated by peer problems and perceived physical competence. The current results indicate the EESH may function differently across different periods of development. The transition from pre-primary to Grade 1 represents a time of important cognitive and psychosocial development, which has implications for how the relationship between motor skills and internalizing problems can be understood. These findings highlight potential age-appropriate targets for psychomotor interventions aiming to improve the emotional well-being of young children. © 2017 The British Psychological Society.

  12. A test of multiple hypotheses for the function of call sharing in female budgerigars, Melopsittacus undulatus

    PubMed Central

    Young, Anna M.; Cordier, Breanne; Mundry, Roger; Wright, Timothy F.

    2014-01-01

    In many social species, group members share acoustically similar calls. Functional hypotheses have been proposed for call sharing, but previous studies have been limited by an inability to distinguish among these hypotheses. We examined the function of vocal sharing in female budgerigars with a two-part experimental design that allowed us to distinguish between two functional hypotheses. The social association hypothesis proposes that shared calls help animals mediate affiliative and aggressive interactions, while the password hypothesis proposes that shared calls allow animals to distinguish group identity and exclude nonmembers. We also tested the labeling hypothesis, a mechanistic explanation which proposes that shared calls are used to address specific individuals within the sender–receiver relationship. We tested the social association hypothesis by creating four-member flocks of unfamiliar female budgerigars (Melopsittacus undulatus) and then monitoring the birds’ calls, social behaviors, and stress levels via fecal glucocorticoid metabolites. We tested the password hypothesis by moving immigrants into established social groups. To test the labeling hypothesis, we conducted additional recording sessions in which individuals were paired with different group members. The social association hypothesis was supported by the development of multiple shared call types in each cage and a correlation between the number of shared call types and the number of aggressive interactions between pairs of birds. We also found support for calls serving as a labeling mechanism using discriminant function analysis with a permutation procedure. Our results did not support the password hypothesis, as there was no difference in stress or directed behaviors between immigrant and control birds. PMID:24860236

  13. Cognitive processing in bipolar disorder conceptualized using the Interactive Cognitive Subsystems (ICS) model

    PubMed Central

    Lomax, C. L.; Barnard, P. J.; Lam, D.

    2009-01-01

    Background There are few theoretical proposals that attempt to account for the variation in affective processing across different affective states of bipolar disorder (BD). The Interacting Cognitive Subsystems (ICS) framework has been recently extended to account for manic states. Within the framework, positive mood state is hypothesized to tap into an implicational level of processing, which is proposed to be more extreme in states of mania. Method Thirty individuals with BD and 30 individuals with no history of affective disorder were tested in euthymic mood state and then in induced positive mood state using the Question–Answer task to examine the mode of processing of schemas. The task was designed to test whether individuals would detect discrepancies within the prevailing schemas of the sentences. Results Although the present study did not support the hypothesis that the groups differ in their ability to detect discrepancies within schemas, we did find that the BD group was significantly more likely than the control group to answer questions that were consistent with the prevailing schemas, both before and after mood induction. Conclusions These results may reflect a general cognitive bias, that individuals with BD have a tendency to operate at a more abstract level of representation. This may leave an individual prone to affective disturbance, although further research is required to replicate this finding. PMID:18796173

  14. Reversing gestational undernutrition via kick-starting early growth

    USDA-ARS?s Scientific Manuscript database

    Poor maternal nutrition enhances chronic disease risk in the offspring. The conceptual framework for this association is provided by the developmental origins of health and disease (DOHaD) hypothesis, which suggests that unfavorable prenatal and postnatal environments lead to permanent alterations i...

  15. Phase II design with sequential testing of hypotheses within each stage.

    PubMed

    Poulopoulou, Stavroula; Karlis, Dimitris; Yiannoutsos, Constantin T; Dafni, Urania

    2014-01-01

    The main goal of a Phase II clinical trial is to decide, whether a particular therapeutic regimen is effective enough to warrant further study. The hypothesis tested by Fleming's Phase II design (Fleming, 1982) is H0: p ≤ p0 versus H1: p ≥ p1, with level α and with a power 1 − β at p = p1, where p0 is chosen to represent the response probability achievable with standard treatment and p1 is chosen such that the difference p1 − p0 represents a targeted improvement with the new treatment. This hypothesis creates a misinterpretation mainly among clinicians that rejection of the null hypothesis is tantamount to accepting the alternative, and vice versa. As mentioned by Storer (1992), this introduces ambiguity in the evaluation of type I and II errors and the choice of the appropriate decision at the end of the study. Instead of testing this hypothesis, an alternative class of designs is proposed in which two hypotheses are tested sequentially. The hypothesis H0: p ≤ p0 versus H1: p > p0 is tested first. If this null hypothesis is rejected, the hypothesis H0: p ≤ p1 versus H1: p > p1 is tested next, in order to examine whether the therapy is effective enough to consider further testing in a Phase III study. For the derivation of the proposed design the exact binomial distribution is used to calculate the decision cut-points. The optimal design parameters are chosen, so as to minimize the average sample number (ASN) under specific upper bounds for error levels. The optimal values for the design were found using a simulated annealing method.
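
    The exact-binomial cut-point calculation at the heart of such designs can be sketched as follows (my illustration; n, p0, p1, and alpha are hypothetical, and a real design would also optimize stage sizes against the ASN):

        from scipy.stats import binom

        n, p0, p1, alpha = 40, 0.20, 0.40, 0.05

        # smallest r such that rejecting H0: p <= p0 when X >= r keeps size <= alpha
        for r in range(n + 1):
            size = binom.sf(r - 1, n, p0)   # P(X >= r | p0)
            if size <= alpha:
                break
        power = binom.sf(r - 1, n, p1)      # P(X >= r | p1)
        print(f"reject H0 when responses >= {r}; size = {size:.4f}, power at p1 = {power:.3f}")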

  16. An Extension of RSS-based Model Comparison Tests for Weighted Least Squares

    DTIC Science & Technology

    2012-08-22

    use the model comparison test statistic to analyze the null hypothesis. Under the null hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS^H) = 10.3040 × 10^6. Under the alternative hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS) = 8.8394 × 10^6. Thus the model
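
    A hedged reconstruction of the kind of statistic this fragment refers to (a Banks-type RSS/WLS model comparison, where U = n (J_H - J) / J is asymptotically chi-squared with r degrees of freedom; only the two cost values come from the abstract, while n and r here are hypothetical):

        from scipy.stats import chi2

        J_H = 10.3040e6   # WLS cost under the null (constrained) hypothesis
        J = 8.8394e6      # WLS cost under the alternative (unconstrained) fit
        n, r = 500, 2     # hypothetical sample size and number of constraints

        U = n * (J_H - J) / J
        p = chi2.sf(U, df=r)
        print(f"U = {U:.1f}, p = {p:.3g}")   # large U argues against the constrained model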

  17. Energy conservation and the demand for labor: evidence from aggregate data. Final report Oct 80-May 81

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgenstern, R.; Vroman, W.

    1981-05-01

    The objective of this study was to explore the relationship between aggregate employment and output, with special reference to the 1979-1980 experience. An indirect relationship was posited between energy price shocks and the occupational mix. Specifically, it was hypothesized that the energy price shocks of 1978-1979 (and possibly 1973-1974) may have created new investment opportunities for simple, short term energy conservation type investments which may, in turn, have increased the demand for blue collar labor. A framework for viewing the problem is described and the basic hypothesis tested by estimating a time series model of occupational/industrial employment patterns. Included in the model is a set of variables designed to measure the energy price shocks.

  18. On the use of attractor dimension as a feature in structural health monitoring

    USGS Publications Warehouse

    Nichols, J.M.; Virgin, L.N.; Todd, M.D.; Nichols, J.D.

    2003-01-01

    Recent works in the vibration-based structural health monitoring community have emphasised the use of correlation dimension as a discriminating statistic in separating a damaged from an undamaged response. This paper explores the utility of attractor dimension as a 'feature' and offers some comparisons between different metrics reflecting dimension. The focus is on evaluating the performance of two different measures of dimension as damage indicators in a structural health monitoring context. Results indicate that the correlation dimension is probably a poor choice of statistic for the purpose of signal discrimination. Other measures of dimension may be used for the same purposes with a higher degree of statistical reliability. The question of competing methodologies is placed in a hypothesis testing framework and answered with experimental data taken from a cantilevered beam.

  19. Proportional reasoning as a heuristic-based process: time constraint and dual task considerations.

    PubMed

    Gillard, Ellen; Van Dooren, Wim; Schaeken, Walter; Verschaffel, Lieven

    2009-01-01

    The present study interprets the overuse of proportional solution methods from a dual process framework. Dual process theories claim that analytic operations involve time-consuming executive processing, whereas heuristic operations are fast and automatic. In two experiments to test whether proportional reasoning is heuristic-based, the participants solved "proportional" problems, for which proportional solution methods provide correct answers, and "nonproportional" problems known to elicit incorrect answers based on the assumption of proportionality. In Experiment 1, the available solution time was restricted. In Experiment 2, the executive resources were burdened with a secondary task. Both manipulations induced an increase in proportional answers and a decrease in correct answers to nonproportional problems. These results support the hypothesis that the choice for proportional methods is heuristic-based.

  20. Decision-making when data and inferences are not conclusive: risk-benefit and acceptable regret approach.

    PubMed

    Hozo, Iztok; Schell, Michael J; Djulbegovic, Benjamin

    2008-07-01

    The absolute truth in research is unobtainable, as no evidence or research hypothesis is ever 100% conclusive. Therefore, all data and inferences can in principle be considered as "inconclusive." Scientific inference and decision-making need to take into account errors, which are unavoidable in the research enterprise. The errors can occur at the level of conclusions, which aim to discern the truthfulness of the research hypothesis based on the accuracy of research evidence and hypothesis, and at the level of decisions, the goal of which is to enable optimal decision-making under present and specific circumstances. To optimize the chance of both correct conclusions and correct decisions, the synthesis of all major statistical approaches to clinical research is needed. The integration of these approaches (frequentist, Bayesian, and decision-analytic) can be accomplished through formal risk:benefit (R:B) analysis. This chapter illustrates the rational choice of a research hypothesis using R:B analysis based on a decision-theoretic expected utility framework and the concept of "acceptable regret" to calculate the threshold probability of the "truth" above which the benefit of accepting a research hypothesis outweighs its risks.
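
    The acceptable-regret idea reduces, in its simplest expected-utility form, to a threshold probability; this toy calculation (my own, with hypothetical utilities) shows the arithmetic:

        def threshold_probability(benefit, risk):
            """P(hypothesis true) above which acting on it has positive
            expected utility: p * benefit - (1 - p) * risk > 0."""
            return risk / (risk + benefit)

        benefit = 4.0   # hypothetical utility gained if the hypothesis is true
        risk = 1.0      # hypothetical utility lost if it is false
        print(f"act on the hypothesis when P(true) > {threshold_probability(benefit, risk):.2f}")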

  1. Pursuing realistic hydrologic model under SUPERFLEX framework in a semi-humid catchment in China

    NASA Astrophysics Data System (ADS)

    Wei, Lingna; Savenije, Hubert H. G.; Gao, Hongkai; Chen, Xi

    2016-04-01

    Model realism is perpetually pursued by hydrologists for flood and drought prediction, integrated water resources management, and decision support for water security. "Physics-based" distributed hydrologic models are being developed rapidly, but they face considerable challenges, for instance low computational efficiency and parameter uncertainty. This study tested, step by step, four conceptual hydrologic models under the SUPERFLEX framework in a small semi-humid catchment in the southern Huai River basin of China. The original lumped FLEXL hypothesizes a model structure of four reservoirs representing canopy interception, the unsaturated zone, subsurface flow with fast and slow components, and base flow storage. To account for spatially uneven rainfall, the second model (FLEXD) applies the same parameter set to separate units controlled by different rain gauges. To reveal the effect of topography, the terrain descriptor height above the nearest drainage (HAND), combined with slope, is applied to classify the experimental catchment into two landscapes. The third model (FLEXTOPO) then builds a separate model block for each landscape, reflecting the dominant hydrologic process associated with its topographic conditions. The fourth, named FLEXTOPOD, integrates the parallel framework of FLEXTOPO across the four rain-gauge units to capture the spatial variability of rainfall patterns and topographic features. Through pairwise comparison, our results suggest that: (1) the semi-distributed models (FLEXD and FLEXTOPOD), which take the spatial heterogeneity of precipitation into account, improve model performance with a parsimonious parameter set; and (2) a hydrologic model architecture flexible enough to reflect the perceived dominant hydrologic processes can incorporate the local terrain circumstances of each landscape. The modeling choices thereby coincide with catchment behaviour and come close to "reality". The presented methodology regards a hydrologic model as a tool for testing hypotheses and deepening our understanding of hydrologic processes, which will be helpful in improving model realism.

  2. Testing the Hydrological Coherence of High-Resolution Gridded Precipitation and Temperature Data Sets

    NASA Astrophysics Data System (ADS)

    Laiti, L.; Mallucci, S.; Piccolroaz, S.; Bellin, A.; Zardi, D.; Fiori, A.; Nikulin, G.; Majone, B.

    2018-03-01

    Assessing the accuracy of gridded climate data sets is highly relevant to climate change impact studies, since evaluation, bias correction, and statistical downscaling of climate models commonly use these products as reference. Among all impact studies, those addressing hydrological fluxes are the most affected by the errors and biases plaguing these data. This paper introduces a framework, coined the Hydrological Coherence Test (HyCoT), for assessing the hydrological coherence of gridded data sets with hydrological observations. HyCoT provides a framework for excluding meteorological forcing data sets not complying with observations, as a function of the particular goal at hand. The proposed methodology allows falsifying the hypothesis that a given data set is coherent with hydrological observations on the basis of the performance of hydrological modeling, as measured by a metric selected by the modeler. HyCoT is demonstrated in the Adige catchment (southeastern Alps, Italy) for streamflow analysis, using a distributed hydrological model. The comparison covers the period 1989-2008 and includes five gridded daily meteorological data sets: E-OBS, MSWEP, MESAN, APGD, and ADIGE. The analysis highlights that APGD and ADIGE, the data sets with the highest effective resolution, display similar spatiotemporal precipitation patterns and produce the largest hydrological efficiency indices. Lower performances are observed for E-OBS, MESAN, and MSWEP, especially in small catchments. HyCoT reveals deficiencies in the representation of spatiotemporal patterns of gridded climate data sets, which cannot be corrected by simply rescaling the meteorological forcing fields, as is often done in bias correction of climate model outputs. We recommend this framework for assessing the hydrological coherence of gridded data sets to be used in large-scale hydroclimatic studies.
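
    One plausible building block for such a coherence screen (my sketch; the metric and the threshold are illustrative choices, not the paper's exact criterion) is a hydrological efficiency score computed per forcing data set:

        import numpy as np

        def nse(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 is perfect; <= 0 is no better than
            predicting the observed mean."""
            obs, sim = np.asarray(observed), np.asarray(simulated)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        obs = [12.0, 18.5, 30.2, 22.1, 15.4, 11.8]   # hypothetical observed daily flows
        sim = [11.2, 20.1, 27.9, 23.5, 14.8, 12.6]   # flows simulated from one forcing set
        score = nse(obs, sim)
        print(f"NSE = {score:.3f}; exclude the forcing set if NSE < 0.5 (example threshold)")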

  3. Hypothesis testing of scientific Monte Carlo calculations.

    PubMed

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
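
    In the spirit of that proposal, a minimal sketch (my example, not the paper's code): estimate pi by Monte Carlo, then test the estimate against the known answer so that an improbably large deviation flags a bug or misreported error bars:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n = 100_000
        hits = rng.random(n) ** 2 + rng.random(n) ** 2 <= 1.0
        estimate = 4.0 * hits.mean()
        stderr = 4.0 * hits.std(ddof=1) / np.sqrt(n)

        z = (estimate - np.pi) / stderr
        p = 2.0 * stats.norm.sf(abs(z))   # two-sided p-value
        print(f"pi_hat = {estimate:.4f} +/- {stderr:.4f}, z = {z:+.2f}, p = {p:.3f}")
        # an automated suite would assert p above a small threshold, e.g. p > 1e-3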

  4. Hypothesis testing of scientific Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.

  5. Concreteness effects in short-term memory: a test of the item-order hypothesis.

    PubMed

    Roche, Jaclynn; Tolan, G Anne; Tehan, Gerald

    2011-12-01

    The following experiments explore word length and concreteness effects in short-term memory within an item-order processing framework. This framework asserts that order memory is better for those items that are relatively easy to process at the item level, whereas words that are difficult to process benefit at the item level from the increased attention/resources applied to them. The prediction of the model is that differential item and order processing can be detected in episodic tasks that differ in the degree to which item or order memory is required by the task. The item-order account has been applied to the word length effect such that there is a short word advantage in serial recall but a long word advantage in item recognition. The current experiment considered the possibility that concreteness effects might be explained within the same framework. In two experiments, word length (Experiment 1) and concreteness (Experiment 2) are examined using forward serial recall, backward serial recall, and item recognition. The results for word length replicate previous studies showing the dissociation in item and order tasks. The same was not true for the concreteness effect: in all three tasks concrete words were better remembered than abstract words. The concreteness effect cannot be explained in terms of an item-order trade-off. PsycINFO Database Record © 2011 APA, all rights reserved.

  6. Combining statistical inference and decisions in ecology.

    PubMed

    Williams, Perry J; Hooten, Mevin B

    2016-09-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
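
    A compact illustration of the role loss functions play in SDT point estimation (my toy posterior; the correspondences shown are standard decision-theory results):

        import numpy as np
        from scipy import stats

        posterior = stats.gamma(a=3.0, scale=2.0)    # hypothetical posterior for a rate

        mean = posterior.mean()                      # Bayes estimate under squared-error loss
        median = posterior.ppf(0.5)                  # Bayes estimate under absolute loss
        grid = np.linspace(0.01, 30.0, 10_000)
        mode = grid[np.argmax(posterior.pdf(grid))]  # approx. Bayes estimate under 0-1 loss
        print(f"mean = {mean:.2f}, median = {median:.2f}, mode = {mode:.2f}")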

  7. Sex ratios in the two Germanies: a test of the economic stress hypothesis.

    PubMed

    Catalano, Ralph A

    2003-09-01

    Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.

  8. Understanding suicide terrorism: premature dismissal of the religious-belief hypothesis.

    PubMed

    Liddle, James R; Machluf, Karin; Shackelford, Todd K

    2010-07-06

    We comment on work by Ginges, Hansen, and Norenzayan (2009), in which they compare two hypotheses for predicting individual support for suicide terrorism: the religious-belief hypothesis and the coalitional-commitment hypothesis. Although we appreciate the evidence provided in support of the coalitional-commitment hypothesis, we argue that their method of testing the religious-belief hypothesis is conceptually flawed, thus calling into question their conclusion that the religious-belief hypothesis has been disconfirmed. In addition to critiquing the methodology implemented by Ginges et al., we provide suggestions on how the religious-belief hypothesis may be properly tested. It is possible that the premature and unwarranted conclusions reached by Ginges et al. may deter researchers from examining the effect of specific religious beliefs on support for terrorism, and we hope that our comments can mitigate this possibility.

  9. Male resource defense mating system in primates? An experimental test in wild capuchin monkeys.

    PubMed

    Tiddi, Barbara; Heistermann, Michael; Fahy, Martin K; Wheeler, Brandon C

    2018-01-01

    Ecological models of mating systems provide a theoretical framework to predict the effect of the defendability of both breeding resources and mating partners on mating patterns. In resource-based mating systems, male control over breeding resources is tightly linked to female mating preference. To date, few field studies have experimentally investigated the relationship between male resource control and female mating preference in mammals due to difficulties in manipulating ecological factors (e.g., food contestability). We tested the within-group male resource defense hypothesis experimentally in a wild population of black capuchin monkeys (Sapajus nigritus) in Iguazú National Park, Argentina. Sapajus spp. represent an ideal study model as, in contrast to most primates, they have been previously argued to be characterized by female mate choice and a resource-based mating system in which within-group resource monopolization by high-ranking males drives female mating preference for those males. Here, we examined whether females (N = 12) showed a weaker preference for alpha males during mating seasons in which food distribution was experimentally manipulated to be less defendable relative to those in which it was highly defendable. Results did not support the within-group male resource defense hypothesis, as female sexual preferences for alpha males did not vary based on food defendability. We discuss possible reasons for our results, including the possibility of other direct and indirect benefits females receive in exercising mate choice, the potential lack of tolerance over food directed towards females by alpha males, and phylogenetic constraints.

  10. How small could a pup sound? The physical bases of signaling body size in harbor seals

    PubMed Central

    Gross, Stephanie; Garcia, Maxime; Rubio-Garcia, Ana; de Boer, Bart

    2017-01-01

    Vocal communication is a crucial aspect of animal behavior. The mechanism which most mammals use to vocalize relies on three anatomical components. First, air overpressure is generated inside the lower vocal tract. Second, as the airstream goes through the glottis, sound is produced via vocal fold vibration. Third, this sound is further filtered by the geometry and length of the upper vocal tract. Evidence from mammalian anatomy and bioacoustics suggests that some of these three components may covary with an animal’s body size. The framework provided by acoustic allometry suggests that, because vocal tract length (VTL) is more strongly constrained by the growth of the body than vocal fold length (VFL), VTL generates more reliable acoustic cues to an animal’s size. This hypothesis is often tested acoustically but rarely anatomically, especially in pinnipeds. Here, we test the anatomical bases of the acoustic allometry hypothesis in harbor seal pups Phoca vitulina. We dissected and measured vocal tract, vocal folds, and other anatomical features of 15 harbor seals post-mortem. We found that, while VTL correlates with body size, VFL does not. This suggests that, while body growth puts anatomical constraints on how vocalizations are filtered by harbor seals’ vocal tract, no such constraints appear to exist on vocal folds, at least during puppyhood. It is particularly interesting to find anatomical constraints on harbor seals’ vocal tracts, the same anatomical region partially enabling pups to produce individually distinctive vocalizations. PMID:29492005

  11. The social nature of primate cognition

    PubMed Central

    Barrett, Louise; Henzi, Peter

    2005-01-01

    The hypothesis that the enlarged brain size of the primates was selected for by social, rather than purely ecological, factors has been strongly influential in studies of primate cognition and behaviour over the past two decades. However, the Machiavellian intelligence hypothesis, also known as the social brain hypothesis, tends to emphasize certain traits and behaviours, like exploitation and deception, at the expense of others, such as tolerance and behavioural coordination, and therefore presents only one view of how social life may shape cognition. This review outlines work from other relevant disciplines, including evolutionary economics, cognitive science and neurophysiology, to illustrate how these can be used to build a more general theoretical framework, incorporating notions of embodied and distributed cognition, in which to situate questions concerning the evolution of primate social cognition. PMID:16191591

  12. Testing hypotheses and the advancement of science: recent attempts to falsify the equilibrium point hypothesis.

    PubMed

    Feldman, Anatol G; Latash, Mark L

    2005-02-01

    Criticisms of the equilibrium point (EP) hypothesis have recently appeared that are based on misunderstandings of some of its central notions. Starting from such interpretations of the hypothesis, incorrect predictions are made and tested. When the incorrect predictions prove false, the hypothesis is claimed to be falsified. In particular, the hypothesis has been rejected based on the wrong assumptions that it conflicts with empirically defined joint stiffness values or that it is incompatible with violations of equifinality under certain velocity-dependent perturbations. Typically, such attempts use notions describing the control of movements of artificial systems in place of physiologically relevant ones. While appreciating constructive criticisms of the EP hypothesis, we feel that incorrect interpretations have to be clarified by reiterating what the EP hypothesis does and does not predict. We conclude that the recent claims of falsifying the EP hypothesis and the calls for its replacement by the EMG-force control hypothesis are unsubstantiated. The EP hypothesis goes far beyond the EMG-force control view. In particular, the former offers a resolution for the famous posture-movement paradox while the latter fails to resolve it.

  13. Testing option pricing with the Edgeworth expansion

    NASA Astrophysics Data System (ADS)

    Balieiro Filho, Ruy Gabriel; Rosenfeld, Rogerio

    2004-12-01

    There is a well-developed framework, the Black-Scholes theory, for the pricing of contracts based on the future prices of certain assets, called options. This theory assumes that the probability distribution of the returns of the underlying asset is a Gaussian distribution. However, it is observed in the market that this hypothesis is flawed, leading to the introduction of a fudge factor, the so-called volatility smile. Therefore, it would be interesting to explore extensions of the Black-Scholes theory to non-Gaussian distributions. In this paper, we provide an explicit formula for the price of an option when the distribution of the returns of the underlying asset is parametrized by an Edgeworth expansion, which allows for the introduction of higher independent moments of the probability distribution, namely skewness and kurtosis. We test our formula with options in the Brazilian and American markets, showing that the volatility smile can be reduced. We also check whether our approach leads to more efficient hedging strategies of these instruments.
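
    For context, here is a minimal implementation of the Gaussian (Black-Scholes) baseline that the Edgeworth expansion corrects; the skewness and kurtosis correction terms of the authors' formula are not reproduced here, and all parameter values are illustrative.

```python
# Minimal Black-Scholes European call price: the Gaussian baseline
# that an Edgeworth expansion would correct with skewness/kurtosis
# terms (those corrections are not included in this sketch).
from math import exp, log, sqrt
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Call price under Gaussian (Black-Scholes) return assumptions."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

print(bs_call(S=100, K=100, T=0.5, r=0.05, sigma=0.2))  # illustrative inputs
```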

  14. Numerical Study of Electrostatic Field Distortion on LPTPC End-Plates based on Bulk Micromegas Modules

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Purba; Bhattacharya, Deb Sankar; Mukhopadhyay, Supratik; Majumdar, Nayana; Bhattacharya, Sudeb; Colas, Paul; Attié, David

    2018-02-01

    The R&D activities for the linear collider TPC (LC-TPC) currently focus on the adoption of micro-pattern devices for the gaseous amplification stage. Several beam tests have been carried out at DESY with a 5 GeV electron beam in a 1 T superconducting magnet. We worked on a large prototype TPC with an end-plate that was built, for the first time, using seven resistive bulk Micromegas modules. During experiments, reduced signal sensitivity was observed at the boundary of these modules. Electrostatic field distortion near the module boundaries was considered to be the possible major reason behind these observations. In the present work, we will explore this hypothesis through numerical simulation. Our aim has been to understand the origin of distortions observed close to the edges of the test beam modules and to explore the possibility of using the Garfield simulation framework for investigating a phenomenon as complex as distortion.

  15. Action perception as hypothesis testing.

    PubMed

    Donnarumma, Francesco; Costantini, Marcello; Ambrosini, Ettore; Friston, Karl; Pezzulo, Giovanni

    2017-04-01

    We present a novel computational model that describes action perception as an active inferential process that combines motor prediction (the reuse of our own motor system to predict perceived movements) and hypothesis testing (the use of eye movements to disambiguate amongst hypotheses). The system uses a generative model of how (arm and hand) actions are performed to generate hypothesis-specific visual predictions, and directs saccades to the most informative places of the visual scene to test these predictions - and underlying hypotheses. We test the model using eye movement data from a human action observation study. In both the human study and our model, saccades are proactive whenever context affords accurate action prediction; but uncertainty induces a more reactive gaze strategy, via tracking the observed movements. Our model offers a novel perspective on action observation that highlights its active nature based on prediction dynamics and hypothesis testing. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. The neural signature of emotional memories in serial crimes.

    PubMed

    Chassy, Philippe

    2017-10-01

    Neural plasticity is the process whereby semantic information and emotional responses are stored in neural networks. It is hypothesized that the neural networks built over time to encode the sexual fantasies that motivate serial killers to act should display a unique, detectable activation pattern. The pathological neural watermark hypothesis posits that such networks comprise activation of brain sites that reflect four cognitive components: autobiographical memory, sexual arousal, aggression, and control over aggression. The neural sites performing these cognitive functions have been successfully identified by previous research. The key findings are reviewed to hypothesize the typical pattern of activity that serial killers should display. Through the integration of biological findings into one framework, the neural approach proposed in this paper is in stark contrast with the many theories accounting for serial killers that offer non-medical taxonomies. The pathological neural watermark hypothesis offers a new framework to understand and detect deviant individuals. The technical and legal issues are briefly discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Confidence intervals for single-case effect size measures based on randomization test inversion.

    PubMed

    Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick

    2017-02-01

    In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100(1 - α)% two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
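
    A minimal sketch of the RTI idea for the unstandardized mean difference follows, assuming a completely randomized two-condition design and a grid search over candidate θ0 values; it is an illustration in Python, not the authors' supplementary R code, and the scores are hypothetical.

```python
# Hedged sketch of randomization test inversion (RTI): the CI for the
# mean difference collects every theta0 that the randomization test
# cannot reject at level alpha. Toy data; not the authors' R code.
import numpy as np
from itertools import combinations

def randomization_p(a, b):
    """Two-sided randomization test p-value for H0: mean(a) - mean(b) = 0."""
    data = np.concatenate([a, b])
    observed = abs(a.mean() - b.mean())
    n, k = len(data), len(a)
    hits, total = 0, 0
    for idx in combinations(range(n), k):      # all reassignments
        sel = np.zeros(n, dtype=bool)
        sel[list(idx)] = True
        if abs(data[sel].mean() - data[~sel].mean()) >= observed - 1e-12:
            hits += 1
        total += 1
    return hits / total

def rti_ci(a, b, alpha=0.05):
    """CI = all theta0 values not rejected after shifting a by theta0."""
    d = a.mean() - b.mean()
    grid = np.linspace(d - 10, d + 10, 401)    # candidate theta0 values
    kept = [t0 for t0 in grid if randomization_p(a - t0, b) > alpha]
    return min(kept), max(kept)

a = np.array([8.0, 9.0, 7.0, 10.0, 9.0])   # hypothetical treatment scores
b = np.array([4.0, 5.0, 6.0, 5.0, 4.0])    # hypothetical control scores
print(rti_ci(a, b))
```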

  18. Visualizing statistical significance of disease clusters using cartograms.

    PubMed

    Kronenfeld, Barry J; Wong, David W S

    2017-05-15

    Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, no existing guidelines cover the visual assessment of statistical uncertainty. To address this shortcoming, we develop techniques for visual determination of statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area, and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference of aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence analysis in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and the appropriate framework for visually assessing the statistical significance of spatial clusters.

  19. Picture-Perfect Is Not Perfect for Metamemory: Testing the Perceptual Fluency Hypothesis with Degraded Images

    ERIC Educational Resources Information Center

    Besken, Miri

    2016-01-01

    The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…

  20. Adolescents' Body Image Trajectories: A Further Test of the Self-Equilibrium Hypothesis

    ERIC Educational Resources Information Center

    Morin, Alexandre J. S.; Maïano, Christophe; Scalas, L. Francesca; Janosz, Michel; Litalien, David

    2017-01-01

    The self-equilibrium hypothesis underlines the importance of having a strong core self, which is defined as a high and developmentally stable self-concept. This study tested this hypothesis in relation to body image (BI) trajectories in a sample of 1,006 adolescents (M_age = 12.6, including 541 males and 465 females) across a 4-year…

  1. Using the Coefficient of Confidence to Make the Philosophical Switch from a Posteriori to a Priori Inferential Statistics

    ERIC Educational Resources Information Center

    Trafimow, David

    2017-01-01

    There has been much controversy over the null hypothesis significance testing procedure, with much of the criticism centered on the problem of inverse inference. Specifically, p gives the probability of the finding (or one more extreme) given the null hypothesis, whereas the null hypothesis significance testing procedure involves drawing a…

  2. Does Merit-Based Aid Improve College Affordability? Testing the Bennett Hypothesis in the Era of Merit-Based Aid

    ERIC Educational Resources Information Center

    Lee, Jungmin

    2016-01-01

    This study tested the Bennett hypothesis by examining whether four-year colleges changed listed tuition and fees, the amount of institutional grants per student, and room and board charges after their states implemented statewide merit-based aid programs. According to the Bennett hypothesis, increases in government financial aid make it easier for…

  3. Human female orgasm as evolved signal: a test of two hypotheses.

    PubMed

    Ellsworth, Ryan M; Bailey, Drew H

    2013-11-01

    We present the results of a study designed to empirically test predictions derived from two hypotheses regarding human female orgasm behavior as an evolved communicative trait or signal. One hypothesis tested was the female fidelity hypothesis, which posits that human female orgasm signals a woman's sexual satisfaction and therefore her likelihood of future fidelity to a partner. The other was the sire choice hypothesis, which posits that women's orgasm behavior signals increased chances of fertilization. To test the two hypotheses of human female orgasm, we administered a questionnaire to 138 females and 121 males who reported that they were currently in a romantic relationship. Key predictions of the female fidelity hypothesis were not supported. In particular, orgasm was not associated with female sexual fidelity nor was orgasm associated with male perceptions of partner sexual fidelity. However, faked orgasm was associated with female sexual infidelity and lower male relationship satisfaction. Overall, results were in greater support of the sire choice signaling hypothesis than the female fidelity hypothesis. Results also suggest that male satisfaction with, investment in, and sexual fidelity to a mate are benefits that favored the selection of orgasmic signaling in ancestral females.

  4. Sex-Biased Parental Investment among Contemporary Chinese Peasants: Testing the Trivers-Willard Hypothesis.

    PubMed

    Luo, Liqun; Zhao, Wei; Weng, Tangmei

    2016-01-01

    The Trivers-Willard hypothesis predicts that high-status parents will bias their investment to sons, whereas low-status parents will bias their investment to daughters. Among humans, tests of this hypothesis have yielded mixed results. This study tests the hypothesis using data collected among contemporary peasants in Central South China. We use current family status (rated by our informants) and father's former class identity (assigned by the Chinese Communist Party in the early 1950s) as measures of parental status, and proportion of sons in offspring and offspring's years of education as measures of parental investment. Results show that (i) those families with a higher former class identity such as landlord and rich peasant tend to have a higher socioeconomic status currently, (ii) high-status parents are more likely to have sons than daughters among their biological offspring, and (iii) in higher-status families, the years of education obtained by sons exceed that obtained by daughters to a larger extent than in lower-status families. Thus, the first assumption and the two predictions of the hypothesis are supported by this study. This article contributes a contemporary Chinese case to the testing of the Trivers-Willard hypothesis.

  5. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    PubMed

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative, analyzing Mini-Mental State Exam (MMSE) scores with a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that, despite a large amount of missing data, accelerated decline did occur in MMSE scores among AD patients. Our finding supports the clinical belief of the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
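
    The following sketch illustrates the test's logic with fixed effects only: fit a linear model under the null and a bilinear (broken-stick) model with an unknown change point under the alternative, then simulate the null distribution of the likelihood-ratio-type statistic by parametric bootstrap. The random-slope/random-intercept structure of the paper's model is omitted, and the data are synthetic.

```python
# Hedged sketch: linear (H0) vs bilinear with unknown change point (H1),
# null distribution of the statistic simulated by parametric bootstrap.
# Fixed effects only; synthetic data.
import numpy as np

def rss_linear(t, y):
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return ((y - X @ beta) ** 2).sum()

def rss_bilinear(t, y):
    best = np.inf
    for c in t[2:-2]:  # profile over interior candidate change points
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - c, 0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        best = min(best, ((y - X @ beta) ** 2).sum())
    return best

def changepoint_test(t, y, n_boot=500, seed=1):
    n = len(t)
    stat = n * np.log(rss_linear(t, y) / rss_bilinear(t, y))
    # Parametric bootstrap under H0: refit the linear model, resample.
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma = np.sqrt(rss_linear(t, y) / n)
    rng = np.random.default_rng(seed)
    null = [n * np.log(rss_linear(t, yb) / rss_bilinear(t, yb))
            for yb in X @ beta + rng.normal(0, sigma, (n_boot, n))]
    return stat, float(np.mean(np.array(null) >= stat))  # statistic, p-value

t = np.linspace(0, 6, 25)  # years of follow-up (synthetic)
y = 28 - 1.0 * t - 2.5 * np.maximum(t - 3.5, 0) \
    + np.random.default_rng(2).normal(0, 1, 25)  # decline accelerates at 3.5
print(changepoint_test(t, y))
```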

  6. Detection of Undocumented Changepoints Using Multiple Test Statistics and Composite Reference Series.

    NASA Astrophysics Data System (ADS)

    Menne, Matthew J.; Williams, Claude N., Jr.

    2005-10-01

    An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite. In a comparison of single-test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from the serially complete and homogeneous component series. However, each of the evaluated composite series is not equally susceptible to the presence of changepoints in its components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on the averaging of the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated. A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.

  7. Applying Adverse Outcome Pathways (AOPs) to support Integrated Approaches to Testing and Assessment (IATA).

    PubMed

    Tollefsen, Knut Erik; Scholz, Stefan; Cronin, Mark T; Edwards, Stephen W; de Knecht, Joop; Crofton, Kevin; Garcia-Reyero, Natalia; Hartung, Thomas; Worth, Andrew; Patlewicz, Grace

    2014-12-01

    Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource intensive in terms of time, money and animal use to evaluate all chemicals under development or already on the market. The need for timely and robust decision making demands that regulatory toxicity testing becomes more cost-effective and efficient. One way to realize this goal is by being more strategic in directing testing resources; focusing on chemicals of highest concern, limiting testing to the most probable hazards, or targeting the most vulnerable species. Hypothesis-driven Integrated Approaches to Testing and Assessment (IATA) have been proposed as practical solutions to such strategic testing. In parallel, the development of the Adverse Outcome Pathway (AOP) framework, which provides information on the causal links between a molecular initiating event (MIE), intermediate key events (KEs) and an adverse outcome (AO) of regulatory concern, offers the biological context to facilitate development of IATA for regulatory decision making. This manuscript summarizes discussions at the Workshop entitled "Advancing AOPs for Integrated Toxicology and Regulatory Applications" with particular focus on the role AOPs play in informing the development of IATA for different regulatory purposes. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. An automated framework for hypotheses generation using literature.

    PubMed

    Abedi, Vida; Zand, Ramin; Yeasin, Mohammed; Faisal, Fazle Elahi

    2012-08-29

    In biomedicine, exploratory studies and hypothesis generation often begin with researching existing literature to identify a set of factors and their association with diseases, phenotypes, or biological processes. Many scientists are overwhelmed by the sheer volume of literature on a disease when they plan to generate a new hypothesis or study a biological phenomenon. The situation is even worse for junior investigators who often find it difficult to formulate new hypotheses or, more importantly, corroborate whether their hypothesis is consistent with existing literature. It is a daunting task to keep abreast of so much being published and also remember all combinations of direct and indirect associations. Fortunately, there is a growing trend of using literature mining and knowledge discovery tools in biomedical research. However, there is still a large gap between the huge amount of effort and resources invested in disease research and the little effort in harvesting the published knowledge. The proposed hypothesis generation framework (HGF) finds "crisp semantic associations" among entities of interest - that is a step towards bridging such gaps. The proposed HGF shares end goals similar to those of SWAN, but it is more holistic in nature and was designed and implemented using scalable and efficient computational models of disease-disease interaction. The integration of mapping ontologies with latent semantic analysis is critical in capturing domain-specific direct and indirect "crisp" associations, and making assertions about entities (such as disease X is associated with a set of factors Z). Pilot studies were performed using two diseases. A comparative analysis of the computed "associations" and "assertions" with curated expert knowledge was performed to validate the results. It was observed that the HGF is able to capture "crisp" direct and indirect associations, and provide knowledge discovery on demand. The proposed framework is fast, efficient, and robust in generating new hypotheses to identify factors associated with a disease. A fully integrated Web service application is being developed for wide dissemination of the HGF. A large-scale study by the domain experts and associated researchers is underway to validate the associations and assertions computed by the HGF.

  9. Acquisition of the novel name--nameless category (N3C) principle.

    PubMed

    Mervis, C B; Bertrand, J

    1994-12-01

    Toddlers' acquisition of the Novel Name-Nameless Category (N3C) principle was examined to investigate the developmental lexical principles framework and the applicability of the specificity hypothesis to relations involving lexical principles. In Study 1, we assessed the ability of 32 children between the ages of 16 and 20 months to use the N3C principle (operationally defined as the ability to fast map). As predicted, only some of the children could fast map. This finding provided evidence for a crucial tenet of the developmental lexical principles framework: Some lexical principles are not available at the start of language acquisition. Children who had acquired the N3C principle also had significantly larger vocabularies and were significantly more likely to demonstrate 2-category exhaustive sorting abilities than children who had not acquired the principle. The 2 groups of children did not differ in either age or object permanence abilities. The 16 children who could not fast map were followed longitudinally until they attained a vocabulary spurt; at that time, their ability to fast map was retested (Study 2). Results provided a longitudinal replication of the findings of Study 1. Implications of these findings for both the developmental lexical principles framework and the specificity hypothesis are discussed.

  10. Bayesian Methods for Determining the Importance of Effects

    USDA-ARS?s Scientific Manuscript database

    Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from the Fisher Significance Test and Hypothesis Test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...

  11. Testing for purchasing power parity in the long-run for ASEAN-5

    NASA Astrophysics Data System (ADS)

    Choji, Niri Martha; Sek, Siok Kun

    2017-04-01

    For more than a decade, there has been substantial interest in empirically testing the validity of the purchasing power parity (PPP) hypothesis. This paper tests for long-run relative purchasing power parity for a group of ASEAN-5 countries over the period 1996-2016 using monthly data. For this purpose, we used the Pedroni co-integration method to test for the long-run hypothesis of purchasing power parity. We first tested for the stationarity of the variables and found that the variables are non-stationary at levels but stationary at first difference. Results of the Pedroni test rejected the null hypothesis of no co-integration, meaning that we have enough evidence to support PPP in the long-run for the ASEAN-5 countries over the period of 1996-2016. In other words, the rejection of the null hypothesis implies a long-run relation between nominal exchange rates and relative prices.
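
    To illustrate the cointegration logic behind such a PPP test (though not the Pedroni panel statistic itself), the sketch below runs a two-series Engle-Granger cointegration test from statsmodels on synthetic exchange rate and relative price series.

```python
# Illustration of the cointegration idea underlying long-run PPP:
# a two-series Engle-Granger test, not the Pedroni panel statistic.
# Data are synthetic.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(4)
rel_price = np.cumsum(rng.normal(size=250))        # random walk: log relative prices
exch_rate = rel_price + rng.normal(0, 1.0, 250)    # cointegrated with it under PPP

t_stat, p_value, _ = coint(exch_rate, rel_price)
print(p_value)  # a small p rejects "no co-integration", consistent with PPP
```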

  12. UNIFORMLY MOST POWERFUL BAYESIAN TESTS

    PubMed Central

    Johnson, Valen E.

    2014-01-01

    Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829
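
    A minimal sketch for the normal-mean case with known variance follows, using the one-parameter exponential family result that the UMPBT(γ) alternative is μ1 = μ0 + σ√(2 ln γ / n); the data values are illustrative.

```python
# Hedged sketch of a UMPBT for a normal mean with known sigma:
# the UMPBT(gamma) point alternative is mu0 + sigma*sqrt(2*ln(gamma)/n),
# and the Bayes factor below is the point-vs-point likelihood ratio.
import numpy as np

def umpbt_alternative(mu0, sigma, n, gamma):
    """Alternative that maximizes P(Bayes factor > gamma) uniformly."""
    return mu0 + sigma * np.sqrt(2 * np.log(gamma) / n)

def bayes_factor(xbar, mu0, mu1, sigma, n):
    """BF of point alternative mu1 vs point null mu0, given sample mean."""
    return np.exp(n * (mu1 - mu0) * (xbar - (mu0 + mu1) / 2) / sigma**2)

mu0, sigma, n, gamma = 0.0, 1.0, 50, 10.0   # illustrative values
mu1 = umpbt_alternative(mu0, sigma, n, gamma)
print(mu1, bayes_factor(xbar=0.35, mu0=mu0, mu1=mu1, sigma=sigma, n=n))
```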

  13. Motor Synergies and the Equilibrium-Point Hypothesis

    PubMed Central

    Latash, Mark L.

    2010-01-01

    The article offers a way to unite three recent developments in the field of motor control and coordination: (1) The notion of synergies is introduced based on the principle of motor abundance; (2) The uncontrolled manifold hypothesis is described as offering a computational framework to identify and quantify synergies; and (3) The equilibrium-point hypothesis is described for a single muscle, single joint, and multi-joint systems. Merging these concepts into a single coherent scheme requires focusing on control variables rather than performance variables. The principle of minimal final action is formulated as the guiding principle within the referent configuration hypothesis. Motor actions are associated with setting two types of variables by a controller, those that ultimately define average performance patterns and those that define associated synergies. Predictions of the suggested scheme are reviewed, such as the phenomenon of anticipatory synergy adjustments, quick actions without changes in synergies, atypical synergies, and changes in synergies with practice. A few models are briefly reviewed. PMID:20702893

  14. Motor synergies and the equilibrium-point hypothesis.

    PubMed

    Latash, Mark L

    2010-07-01

    The article offers a way to unite three recent developments in the field of motor control and coordination: (1) The notion of synergies is introduced based on the principle of motor abundance; (2) The uncontrolled manifold hypothesis is described as offering a computational framework to identify and quantify synergies; and (3) The equilibrium-point hypothesis is described for a single muscle, single joint, and multijoint systems. Merging these concepts into a single coherent scheme requires focusing on control variables rather than performance variables. The principle of minimal final action is formulated as the guiding principle within the referent configuration hypothesis. Motor actions are associated with setting two types of variables by a controller, those that ultimately define average performance patterns and those that define associated synergies. Predictions of the suggested scheme are reviewed, such as the phenomenon of anticipatory synergy adjustments, quick actions without changes in synergies, atypical synergies, and changes in synergies with practice. A few models are briefly reviewed.

  15. Dynamically consistent parameterization of mesoscale eddies. Part III: Deterministic approach

    NASA Astrophysics Data System (ADS)

    Berloff, Pavel

    2018-07-01

    This work continues development of dynamically consistent parameterizations for representing mesoscale eddy effects in non-eddy-resolving and eddy-permitting ocean circulation models, and focuses on the classical double-gyre problem, in which the main dynamic eddy effects maintain the eastward jet extension of the western boundary currents and its adjacent recirculation zones via the eddy backscatter mechanism. Despite its fundamental importance, this mechanism remains poorly understood, and in this paper we first study it and then propose and test a novel parameterization of it. We start by decomposing the reference eddy-resolving flow solution into large-scale and eddy components defined by spatial filtering, rather than by the Reynolds decomposition. Next, we find that the eastward jet and its recirculations are robustly present not only in the large-scale flow itself, but also in the rectified time-mean eddies and in the transient rectified eddy component, which consists of highly anisotropic ribbons of opposite-sign potential vorticity anomalies straddling the instantaneous eastward jet core and responsible for its continuous amplification. The transient rectified component is separated from the flow by a novel remapping method. We hypothesize that the above three components of the eastward jet are ultimately driven by the small-scale transient eddy forcing via the eddy backscatter mechanism, rather than by the mean eddy forcing and large-scale nonlinearities. We verify this hypothesis by progressively turning down the backscatter and observing the induced flow anomalies. The backscatter analysis leads us to formulate the key eddy parameterization hypothesis: in an eddy-permitting model, at least partially resolved eddy backscatter can be significantly amplified to improve the flow solution. Such amplification is a simple and novel eddy parameterization framework, implemented here in terms of local, deterministic flow roughening controlled by a single parameter. We test the parameterization's skill in a hierarchy of non-eddy-resolving and eddy-permitting modifications of the original model and demonstrate that it can indeed be highly effective at restoring the eastward jet extension and its adjacent recirculation zones. The new deterministic parameterization framework not only combines remarkable simplicity with good performance but is also dynamically transparent; it therefore provides a powerful alternative to the common eddy diffusion and emerging stochastic parameterizations.

  16. [Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].

    PubMed

    Simmer, H H

    1980-07-01

    Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis, the nerve reflex starts in the ovary with an increase of the intraovarian pressure caused by the growing follicles. In 1884 Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women, by bimanual compression from the outside and through the vagina. His results were not convincing. Six years later, Strassmann injected fluids into the ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prediction derived from Pflüger's hypothesis. Thus, after a long interval, the hypothesis had become a paradigm. Though reasons can be given for the delay, it remains poorly understood why experimental testing started so late.

  17. When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment.

    PubMed

    Szucs, Denes; Ioannidis, John P A

    2017-01-01

    Null hypothesis significance testing (NHST) has several shortcomings that are likely contributing factors behind the widely debated replication crisis of (cognitive) neuroscience, psychology, and biomedical science in general. We review these shortcomings and suggest that, after sustained negative experience, NHST should no longer be the default, dominant statistical practice of all biomedical and psychological research. If theoretical predictions are weak, we should not rely on all-or-nothing hypothesis tests. Different inferential methods may be most suitable for different types of research questions. Whenever researchers use NHST, they should justify its use and publish pre-study power calculations and effect sizes, including negative findings. Hypothesis-testing studies should be pre-registered and, optimally, their raw data published. The current "statistics lite" educational approach for students that has sustained the widespread, spurious use of NHST should be phased out.

  18. When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment

    PubMed Central

    Szucs, Denes; Ioannidis, John P. A.

    2017-01-01

    Null hypothesis significance testing (NHST) has several shortcomings that are likely contributing factors behind the widely debated replication crisis of (cognitive) neuroscience, psychology, and biomedical science in general. We review these shortcomings and suggest that, after sustained negative experience, NHST should no longer be the default, dominant statistical practice of all biomedical and psychological research. If theoretical predictions are weak, we should not rely on all-or-nothing hypothesis tests. Different inferential methods may be most suitable for different types of research questions. Whenever researchers use NHST, they should justify its use and publish pre-study power calculations and effect sizes, including negative findings. Hypothesis-testing studies should be pre-registered and, optimally, their raw data published. The current "statistics lite" educational approach for students that has sustained the widespread, spurious use of NHST should be phased out. PMID:28824397
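
    As an example of the pre-study power calculations the authors recommend publishing, the sketch below solves for the per-group sample size of a two-sample t-test; the effect size and error rates are illustrative choices, not values from the paper.

```python
# Illustrative pre-study power calculation: sample size per group
# needed for a two-sample t-test at the stated design values.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,       # Cohen's d (assumed)
                                   alpha=0.05,
                                   power=0.80,
                                   alternative='two-sided')
print(round(n_per_group))  # ~64 participants per group
```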

  19. Improving Second Language Speaking Proficiency via Interactional Feedback

    ERIC Educational Resources Information Center

    Swanson, Peter B.; Schlig, Carmen

    2010-01-01

    Researchers have suggested that interactional feedback is associated with foreign/second language learning because it prompts learners to notice foreign/second language forms. Using Vygotsky's zone of proximal development and Long's interaction hypothesis as conceptual frameworks, this study explores the use of systematic explicit feedback to…

  20. A HYPOTHESIS-DRIVEN FRAMEWORK FOR ASSESSING CLIMATE INDUCED CHANGES IN COASTAL FINAL ECOSYSTEM GOODS AND SERVICES

    EPA Science Inventory

    Understanding how climate change will alter the availability of coastal final ecosystem goods and services (FEGS; such as food provisioning from fisheries, property protection, and recreation) has significant implications for coastal planning and the development of adaptive manag...

  1. Relations among Socioeconomic Status, Age, and Predictors of Phonological Awareness

    ERIC Educational Resources Information Center

    McDowell, Kimberly D.; Lonigan, Christopher J.; Goldstein, Howard

    2007-01-01

    Purpose: This study simultaneously examined predictors of phonological awareness within the framework of 2 theories: the phonological distinctness hypothesis and the lexical restructuring model. Additionally, age as a moderator of the relations between predictor variables and phonological awareness was examined. Method: This cross-sectional…

  2. Postpartum Adjustment in Primiparous Parents.

    ERIC Educational Resources Information Center

    Atkinson, A. Kathleen; Rickel, Annette U.

    Within the framework of the social stress and behavioral theories of depression, this study investigated the hypothesis that postpartum depression is a function of disruption of parents' prepartum functioning by the subsequent demands of infant caretaking. Seventy-eight primiparous married couples (N=156, 78 men and 78 women) volunteered to…

  3. Testing fundamental ecological concepts with a Pythium-Prunus pathosystem

    USDA-ARS?s Scientific Manuscript database

    The study of plant-pathogen interactions has enabled tests of basic ecological concepts on plant community assembly (Janzen-Connell Hypothesis) and plant invasion (Enemy Release Hypothesis). We used a field experiment to (#1) test whether Pythium effects depended on host (seedling) density and/or d...

  4. A checklist to facilitate objective hypothesis testing in social psychology research.

    PubMed

    Washburn, Anthony N; Morgan, G Scott; Skitka, Linda J

    2015-01-01

    Social psychology is not a very politically diverse area of inquiry, something that could negatively affect the objectivity of social psychological theory and research, as Duarte et al. argue in the target article. This commentary offers a number of checks to help researchers uncover possible biases and identify when they are engaging in hypothesis confirmation and advocacy instead of hypothesis testing.

  5. Testing the stress-gradient hypothesis during the restoration of tropical degraded land using the shrub Rhodomyrtus tomentosa as a nurse plant

    Treesearch

    Nan Liu; Hai Ren; Sufen Yuan; Qinfeng Guo; Long Yang

    2013-01-01

    The relative importance of facilitation and competition between pairwise plants across abiotic stress gradients as predicted by the stress-gradient hypothesis has been confirmed in arid and temperate ecosystems, but the hypothesis has rarely been tested in tropical systems, particularly across nutrient gradients. The current research examines the interactions between a...

  6. Phase II Clinical Trials: D-methionine to Reduce Noise-Induced Hearing Loss

    DTIC Science & Technology

    2012-03-01

    loss (NIHL) and tinnitus in our troops. Hypotheses: Primary Hypothesis: Administration of oral D-methionine prior to and during weapons...reduce or prevent noise-induced tinnitus. Primary outcome to test the primary hypothesis: Pure tone air-conduction thresholds. Primary outcome to...test the secondary hypothesis: Tinnitus questionnaires. Specific Aims: 1. To determine whether administering oral D-methionine (D-met) can

  7. An omnibus test for the global null hypothesis.

    PubMed

    Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja

    2018-01-01

    Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses, but in testing whether none of the hypotheses is false. There are several possibilities for testing the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g. Bonferroni or Simes test). However, usually there is no a priori knowledge of the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R package called omnibus.
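
    The sketch below gives one plausible reading of a cumulative-sum omnibus statistic, calibrated by Monte Carlo simulation under the global null; it is an illustration only, not the exact statistic implemented in the omnibus R package, and the p-values are hypothetical.

```python
# Hedged sketch of a global omnibus test built from cumulative sums of
# transformed p-values; one plausible statistic, Monte Carlo calibrated.
import numpy as np

def cusum_stat(p):
    """Max standardized cumulative sum of -log of the sorted p-values."""
    z = -np.log(np.sort(p))          # large contributions from small p
    cs = np.cumsum(z - 1.0)          # E[-log U] = 1 under the global null
    return np.max(cs / np.sqrt(np.arange(1, len(p) + 1)))

def omnibus_test(p, n_sim=20000, seed=0):
    """Monte Carlo p-value for the cumulative-sum omnibus statistic."""
    rng = np.random.default_rng(seed)
    obs = cusum_stat(p)
    null = np.array([cusum_stat(rng.uniform(size=len(p)))
                     for _ in range(n_sim)])
    return obs, (1 + np.sum(null >= obs)) / (1 + n_sim)

p = np.array([0.001, 0.2, 0.5, 0.8, 0.04, 0.6, 0.9, 0.3])  # hypothetical
print(omnibus_test(p))
```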

  8. Explorations in Statistics: Hypothesis Tests and P Values

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…

  9. Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment

    ERIC Educational Resources Information Center

    Frane, Andrew V.

    2015-01-01

    Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…
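
    A short example of the adjustments in question, applied to a hypothetical set of planned p-values:

```python
# Familywise error control for multiple planned tests: the same
# hypothetical p-values, unadjusted versus Bonferroni/Holm adjusted.
from statsmodels.stats.multitest import multipletests

pvals = [0.012, 0.034, 0.21, 0.003]   # four planned hypothesis tests (toy)
for method in ('bonferroni', 'holm'):
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method=method)
    print(method, p_adj.round(3), reject)
```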

  10. Rugby versus Soccer in South Africa: Content Familiarity Contributes to Cross-Cultural Differences in Cognitive Test Scores

    ERIC Educational Resources Information Center

    Malda, Maike; van de Vijver, Fons J. R.; Temane, Q. Michael

    2010-01-01

    In this study, cross-cultural differences in cognitive test scores are hypothesized to depend on a test's cultural complexity (Cultural Complexity Hypothesis: CCH), here conceptualized as its content familiarity, rather than on its cognitive complexity (Spearman's Hypothesis: SH). The content familiarity of tests assessing short-term memory,…

  11. A study on acceptance of mobileschool at secondary schools in Malaysia: Urban vs rural

    NASA Astrophysics Data System (ADS)

    Hashim, Ahmad Sobri; Ahmad, Wan Fatimah Wan; Sarlan, Aliza

    2017-10-01

    Developing countries face a dilemma: sophisticated technologies advance faster than people's readiness to adopt them. In education, many novel approaches and technologies have been introduced, but minimal effort has been put into applying them. MobileSchool is a mobile learning (m-learning) management system developed for administrative, teaching, and learning processes at secondary schools in Malaysia. This paper presents the acceptance of MobileSchool at urban and rural secondary schools in Malaysia. The research framework was designed based on the Technology Acceptance Model (TAM). The constructs of the framework include computer anxiety, self-efficacy, facilitating condition, technological complexity, perceived behavioral control, perceived ease of use, perceived usefulness, attitude, and behavioral intention. A questionnaire was used as the research instrument, involving 373 students from four secondary schools (two urban and two rural) in Perak. Inferential analyses using hypothesis tests (t-tests) and descriptive analyses using means and percentages were used to analyze the data. Results showed no large differences (<20%) in any acceptance construct between urban and rural secondary schools except computer anxiety.
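
    A sketch of the urban-versus-rural comparison on a single construct follows, using Welch's independent-samples t-test on synthetic scores; the construct, group means, and group sizes are hypothetical (only the 373-student total matches the abstract).

```python
# Hedged sketch of the urban-vs-rural inferential comparison:
# an independent two-sample (Welch's) t-test on one TAM construct.
# Scores are synthetic; a 190/183 split of 373 students is assumed.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
urban = rng.normal(3.4, 0.8, 190)   # e.g., computer-anxiety scores, urban
rural = rng.normal(3.8, 0.8, 183)   # e.g., computer-anxiety scores, rural

t, p = ttest_ind(urban, rural, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
```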

  12. Neural activation patterns of successful episodic encoding: Reorganization during childhood, maintenance in old age.

    PubMed

    Shing, Yee Lee; Brehmer, Yvonne; Heekeren, Hauke R; Bäckman, Lars; Lindenberger, Ulman

    2016-08-01

    The two-component framework of episodic memory (EM) development posits that the contributions of medial temporal lobe (MTL) and prefrontal cortex (PFC) to successful encoding differ across the lifespan. To test the framework's hypotheses, we compared subsequent memory effects (SME) of 10- to 12-year-old children, younger adults, and older adults using functional magnetic resonance imaging (fMRI). Memory was probed by cued recall, and SME were defined as regional activation differences during encoding between subsequently correctly recalled versus omitted items. In MTL areas, children's SME did not differ in magnitude from those of younger and older adults. In contrast, children's SME in PFC were weaker than the corresponding SME in younger and older adults, in line with the hypothesis that PFC contributes less to successful encoding in childhood. Differences in SME between younger and older adults were negligible. The present results suggest that, among individuals with high memory functioning, the neural circuitry contributing to successful episodic encoding is reorganized from middle childhood to adulthood. Successful episodic encoding in later adulthood, however, is characterized by the ability to maintain the activation patterns that emerged in young adulthood. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Trial-by-trial fluctuations in CNV amplitude reflect anticipatory adjustment of response caution.

    PubMed

    Boehm, Udo; van Maanen, Leendert; Forstmann, Birte; van Rijn, Hedderik

    2014-08-01

    The contingent negative variation (CNV), a slow cortical potential, occurs when humans are warned by a stimulus about an upcoming task. The cognitive processes that give rise to this EEG potential are not yet well understood. To explain these processes, we adopt a recently developed theoretical framework from the area of perceptual decision-making. This framework assumes that the basal ganglia control the tradeoff between fast and accurate decision-making in the cortex. It suggests that an increase in cortical excitability serves to lower response caution, which results in faster but more error-prone responding. We propose that the CNV reflects this increased cortical excitability. To test this hypothesis, we conducted an EEG experiment in which participants performed the random dot motion task either under speed or under accuracy stress. Our results show that trial-by-trial fluctuations in participants' response speed as well as model-based estimates of response caution correlated with single-trial CNV amplitude under conditions of speed but not accuracy stress. We conclude that the CNV might reflect adjustments of response caution, which serves to enhance quick decision-making. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Is it better to select or to receive? Learning via active and passive hypothesis testing.

    PubMed

    Markant, Douglas B; Gureckis, Todd M

    2014-02-01

    People can test hypotheses through either selection or reception. In a selection task, the learner actively chooses observations to test his or her beliefs, whereas in reception tasks data are passively encountered. People routinely use both forms of testing in everyday life, but the critical psychological differences between selection and reception learning remain poorly understood. One hypothesis is that selection learning improves learning performance by enhancing generic cognitive processes related to motivation, attention, and engagement. Alternatively, we suggest that differences between these 2 learning modes derive from a hypothesis-dependent sampling bias that is introduced when a person collects data to test his or her own individual hypothesis. Drawing on influential models of sequential hypothesis-testing behavior, we show that such a bias (a) can lead to the collection of data that facilitates learning compared with reception learning and (b) can be more effective than observing the selections of another person. We then report a novel experiment based on a popular category learning paradigm that compares reception and selection learning. We additionally compare selection learners to a set of "yoked" participants who viewed the exact same sequence of observations under reception conditions. The results revealed systematic differences in performance that depended on the learner's role in collecting information and the abstract structure of the problem.

  15. Testing for purchasing power parity in 21 African countries using several unit root tests

    NASA Astrophysics Data System (ADS)

    Choji, Niri Martha; Sek, Siok Kun

    2017-04-01

    Purchasing power parity (PPP) is used as a basis for international income and expenditure comparisons through exchange rate theory. However, empirical studies disagree on the validity of PPP. In this paper, we test the validity of PPP using a panel data approach. We apply seven different panel unit root tests to the quarterly real effective exchange rates of 21 African countries over the period 1971:Q1-2012:Q4. All seven tests rejected the hypothesis of stationarity, meaning that absolute PPP does not hold in these African countries. This result confirms the claim of previous studies that standard panel unit root tests fail to support the PPP hypothesis.
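
    As a sketch of one building block of these procedures, the example below applies a univariate augmented Dickey-Fuller (ADF) test to each country's series; the seven panel tests pool such country-level evidence in different ways. Data are synthetic random walks, and the country labels are placeholders.

```python
# Hedged sketch: univariate ADF unit root tests, one per country,
# as the building block that panel unit root tests aggregate.
# Synthetic quarterly series; country names are placeholders.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
countries = {f"country_{i}": np.cumsum(rng.normal(size=168))  # 42 years, quarterly
             for i in range(3)}

for name, series in countries.items():
    stat, pval, *_ = adfuller(series)
    print(f"{name}: ADF = {stat:.2f}, p = {pval:.3f}")  # high p: unit root not rejected
```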

  16. Does Testing Increase Spontaneous Mediation in Learning Semantically Related Paired Associates?

    ERIC Educational Resources Information Center

    Cho, Kit W.; Neely, James H.; Brennan, Michael K.; Vitrano, Deana; Crocco, Stephanie

    2017-01-01

    Carpenter (2011) argued that the testing effect she observed for semantically related but associatively unrelated paired associates supports the mediator effectiveness hypothesis. This hypothesis asserts that after the cue-target pair "mother-child" is learned, relative to restudying mother-child, a review test in which…

  17. A HYPOTHESIS FOR THE COLOR BIMODALITY OF JUPITER TROJANS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Ian; Brown, Michael E., E-mail: iwong@caltech.edu

    One of the most enigmatic and hitherto unexplained properties of Jupiter Trojans is their bimodal color distribution. This bimodality is indicative of two sub-populations within the Trojans, which have distinct size distributions. In this paper, we present a simple, plausible hypothesis for the origin and evolution of the two Trojan color sub-populations. In the framework of dynamical instability models of early solar system evolution, which suggest a common primordial progenitor population for both Trojans and Kuiper Belt objects, we use observational constraints to assert that the color bimodalities evident in both minor body populations developed within the primordial population prior to the onset of instability. We show that, beginning with an initial composition of rock and ices, location-dependent volatile loss through sublimation in this primordial population could have led to sharp changes in the surface composition with heliocentric distance. We propose that the depletion or retention of H2S ice on the surface of these objects was the key factor in creating an initial color bimodality. Objects that retained H2S on their surfaces developed characteristically redder colors upon irradiation than those that did not. After the bodies from the primordial population were scattered and emplaced into their current positions, they preserved this primordial color bimodality to the present day. We explore predictions of the volatile loss model—in particular, the effect of collisions within the Trojan population on the size distributions of the two sub-populations—and propose further experimental and observational tests of our hypothesis.

  18. Global variation in thermal tolerances and vulnerability of endotherms to climate change

    PubMed Central

    Khaliq, Imran; Hof, Christian; Prinzinger, Roland; Böhning-Gaese, Katrin; Pfenninger, Markus

    2014-01-01

    The relationships among species' physiological capacities and the geographical variation of ambient climate are of key importance to understanding the distribution of life on the Earth. Furthermore, predictions of how species will respond to climate change will profit from the explicit consideration of their physiological tolerances. The climatic variability hypothesis, which predicts that climatic tolerances are broader in more variable climates, provides an analytical framework for studying these relationships between physiology and biogeography. However, direct empirical support for the hypothesis is mostly lacking for endotherms, and few studies have tried to integrate physiological data into assessments of species' climatic vulnerability at the global scale. Here, we test the climatic variability hypothesis for endotherms, with a comprehensive dataset on thermal tolerances derived from physiological experiments, and use these data to assess the vulnerability of species to projected climate change. We find the expected relationship between thermal tolerance and ambient climatic variability in birds, but not in mammals—a contrast possibly resulting from different adaptation strategies to ambient climate via behaviour, morphology or physiology. We show that currently most of the species are experiencing ambient temperatures well within their tolerance limits and that in the future many species may be able to tolerate projected temperature increases across significant proportions of their distributions. However, our findings also underline the high vulnerability of tropical regions to changes in temperature and other threats of anthropogenic global changes. Our study demonstrates that a better understanding of the interplay among species' physiology and the geography of climate change will advance assessments of species' vulnerability to climate change. PMID:25009066

  19. Paroxetine exposure skews litter sex ratios in mice suggesting a Trivers–Willard process

    PubMed Central

    Gaukler, Shannon Marie; Ruff, James Steven; Potts, Wayne K.

    2016-01-01

    While conducting a toxicity assessment of the antidepressant paroxetine (Paxil®) in wild-derived mice (Mus musculus), we observed that exposed dams (P0) produced female biased litters (32:68 M:F). Though numerous experimental manipulations have induced sex ratio bias in mice, none have assessed the fitness of the offspring from these litters relative to controls. Here, we retrospectively analyze experimentally derived fitness data gathered for the purpose of toxicological assessment in light of 2 leading hypotheses (the Trivers–Willard hypothesis [TWH] and the cost of reproduction hypothesis [CRH]), seeking to test whether this facultative sex ratio adjustment fits into an adaptive framework. Control F1 males were heavier than F1 females, but no differences in mass were detected between exposed F1 males and females, suggesting that exposed dams did not save energy by producing fewer males, despite producing 29.2% lighter litters relative to controls. F1 offspring of both treatments were released into seminatural enclosures where fitness was quantified. In enclosures, the relative reproductive success of F1-exposed males (compared with controls) was reduced by ~20% compared with the relative reproductive success of F1-exposed females. Thus, exposed dams increased their fitness by adjusting litters toward females, who were less negatively affected by the exposure than males. Collectively, these data provide less support for the hypothesis that the observed sex ratio bias results in energetic savings (CRH), and more support for the TWH, because fitness was increased by biasing litters toward female offspring. These mammalian data are unique in their ability to support the TWH through the use of relevant fitness data. PMID:27418753

  20. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., are there sufficient data to force and evaluate models). Consequently, the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and to evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple-hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.
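
    To make the template idea concrete, the toy sketch below shows a master model structure with interchangeable process parameterizations, so that each modeling decision can be evaluated in isolation. All function names and formulas here are illustrative inventions, not code from FUSE, SUMMA, or any of the named projects:

    ```python
    # Toy "multiple-hypothesis" snow model: one master template, two pluggable
    # process decisions (albedo decay, melt). Every formula here is illustrative.
    import math
    from itertools import product

    def albedo_linear(age_days, a_fresh=0.85, decay=0.01, a_min=0.4):
        # Hypothetical linear albedo decay with snow age
        return max(a_fresh - decay * age_days, a_min)

    def albedo_exponential(age_days, a_fresh=0.85, a_min=0.4, tau=20.0):
        # Hypothetical exponential decay toward a minimum albedo
        return a_min + (a_fresh - a_min) * math.exp(-age_days / tau)

    def melt_degree_day(temp_c, ddf=3.0):
        # Classic degree-day melt (mm/day)
        return max(ddf * temp_c, 0.0)

    def melt_radiation_corrected(temp_c, ddf=2.0, rad_term=1.5):
        # Degree-day melt plus a crude constant radiation term
        return max(ddf * temp_c + rad_term, 0.0)

    def run_model(albedo_fn, melt_fn, temps, swe0=100.0):
        """Master template: fixed structure, pluggable process decisions."""
        swe = swe0  # snow water equivalent, mm
        for day, t in enumerate(temps):
            a = albedo_fn(day)                            # decision 1: albedo
            swe = max(swe - melt_fn(t) * (1.0 - a), 0.0)  # decision 2: melt
        return swe

    temps = [-2, 0, 1, 3, 5, 6, 4, 2]  # daily mean air temperature, deg C
    for alb, melt in product([albedo_linear, albedo_exponential],
                             [melt_degree_day, melt_radiation_corrected]):
        print(f"{alb.__name__:18s} + {melt.__name__:24s}"
              f" -> final SWE {run_model(alb, melt, temps):6.1f} mm")
    ```

    Running all four combinations attributes the spread in final snow water equivalent to the two decisions separately, which is the essence of the approach.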

  1. Independent control of joint stiffness in the framework of the equilibrium-point hypothesis.

    PubMed

    Latash, M L

    1992-01-01

    In the framework of the equilibrium-point hypothesis, virtual trajectories and joint stiffness patterns have been reconstructed during two motor tasks practiced against a constant bias torque. One task required a voluntary increase in joint stiffness while preserving the original joint position. The other task involved fast elbow flexions over 36 degrees. Joint stiffness gradually subsided after the termination of fast movements. In both tasks, the external torque could slowly and unexpectedly change. The subjects were required not to change their motor commands if the torque changed, i.e. "to do the same no matter what the motor did". In both tasks, changes in joint stiffness were accompanied by unchanged virtual trajectories that were also independent of the absolute value of the bias torque. By contrast, the intercept of the joint compliant characteristic with the angle axis, r(t)-function, has demonstrated a clear dependence upon both the level of coactivation and external load. We assume that a template virtual trajectory is generated at a certain level of the motor hierarchy and is later scaled taking into account some commonly changing dynamic factors of the movement execution, for example, external load. The scaling leads to the generation of commands to the segmental structures that can be expressed, according to the equilibrium-point hypothesis, as changes in the thresholds of the tonic stretch reflex for corresponding muscles.

  2. Exploring the correlation between annual precipitation and potential evaporation

    NASA Astrophysics Data System (ADS)

    Chen, X.; Buchberger, S. G.

    2017-12-01

    The interdependence between precipitation and potential evaporation is closely related to the classic Budyko framework. In this study, a systematic investigation of the correlation between precipitation and potential evaporation at the annual time step is conducted at both the point scale and the watershed scale. The point-scale precipitation and potential evaporation data over the period 1984-2015 are collected from 259 weather stations across the United States. The watershed-scale precipitation data for 203 watersheds across the United States are obtained from the Model Parameter Estimation Experiment (MOPEX) dataset from 1983 to 2002, and potential evaporation data for these 203 watersheds in the same period are obtained from a remote-sensing algorithm. The results show that the majority of the weather stations (77%) and watersheds (79%) exhibit a statistically significant negative correlation between annual precipitation and annual potential evaporation. The aggregated data cloud of precipitation versus potential evaporation follows a curve based on the combination of the Budyko-type equation and Bouchet's complementary relationship. Our result suggests that annual precipitation and potential evaporation are not independent when both Budyko's hypothesis and Bouchet's hypothesis are valid. Furthermore, we find that the wet surface evaporation, which is controlled primarily by shortwave radiation as defined in Bouchet's hypothesis, exhibits less dependence on precipitation than the potential evaporation. As a result, we suggest that wet surface evaporation is a better representation of energy supply than potential evaporation in the Budyko framework.
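
    The per-station screening described above reduces to a correlation and a two-sided significance test for each station. A minimal sketch, with synthetic data standing in for the station records (the built-in negative P-PET link is an assumption for illustration only):

    ```python
    # Sketch: per-station Pearson correlation between annual precipitation (P)
    # and potential evaporation (PET). Synthetic data, not the study's records.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_stations, n_years = 259, 32  # station count and 1984-2015 record length

    significant_negative = 0
    for _ in range(n_stations):
        p = rng.normal(1000, 150, n_years)                 # annual P, mm
        pet = 1400 - 0.3 * p + rng.normal(0, 60, n_years)  # assumed negative link
        r, pval = stats.pearsonr(p, pet)
        if pval < 0.05 and r < 0:
            significant_negative += 1

    print(f"{significant_negative / n_stations:.0%} of stations show a "
          "significant negative P-PET correlation")
    ```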

  3. Retrodiction for Bayesian multiple-hypothesis/multiple-target tracking in densely cluttered environment

    NASA Astrophysics Data System (ADS)

    Koch, Wolfgang

    1996-05-01

    Sensor data processing in a dense target/dense clutter environment is inevitably confronted with data association conflicts which correspond with the multiple hypothesis character of many modern approaches (MHT: multiple hypothesis tracking). In this paper we analyze the efficiency of retrodictive techniques that generalize standard fixed interval smoothing to MHT applications. 'Delayed estimation' based on retrodiction provides uniquely interpretable and accurate trajectories from ambiguous MHT output if a certain time delay is tolerated. In a Bayesian framework the theoretical background of retrodiction and its intimate relation to Bayesian MHT is sketched. By a simulated example with two closely-spaced targets, relatively low detection probabilities, and rather high false return densities, we demonstrate the benefits of retrodiction and quantitatively discuss the achievable track accuracies and the time delays involved for typical radar parameters.
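
    For readers unfamiliar with the baseline that retrodiction generalizes, the sketch below implements standard fixed-interval (Rauch-Tung-Striebel) smoothing for a single constant-velocity track. It is not the Bayesian MHT retrodiction itself, and all model matrices and noise levels are illustrative:

    ```python
    # Kalman filter + Rauch-Tung-Striebel fixed-interval smoother for one
    # constant-velocity track; retrodiction extends this idea to MHT output.
    import numpy as np

    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity dynamics
    H = np.array([[1.0, 0.0]])             # position-only measurements
    Q = 0.01 * np.eye(2)                   # process noise covariance
    R = np.array([[0.25]])                 # measurement noise covariance

    def kalman_filter(zs, x0, P0):
        xs, Ps, x, P = [], [], x0, P0
        for z in zs:
            x, P = F @ x, F @ P @ F.T + Q                 # predict
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # gain
            x = x + K @ (z - H @ x)                       # update
            P = (np.eye(2) - K @ H) @ P
            xs.append(x); Ps.append(P)
        return xs, Ps

    def rts_smooth(xs, Ps):
        xs_s, Ps_s = [xs[-1]], [Ps[-1]]
        for k in range(len(xs) - 2, -1, -1):
            Pp = F @ Ps[k] @ F.T + Q                      # predicted covariance
            G = Ps[k] @ F.T @ np.linalg.inv(Pp)           # smoother gain
            xs_s.insert(0, xs[k] + G @ (xs_s[0] - F @ xs[k]))
            Ps_s.insert(0, Ps[k] + G @ (Ps_s[0] - Pp) @ G.T)
        return xs_s, Ps_s

    rng = np.random.default_rng(3)
    zs = [np.array([k + 0.5 * rng.standard_normal()]) for k in range(20)]
    xf, Pf = kalman_filter(zs, np.zeros(2), np.eye(2))
    xs, _ = rts_smooth(xf, Pf)
    print("filtered vs smoothed final position:", xf[-1][0], xs[-1][0])
    ```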

  4. A HYPOTHESIS ACCOUNTING FOR THE PARADOXICAL EXPRESSION OF THE D GENE SEGMENT IN THE BCR AND THE TCR

    PubMed Central

    Cohn, Melvin

    2009-01-01

    The D gene segment expressed in both the TCR and BCR has a challenging behavior that begs interpretation. It is incorporated in three reading frames in the rearranged transcription unit but is expressed in antigen-selected cells in a preferred frame. Why was it so important to waste 2/3 of newborn cells? The hypothesis is presented that the D region functions as framework, playing a role in both the TCR and the BCR by determining whether a signal is transmitted to the cell upon interaction with a cognate ligand. This assumption operates in determining haplotype exclusion for the BCR and in regulating the signaling orientation for the TCR. Relevant data, as well as a definitive experiment challenging the validity of this hypothesis, are discussed. PMID:18546143

  5. The intrapsychics of gender: a model of self-socialization.

    PubMed

    Tobin, Desiree D; Menon, Meenakshi; Menon, Madhavi; Spatta, Brooke C; Hodges, Ernest V E; Perry, David G

    2010-04-01

    This article outlines a model of the structure and the dynamics of gender cognition in childhood. The model incorporates 3 hypotheses featured in different contemporary theories of childhood gender cognition and unites them under a single theoretical framework. Adapted from Greenwald et al. (2002), the model distinguishes three constructs: gender identity, gender stereotypes, and attribute self-perceptions. The model specifies 3 causal processes among the constructs: Gender identity and stereotypes interactively influence attribute self-perceptions (stereotype emulation hypothesis); gender identity and attribute self-perceptions interactively influence gender stereotypes (stereotype construction hypothesis); and gender stereotypes and attribute self-perceptions interactively influence identity (identity construction hypothesis). The model resolves nagging ambiguities in terminology, organizes diverse hypotheses and empirical findings under a unifying conceptual umbrella, and stimulates many new research directions. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  6. Promoting Task-Based Pragmatics Instruction in EFL Classroom Contexts: The Role of Task Complexity

    ERIC Educational Resources Information Center

    Kim, Youjin; Taguchi, Naoko

    2015-01-01

    Robinson's (2001) Cognition Hypothesis claims that more complex tasks promote interaction and language development. This study examined the effect of task complexity in the learning of request-making expressions. Task complexity was operationalized as [+/- reasoning] following Robinson's framework. The study employed a pretest-posttest research…

  7. Mutual Intercultural Relations among University Students in Canada

    ERIC Educational Resources Information Center

    Gui, Yongxia; Safdar, Saba; Berry, John

    2016-01-01

    The current study examines the views of both international and domestic students in Canada using the conceptual and empirical framework from the MIRIPS (Mutual Intercultural Relations in Plural Societies) project (http://www.victoria.ac.nz/cacr/research/mirips). Two hypotheses were examined. First is the "multiculturalism hypothesis"…

  8. Gilligan's Moral Orientation Hypothesis: Strategies of Justification and Practical Deliberation.

    ERIC Educational Resources Information Center

    Keefer, Matthew Wilks

    Previous studies failed to determine whether Gilligan's (1982) justice and care perspectives represent two distinct orientations of moral reasoning. Using methods developed in research on reasoning and discourse processes, a study used a discursive framework to validate an alternate methodology for the investigation of moral orientation reasoning.…

  9. Agile Software Development in the Department of Defense Environment

    DTIC Science & Technology

    2017-03-31

    Research Methodology … Research Hypothesis … acquisition framework to enable greater adoption of Agile methodologies. Overview of the Research Methodology: The strategy for this study was to … guidance. Chapter 3 – Research Methodology: This chapter defines the research methodology and processes used in the study, in an effort to …

  10. Understanding Quantum Numbers in General Chemistry Textbooks

    ERIC Educational Resources Information Center

    Niaz, Mansoor; Fernandez, Ramon

    2008-01-01

    Quantum numbers and electron configurations form an important part of the general chemistry curriculum and textbooks. The objectives of this study are: (1) Elaboration of a framework based on the following aspects: (a) Origin of the quantum hypothesis, (b) Alternative interpretations of quantum mechanics, (c) Differentiation between an orbital and…

  11. Exploring the Role of Conventionality in Children's Interpretation of Ironic Remarks

    ERIC Educational Resources Information Center

    Burnett, Debra L.

    2015-01-01

    Irony comprehension in seven- and eight-year-old children with typically developing language skills was explored under the framework of the graded salience hypothesis. Target ironic remarks, either conventional or novel/situation-specific, were presented following brief story contexts. Children's responses to comprehension questions were used to…

  12. Harmony Theory: A Mathematical Framework for Stochastic Parallel Processing.

    ERIC Educational Resources Information Center

    Smolensky, Paul

    This paper presents preliminary results of research founded on the hypothesis that in real environments there exist regularities that can be idealized as mathematical structures that are simple enough to be analyzed. The author considered three steps in analyzing the encoding of modularity of the environment. First, a general information…

  13. [A test of the focusing hypothesis for category judgment: an explanation using the mental-box model].

    PubMed

    Hatori, Tsuyoshi; Takemura, Kazuhisa; Fujii, Satoshi; Ideno, Takashi

    2011-06-01

    This paper presents a new model of category judgment. The model hypothesizes that, when more attention is focused on a category, the psychological range of the category gets narrower (category-focusing hypothesis). We explain this hypothesis by using the metaphor of a "mental-box" model: the more attention that is focused on a mental box (i.e., a category set), the smaller the size of the box becomes (i.e., a cardinal number of the category set). The hypothesis was tested in an experiment (N = 40), where the focus of attention on prescribed verbal categories was manipulated. The obtained data gave support to the hypothesis: category-focusing effects were found in three experimental tasks (regarding the category of "food", "height", and "income"). The validity of the hypothesis was discussed based on the results.

  14. Electric Fields and Enzyme Catalysis

    PubMed Central

    Fried, Stephen D.; Boxer, Steven G.

    2017-01-01

    What happens inside an enzyme’s active site to allow slow and difficult chemical reactions to occur so rapidly? This question has occupied biochemists’ attention for a long time. Computer models of increasing sophistication have predicted an important role for electrostatic interactions in enzymatic reactions, yet this hypothesis has proved vexingly difficult to test experimentally. Recent experiments utilizing the vibrational Stark effect make it possible to measure the electric field a substrate molecule experiences when bound inside its enzyme’s active site. These experiments have provided compelling evidence supporting a major electrostatic contribution to enzymatic catalysis. Here, we review these results and develop a simple model for electrostatic catalysis that enables us to incorporate disparate concepts introduced by many investigators to describe how enzymes work into a more unified framework stressing the importance of electric fields at the active site. PMID:28375745

  15. PTSD as a mediator between lifetime sexual abuse and substance use among jail diversion participants.

    PubMed

    Cusack, Karen J; Herring, Amy H; Steadman, Henry J

    2013-08-01

    Many of the individuals with serious mental illness involved in the criminal justice system have experienced interpersonal victimization, such as sexual abuse, and have high rates of alcohol and drug use disorders. Little attention has been paid to the prevalence of posttraumatic stress disorder (PTSD) and its potential role in the substance misuse of offenders with mental illness. The study used a path analytic framework to test the hypothesis that PTSD mediates the relationship between sexual abuse and level of alcohol and drug use among individuals (N=386) with mental illness enrolled in a multisite (N=7) jail diversion project. Sexual abuse was strongly associated with PTSD, which was in turn associated with both heavy drug use and heavy drinking. These findings suggest that PTSD may be an important target for jail diversion programs.

  16. Relaxorlike dielectric behavior in Ba0.7Sr0.3TiO3 thin films

    NASA Astrophysics Data System (ADS)

    Zednik, Ricardo J.; McIntyre, Paul C.; Baniecki, John D.; Ishii, Masatoshi; Shioga, Takeshi; Kurihara, Kazuaki

    2007-03-01

    We present the results of a systematic dielectric study of sputter-deposited barium strontium titanate thin-film planar capacitors measured over a wide temperature range of 20-575 K for frequencies between 1 kHz and 1 MHz. Our observations of dielectric loss peaks in the temperature and frequency domains cannot be understood in the typical framework of intrinsic phonon losses. We find that the accepted phenomenological Curie-von Schweidler dielectric behavior (universal relaxation law) in our barium strontium titanate films is only applicable over a narrow temperature range. An excellent fit to the Vogel-Fulcher expression suggests relaxorlike behavior in these films. The activation energy of the observed phenomenon suggests that oxygen ion motion plays a role in the apparent relaxor behavior, although further experimental work is required to test this hypothesis.
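
    The Vogel-Fulcher expression commonly fitted to relaxor dielectrics, and presumably the form used here, relates the measurement frequency to the temperature of the dielectric-loss maximum:

    ```latex
    % f   : measurement frequency        f_0 : attempt frequency
    % E_a : activation energy            k_B : Boltzmann constant
    % T_m : temperature of loss maximum  T_f : static freezing temperature
    f = f_0 \exp\!\left[-\frac{E_a}{k_B \,(T_m - T_f)}\right]
    ```

    A fit that yields a finite freezing temperature T_f (rather than the Arrhenius limit T_f = 0) is the usual signature of relaxor-like freezing.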

  17. Modeling the impact of hydromorphological alterations by dams and channelization on fish habitat.

    NASA Astrophysics Data System (ADS)

    Parasiewicz, Piotr; Suska, Katarzyna

    2017-04-01

    As a consequence of the introduction of the Water Framework Directive, it has been discovered that hydromorphological pressures are one of the main causes of impact on aquatic fauna. However, the impact may vary depending on river type and fish community. To test this hypothesis, we modelled alterations of fish habitat in 6 river sections across Poland using the MesoHABSIM approach. The original habitat models for the Target Fish Community were based on repeated field surveys in reference river sections, classified into four fish-ecological classes. Introducing three hydromorphological modification types (damming, channelization, and dredging) into the models changed persistent habitat availability for the fish community. The change was measured with the Habitat Stress Days Alteration index. Overall, the modifications increased habitat stress days, but the impact varied depending on season, hydromorphological river type, and expected fish community.

  18. Emotional modulation of cognitive control: approach-withdrawal states double-dissociate spatial from verbal two-back task performance.

    PubMed

    Gray, J R

    2001-09-01

    Emotional states might selectively modulate components of cognitive control. To test this hypothesis, the author randomly assigned 152 undergraduates (equal numbers of men and women) to watch short videos intended to induce emotional states (approach, neutral, or withdrawal). Each video was followed by a computerized 2-back working memory task (spatial or verbal, equated for difficulty and appearance). Spatial 2-back performance was enhanced by a withdrawal state and impaired by an approach state; the opposite pattern held for verbal performance. The double dissociation held more strongly for participants who made more errors than average across conditions. The results suggest that approach-withdrawal states can have selective influences on components of cognitive control, possibly on a hemispheric basis. They support and extend several frameworks for conceptualizing emotion-cognition interactions.

  19. An evolutionary psychological investigation of parental distress and reproductive coercion during the "coming out" of gay sons.

    PubMed

    Wisniewski, Timothy J; Robinson, Thomas N; Deluty, Robert H

    2010-01-01

    The lack of success of the "coming out" studies over the last three decades to explain and predict parental responses has motivated an evolutionary psychological reconceptualization. According to this reconceptualization, it was predicted that (a) biological mothers would experience more distress and apply more pressure on gay sons to change than would biological fathers; and (b) obligate investment by fathers in dependent sons would cause fathers to experience more distress and apply more pressure on gay sons to change than fathers without this obligate investment. In contrast, a cultural-norm hypothesis predicted that fathers would experience more distress and apply more pressure on gay sons to change than mothers. The majority of predictions were tested using 787 participants from two-biological-parent families, who were drawn from a total sample of 891 participants from various family backgrounds. As predicted by the evolutionary hypothesis, biological mothers were reported to have been more distressed and coercive than biological fathers, in spite of a strong societal expectation to the contrary. Furthermore, the results supported the obligate investment argument for paternal reactions. The model not only correctly explained and predicted parental behavior during coming out, but also was shown to unify within its theoretical framework discrepant results from the literature previously considered inconsistent.

  20. Identification of alterations associated with age in the clustering structure of functional brain networks.

    PubMed

    Guzman, Grover E C; Sato, Joao R; Vidal, Maciel C; Fujita, Andre

    2018-01-01

    Initial studies using resting-state functional magnetic resonance imaging on the trajectories of the brain network from childhood to adulthood found evidence of functional integration and segregation over time. Understanding how functional integration and segregation occur in healthy individuals is crucial to enhance our understanding of possible deviations that may lead to brain disorders. Recent approaches have focused on the framework wherein the functional brain network is organized into spatially distributed modules that have been associated with specific cognitive functions. Here, we tested the hypothesis that the clustering structure of brain networks evolves during development. To address this hypothesis, we defined a measure of how well a brain region is clustered (the network fitness index) and developed a method to evaluate its association with age. We then applied this method to a functional magnetic resonance imaging dataset composed of 397 males under 31 years of age, collected as part of the Autism Brain Imaging Data Exchange Consortium. As a result, we identified two brain regions whose clustering changes over time, namely the left middle temporal gyrus and the left putamen. Since the network fitness index is associated with both integration and segregation, our finding suggests that these regions play a role in the development of brain systems.

  1. Debates—Hypothesis testing in hydrology: Introduction

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  2. Reasoning Maps: A Generally Applicable Method for Characterizing Hypothesis-Testing Behaviour. Research Report

    ERIC Educational Resources Information Center

    White, Brian

    2004-01-01

    This paper presents a generally applicable method for characterizing subjects' hypothesis-testing behaviour based on a synthesis that extends on previous work. Beginning with a transcript of subjects' speech and videotape of their actions, a Reasoning Map is created that depicts the flow of their hypotheses, tests, predictions, results, and…

  3. Why Is Test-Restudy Practice Beneficial for Memory? An Evaluation of the Mediator Shift Hypothesis

    ERIC Educational Resources Information Center

    Pyc, Mary A.; Rawson, Katherine A.

    2012-01-01

    Although the memorial benefits of testing are well established empirically, the mechanisms underlying this benefit are not well understood. The authors evaluated the mediator shift hypothesis, which states that test-restudy practice is beneficial for memory because retrieval failures during practice allow individuals to evaluate the effectiveness…

  4. Bayesian Approaches to Imputation, Hypothesis Testing, and Parameter Estimation

    ERIC Educational Resources Information Center

    Ross, Steven J.; Mackey, Beth

    2015-01-01

    This chapter introduces three applications of Bayesian inference to common and novel issues in second language research. After a review of the critiques of conventional hypothesis testing, our focus centers on ways Bayesian inference can be used for dealing with missing data, for testing theory-driven substantive hypotheses without a default null…

  5. Distrust and the positive test heuristic: dispositional and situated social distrust improves performance on the Wason rule discovery task.

    PubMed

    Mayo, Ruth; Alfasi, Dana; Schwarz, Norbert

    2014-06-01

    Feelings of distrust alert people not to take information at face value, which may influence their reasoning strategy. Using the Wason (1960) rule identification task, we tested whether chronic and temporary distrust increase the use of negative hypothesis testing strategies suited to falsify one's own initial hunch. In Study 1, participants who were low in dispositional trust were more likely to engage in negative hypothesis testing than participants high in dispositional trust. In Study 2, trust and distrust were induced through an alleged person-memory task. Paralleling the effects of chronic distrust, participants exposed to a single distrust-eliciting face were 3 times as likely to engage in negative hypothesis testing as participants exposed to a trust-eliciting face. In both studies, distrust increased negative hypothesis testing, which was associated with better performance on the Wason task. In contrast, participants' initial rule generation was not consistently affected by distrust. These findings provide first evidence that distrust can influence which reasoning strategy people adopt. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  6. In Defense of the Play-Creativity Hypothesis

    ERIC Educational Resources Information Center

    Silverman, Irwin W.

    2016-01-01

    The hypothesis that pretend play facilitates the creative thought process in children has received a great deal of attention. In a literature review, Lillard et al. (2013, p. 8) concluded that the evidence for this hypothesis was "not convincing." This article focuses on experimental and training studies that have tested this hypothesis.…

  7. The frequentist implications of optional stopping on Bayesian hypothesis tests.

    PubMed

    Sanborn, Adam N; Hills, Thomas T

    2014-04-01

    Null hypothesis significance testing (NHST) is the most commonly used statistical methodology in psychology. The probability of achieving a value as extreme or more extreme than the statistic obtained from the data is evaluated, and if it is low enough, the null hypothesis is rejected. However, because common experimental practice often clashes with the assumptions underlying NHST, these calculated probabilities are often incorrect. Most commonly, experimenters use tests that assume that sample sizes are fixed in advance of data collection but then use the data to determine when to stop; in the limit, experimenters can use data monitoring to guarantee that the null hypothesis will be rejected. Bayesian hypothesis testing (BHT) provides a solution to these ills because the stopping rule used is irrelevant to the calculation of a Bayes factor. In addition, there are strong mathematical guarantees on the frequentist properties of BHT that are comforting for researchers concerned that stopping rules could influence the Bayes factors produced. Here, we show that these guaranteed bounds have limited scope and often do not apply in psychological research. Specifically, we quantitatively demonstrate the impact of optional stopping on the resulting Bayes factors in two common situations: (1) when the truth is a combination of the hypotheses, such as in a heterogeneous population, and (2) when a hypothesis is composite (taking multiple parameter values), such as the alternative hypothesis in a t-test. We found that, for these situations, while the Bayesian interpretation remains correct regardless of the stopping rule used, the choice of stopping rule can, in some situations, greatly increase the chance of experimenters finding evidence in the direction they desire. We suggest ways to control these frequentist implications of stopping rules on BHT.
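
    A small simulation conveys the flavor of the argument. The sketch below uses the BIC approximation to the one-sample Bayes factor rather than the exact Bayes factors analyzed in the paper, and it samples under a point null; the paper's heterogeneous and composite scenarios are where the effects become most pronounced:

    ```python
    # Optional stopping with a BIC-approximated Bayes factor (one-sample test).
    # Illustrative only; not the paper's exact analyses.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def bf10_bic(x):
        """BIC approximation: BF10 ~ (1 + t^2/(n-1))**(n/2) / sqrt(n)."""
        n = len(x)
        t = stats.ttest_1samp(x, 0.0).statistic
        return (1 + t**2 / (n - 1)) ** (n / 2) / np.sqrt(n)

    def run(optional_stop, n_min=10, n_max=200, threshold=3.0):
        """One experiment under H0 (true mean zero); True = 'evidence for H1'."""
        x = list(rng.normal(0, 1, n_min))
        while len(x) < n_max:
            if optional_stop and bf10_bic(np.array(x)) > threshold:
                return True
            x.append(rng.normal(0, 1))
        return bf10_bic(np.array(x)) > threshold

    n_sims = 500
    stop_rate = sum(run(True) for _ in range(n_sims)) / n_sims
    fixed_rate = sum(run(False) for _ in range(n_sims)) / n_sims
    print(f"P(BF10 > 3 | H0): optional stopping {stop_rate:.3f}, "
          f"fixed n {fixed_rate:.3f}")
    ```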

  8. The relationship between motor proficiency and mental health outcomes in young adults: A test of the Environmental Stress Hypothesis.

    PubMed

    Rigoli, D; Kane, R T; Mancini, V; Thornton, A; Licari, M; Hands, B; McIntyre, F; Piek, J

    2017-06-01

    Growing evidence has highlighted the importance of motor proficiency in relation to psychosocial outcomes including self-perceived competence in various domains, perceived social support, and emotional areas such as anxiety and depression. The Environmental Stress Hypothesis-elaborated (Cairney, Rigoli, & Piek, 2013) is a proposed theoretical framework for understanding these relationships, and recent studies have begun examining parts of this model using child and adolescent populations. However, the extent to which the relationships between these areas exist, persist or change during early adulthood is currently unclear. The current study aimed to investigate the Environmental Stress Hypothesis in a sample of 95 young adults aged 18-30 years and examined the mediating role of physical self-worth and perceived social support in the relationship between motor proficiency and internalising symptoms. The McCarron Assessment of Neuromuscular Development (McCarron, 1997) was used to assess motor proficiency, the Depression Anxiety Stress Scale (Lovibond & Lovibond, 1995) provided a measure of internalising symptoms, and the Physical Self Perceptions Profile (Fox & Corbin, 1989) and the Multidimensional Scale of Perceived Social Support (Zimet, Dahlem, Zimet, & Farley, 1988) were used to investigate the possible mediating roles of physical self-worth and perceived social support, respectively. Potential confounding variables such as age, gender and BMI were also considered in the analysis. Structural Equation Modelling revealed that perceived social support mediated the relationship between motor proficiency and internalising symptoms, whereas the mediating role of physical self-worth was non-significant. The current results provide support for part of the model pathways as described in the Environmental Stress Hypothesis and suggest an important relationship between motor proficiency and psychosocial outcomes in young adults. Specifically, the results support previous literature regarding the significant role of perceived social support for mental well-being and suggest that an intervention that considers social support may also indirectly influence mental health outcomes in young adults who experience movement difficulties. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Tailoring biocontrol to maximize top-down effects: on the importance of underlying site fertility.

    PubMed

    Hovick, Stephen M; Carson, Walter P

    2015-01-01

    The degree to which biocontrol agents impact invasive plants varies widely across landscapes, often for unknown reasons. Understanding this variability can help optimize invasive species management while also informing our understanding of trophic linkages. To address these issues, we tested three hypotheses with contrasting predictions regarding the likelihood of biocontrol success. (1) The biocontrol effort hypothesis: invasive populations are regulated primarily by top-down effects, predicting that increased biocontrol efforts alone (e.g., more individuals of a given biocontrol agent or more time since agent release) will enhance biocontrol success. (2) The relative fertility hypothesis: invasive populations are regulated primarily by bottom-up effects, predicting that nutrient enrichment will increase dominance by invasives and thus reduce biocontrol success, regardless of biocontrol efforts. (3) The fertility-dependent biocontrol effort hypothesis: top-down effects will only regulate invasive populations if bottom-up effects are weak. It predicts that greater biocontrol efforts will increase biocontrol success, but only in low-nutrient sites. To test these hypotheses, we surveyed 46 sites across three states with prior releases of Galerucella beetles, the most common biocontrol agents used against invasive purple loosestrife (Lythrum salicaria). We found strong support for the fertility-dependent biocontrol effort hypothesis, as biocontrol success occurred most often with greater biocontrol efforts, but only in low-fertility sites. This result held for early stage metrics of biocontrol success (higher Galerucella abundance) and ultimate biocontrol outcomes (decreased loosestrife plant size and abundance). Presence of the invasive grass Phalaris arundinacea was also inversely related to loosestrife abundance, suggesting that biocontrol-based reductions in loosestrife made secondary invasion by P. arundinacea more likely. Our data suggest that low-nutrient sites be prioritized for loosestrife biocontrol and that future monitoring account for variation in site fertility or work to mitigate it. We introduce a new framework that integrates our findings with conflicting patterns previously reported from other biocontrol systems, proposing a unimodal relationship whereby nutrient availability enhances biocontrol success in low-nutrient sites but hampers it in high-nutrient sites. Our results represent one of the first examples of biocontrol success depending on site fertility, which has the potential to inform biocontrol-based management decisions across entire regions and among contrasting systems.

  10. Evoking and Measuring Identification with Narrative Characters – A Linguistic Cues Framework

    PubMed Central

    van Krieken, Kobie; Hoeken, Hans; Sanders, José

    2017-01-01

    Current research on identification with narrative characters poses two problems. First, although identification is seen as a dynamic process of which the intensity varies during reading, it is usually measured by means of post-reading questionnaires containing self-report items. Second, it is not clear which linguistic characteristics evoke identification. The present paper proposes that an interdisciplinary framework allows for more precise manipulations and measurements of identification, which will ultimately advance our understanding of the antecedents and nature of this process. The central hypothesis of our Linguistic Cues Framework is that identification with a narrative character is a multidimensional experience for which different dimensions are evoked by different linguistic cues. The first part of the paper presents a literature review on identification, resulting in a renewed conceptualization of identification which distinguishes six dimensions: a spatiotemporal, a perceptual, a cognitive, a moral, an emotional, and an embodied dimension. The second part argues that each of these dimensions is influenced by specific linguistic cues which represent various aspects of the narrative character’s perspective. The proposed relations between linguistic cues and identification dimensions are specified in six propositions. The third part discusses what psychological and neurocognitive methods enable the measurement of the various identification dimensions in order to test the propositions. By establishing explicit connections between the linguistic characteristics of narratives and readers’ physical, psychological, and neurocognitive responses to narratives, this paper develops a research agenda for future empirical research on identification with narrative characters. PMID:28751875

  11. Evoking and Measuring Identification with Narrative Characters - A Linguistic Cues Framework.

    PubMed

    van Krieken, Kobie; Hoeken, Hans; Sanders, José

    2017-01-01

    Current research on identification with narrative characters poses two problems. First, although identification is seen as a dynamic process of which the intensity varies during reading, it is usually measured by means of post-reading questionnaires containing self-report items. Second, it is not clear which linguistic characteristics evoke identification. The present paper proposes that an interdisciplinary framework allows for more precise manipulations and measurements of identification, which will ultimately advance our understanding of the antecedents and nature of this process. The central hypothesis of our Linguistic Cues Framework is that identification with a narrative character is a multidimensional experience for which different dimensions are evoked by different linguistic cues. The first part of the paper presents a literature review on identification, resulting in a renewed conceptualization of identification which distinguishes six dimensions: a spatiotemporal, a perceptual, a cognitive, a moral, an emotional, and an embodied dimension. The second part argues that each of these dimensions is influenced by specific linguistic cues which represent various aspects of the narrative character's perspective. The proposed relations between linguistic cues and identification dimensions are specified in six propositions. The third part discusses what psychological and neurocognitive methods enable the measurement of the various identification dimensions in order to test the propositions. By establishing explicit connections between the linguistic characteristics of narratives and readers' physical, psychological, and neurocognitive responses to narratives, this paper develops a research agenda for future empirical research on identification with narrative characters.

  12. Close relationship processes and health: implications of attachment theory for health and disease.

    PubMed

    Pietromonaco, Paula R; Uchino, Bert; Dunkel Schetter, Christine

    2013-05-01

    Health psychology has contributed significantly to understanding the link between psychological factors and health and well-being, but it has not often incorporated advances in relationship science into hypothesis generation and study design. We present one example of a theoretical model, following from a major relationship theory (attachment theory) that integrates relationship constructs and processes with biopsychosocial processes and health outcomes. We briefly describe attachment theory and present a general framework linking it to dyadic relationship processes (relationship behaviors, mediators, and outcomes) and health processes (physiology, affective states, health behavior, and health outcomes). We discuss the utility of the model for research in several health domains (e.g., self-regulation of health behavior, pain, chronic disease) and its implications for interventions and future research. This framework revealed important gaps in knowledge about relationships and health. Future work in this area will benefit from taking into account individual differences in attachment, adopting a more explicit dyadic approach, examining more integrated models that test for mediating processes, and incorporating a broader range of relationship constructs that have implications for health. A theoretical framework for studying health that is based in relationship science can accelerate progress by generating new research directions designed to pinpoint the mechanisms through which close relationships promote or undermine health. Furthermore, this knowledge can be applied to develop more effective interventions to help individuals and their relationship partners with health-related challenges. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  13. Close Relationship Processes and Health: Implications of Attachment Theory for Health and Disease

    PubMed Central

    Pietromonaco, Paula R.; Uchino, Bert; Dunkel Schetter, Christine

    2013-01-01

    Objectives Health psychology has contributed significantly to understanding the link between psychological factors and health and well-being, but it has not often incorporated advances in relationship science into hypothesis generation and study design. We present one example of a theoretical model following from a major relationship theory (attachment theory) that integrates relationship constructs and processes with biopsychosocial processes and health outcomes. Methods We briefly describe attachment theory and present a general framework linking it to dyadic relationship processes (relationship behaviors, mediators and outcomes) and health processes (physiology, affective states, health behavior and health outcomes). We discuss the utility of the model for research in several health domains (e.g., self-regulation of health behavior, pain, chronic disease) and its implications for interventions and future research. Results This framework revealed important gaps in knowledge about relationships and health. Future work in this area will benefit from taking into account individual differences in attachment, adopting a more explicit dyadic approach, examining more integrated models that test for mediating processes, and incorporating a broader range of relationship constructs that have implications for health. Conclusions A theoretical framework for studying health that is based in relationship science can accelerate progress by generating new research directions designed to pinpoint the mechanisms through which close relationships promote or undermine health. Furthermore, this knowledge can be applied to develop more effective interventions to help individuals and their relationship partners with health-related challenges. PMID:23646833

  14. TRANSGENIC MOUSE MODELS AND PARTICULATE MATTER (PM)

    EPA Science Inventory

    The hypothesis to be tested is that metal catalyzed oxidative stress can contribute to the biological effects of particulate matter. We acquired several transgenic mouse strains to test this hypothesis. Breeding of the mice was accomplished by Duke University. Particles employed ...

  15. Hypothesis Testing Using the Films of the Three Stooges

    ERIC Educational Resources Information Center

    Gardner, Robert; Davidson, Robert

    2010-01-01

    The use of The Three Stooges' films as a source of data in an introductory statistics class is described. The Stooges' films are separated into three populations. Using these populations, students may conduct hypothesis tests with data they collect.
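
    The kind of test students might run is simple to reproduce; for example, comparing mean short-film runtimes between two of the populations. The runtimes below are hypothetical placeholders, not measurements from the films:

    ```python
    # Two-sample comparison in the spirit of the classroom exercise.
    from scipy import stats

    curly_runtimes = [17.2, 16.8, 18.1, 17.5, 16.9, 17.8, 18.0, 16.5]  # minutes
    shemp_runtimes = [16.1, 15.9, 16.8, 16.3, 15.7, 16.5, 16.0, 16.2]

    t, p = stats.ttest_ind(curly_runtimes, shemp_runtimes, equal_var=False)
    print(f"Welch's t = {t:.2f}, p = {p:.4f}")
    # Compare p to a chosen alpha (e.g., 0.05) to decide whether to reject the
    # null hypothesis of equal mean runtimes.
    ```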

  16. Hybridisation is associated with increased fecundity and size in invasive taxa: meta-analytic support for the hybridisation-invasion hypothesis

    PubMed Central

    Hovick, Stephen M; Whitney, Kenneth D

    2014-01-01

    The hypothesis that interspecific hybridisation promotes invasiveness has received much recent attention, but tests of the hypothesis can suffer from important limitations. Here, we provide the first systematic review of studies experimentally testing the hybridisation-invasion (H-I) hypothesis in plants, animals and fungi. We identified 72 hybrid systems for which hybridisation has been putatively associated with invasiveness, weediness or range expansion. Within this group, 15 systems (comprising 34 studies) experimentally tested performance of hybrids vs. their parental species and met our other criteria. Both phylogenetic and non-phylogenetic meta-analyses demonstrated that wild hybrids were significantly more fecund and larger than their parental taxa, but did not differ in survival. Resynthesised hybrids (which typically represent earlier generations than do wild hybrids) did not consistently differ from parental species in fecundity, survival or size. Using meta-regression, we found that fecundity increased (but survival decreased) with generation in resynthesised hybrids, suggesting that natural selection can play an important role in shaping hybrid performance – and thus invasiveness – over time. We conclude that the available evidence supports the H-I hypothesis, with the caveat that our results are clearly driven by tests in plants, which are more numerous than tests in animals and fungi. PMID:25234578

  17. The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.

    PubMed

    Lash, Timothy L

    2017-09-15

    In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Scaling of Topologically Similar Functional Modules Defines Mouse Primary Auditory and Somatosensory Microcircuitry

    PubMed Central

    Sadovsky, Alexander J.

    2013-01-01

    Mapping the flow of activity through neocortical microcircuits provides key insights into the underlying circuit architecture. Using a comparative analysis we determined the extent to which the dynamics of microcircuits in mouse primary somatosensory barrel field (S1BF) and auditory (A1) neocortex generalize. We imaged the simultaneous dynamics of up to 1126 neurons spanning multiple columns and layers using high-speed multiphoton imaging. The temporal progression and reliability of reactivation of circuit events in both regions suggested common underlying cortical design features. We used circuit activity flow to generate functional connectivity maps, or graphs, to test the microcircuit hypothesis within a functional framework. S1BF and A1 present a useful test of the postulate as both regions map sensory input anatomically, but each area appears organized according to different design principles. We projected the functional topologies into anatomical space and found benchmarks of organization that had been previously described using physiology and anatomical methods, consistent with a close mapping between anatomy and functional dynamics. By comparing graphs representing activity flow we found that each region is similarly organized as highlighted by hallmarks of small world, scale free, and hierarchical modular topologies. Models of prototypical functional circuits from each area of cortex were sufficient to recapitulate experimentally observed circuit activity. Convergence to common behavior by these models was accomplished using preferential attachment to scale from an auditory up to a somatosensory circuit. These functional data imply that the microcircuit hypothesis be framed as scalable principles of neocortical circuit design. PMID:23986241
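
    Testing the small-world hallmark on a functional connectivity graph typically means comparing clustering and path length against a density-matched random graph. A sketch with a stand-in graph in place of imaging-derived data:

    ```python
    # Small-world check: high clustering, short paths relative to a random
    # graph of equal size and density. Stand-in graph, not imaging data.
    import networkx as nx

    G = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.1, seed=0)
    R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0)
    if not nx.is_connected(R):  # path length is undefined on disconnected graphs
        R = R.subgraph(max(nx.connected_components(R), key=len)).copy()

    C, C_rand = nx.average_clustering(G), nx.average_clustering(R)
    L, L_rand = (nx.average_shortest_path_length(g) for g in (G, R))

    sigma = (C / C_rand) / (L / L_rand)  # small-world index; >> 1 is small-world
    print(f"C/C_rand = {C / C_rand:.2f}, L/L_rand = {L / L_rand:.2f}, "
          f"sigma = {sigma:.2f}")
    ```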

  19. A common optimization principle for motor execution in healthy subjects and parkinsonian patients.

    PubMed

    Baraduc, Pierre; Thobois, Stéphane; Gan, Jing; Broussolle, Emmanuel; Desmurget, Michel

    2013-01-09

    Recent research on Parkinson's disease (PD) has emphasized that parkinsonian movement, although bradykinetic, shares many attributes with healthy behavior. This observation led to the suggestion that bradykinesia in PD could be due to a reduction in motor motivation. This hypothesis can be tested in the framework of optimal control theory, which accounts for many characteristics of healthy human movement while providing a link between the motor behavior and a cost/benefit trade-off. This approach offers the opportunity to interpret movement deficits of PD patients in the light of a computational theory of normal motor control. We studied 14 PD patients with bilateral subthalamic nucleus (STN) stimulation and 16 age-matched healthy controls, and tested whether reaching movements were governed by similar rules in these two groups. A single optimal control model accounted for the reaching movements of healthy subjects and PD patients, whatever the condition of STN stimulation (on or off). The choice of movement speed was explained in all subjects by the existence of a preset dynamic range for the motor signals. This range was idiosyncratic and applied to all movements regardless of their amplitude. In PD patients this dynamic range was abnormally narrow and correlated with bradykinesia. STN stimulation reduced bradykinesia and widened this range in all patients, but did not restore it to a normal value. These results, consistent with the motor motivation hypothesis, suggest that constrained optimization of motor effort is the main determinant of movement planning (choice of speed) and movement production, in both healthy and PD subjects.

  20. Phylogenetic relationships among the North American cleomoids (Cleomaceae): a test of Iltis's reduction series.

    PubMed

    Riser, James P; Cardinal-McTeague, Warren M; Hall, Jocelyn C; Hahn, William J; Sytsma, Kenneth J; Roalson, Eric H

    2013-10-01

    A monophyletic group composed of five genera of the Cleomaceae represents an intriguing lineage with outstanding taxonomic and evolutionary questions. Generic boundaries are poorly defined, and historical hypotheses regarding the evolution of fruit type and phylogenetic relationships provide testable questions. This is the first detailed phylogenetic investigation of all 22 species in this group. We use this phylogenetic framework to assess generic monophyly and test Iltis's evolutionary "reduction series" hypothesis regarding phylogeny and fruit type/seed number. • Maximum likelihood and Bayesian analyses of four plastid intergenic spacer region sequences (rpl32-trnL, trnQ-rps16, ycf1-rps15, and psbA-trnH) and one nuclear (ITS) region were used to reconstruct phylogenetic relationships among the NA cleomoid species. Stochastic mapping and ancestral-state reconstruction were used to study the evolution of fruit type. • Both analyses recovered nearly identical phylogenies. Three of the currently recognized genera (Wislizenia, Carsonia, and Oxystylis) are monophyletic while two (Cleomella and Peritoma) are para- or polyphyletic. There was a single origin of the two-seeded schizocarp in the ancestor of the Oxystylis-Wislizenia clade and a secondary derivation of elongated capsule-type fruits in Peritoma from a truncated capsule state in Cleomella. • Our well-resolved phylogeny supports most of the current species circumscriptions but not current generic circumscriptions. Additionally, our results are inconsistent with Iltis's hypothesis of species with elongated many-seed fruits giving rise to species with truncated few-seeded fruits. Instead, we find support for the reversion to elongated multiseeded fruits from a truncate few-seeded ancestor in Peritoma.

  1. Cross multivariate correlation coefficients as screening tool for analysis of concurrent EEG-fMRI recordings.

    PubMed

    Ji, Hong; Petro, Nathan M; Chen, Badong; Yuan, Zejian; Wang, Jianji; Zheng, Nanning; Keil, Andreas

    2018-02-06

    Over the past decade, the simultaneous recording of electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI) data has garnered growing interest because it may provide an avenue towards combining the strengths of both imaging modalities. Given their pronounced differences in temporal and spatial statistics, the combination of EEG and fMRI data is, however, methodologically challenging. Here, we propose a novel screening approach that relies on a Cross Multivariate Correlation Coefficient (xMCC) framework. This approach accomplishes three tasks: (1) it provides a measure for testing multivariate correlation and multivariate uncorrelation of the two modalities; (2) it provides a criterion for the selection of EEG features; (3) it performs a screening of relevant EEG information by grouping the EEG channels into clusters to improve efficiency and to reduce computational load when searching for the best predictors of the BOLD signal. The present report applies this approach to a data set with concurrent recordings of steady-state visual evoked potentials (ssVEPs) and fMRI, recorded while observers viewed phase-reversing Gabor patches. We test the hypothesis that fluctuations in visuo-cortical mass potentials systematically covary with BOLD fluctuations not only in visual cortical, but also in anterior temporal and prefrontal areas. Results supported the hypothesis and showed that the xMCC-based analysis provides straightforward identification of neurophysiologically plausible brain regions with EEG-fMRI covariance. Furthermore, xMCC converged with other extant methods for EEG-fMRI analysis. © 2018 The Authors Journal of Neuroscience Research Published by Wiley Periodicals, Inc.
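
    The xMCC statistic is the authors' own construct; as a rough analogue, canonical correlation analysis between an EEG feature block and BOLD time courses illustrates the general idea of a multivariate covariance screen (synthetic data, sklearn's CCA, explicitly not the authors' method):

    ```python
    # Multivariate EEG-fMRI covariance screen; CCA as a stand-in for xMCC.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(7)
    n_time = 300
    eeg = rng.normal(size=(n_time, 32))            # 32 EEG channel features
    shared = eeg[:, :3] @ rng.normal(size=(3, 5))  # latent shared signal
    bold = shared + rng.normal(size=(n_time, 5))   # 5 fMRI ROI time courses

    u, v = CCA(n_components=1).fit_transform(eeg, bold)
    r = np.corrcoef(u[:, 0], v[:, 0])[0, 1]
    print(f"first canonical correlation: {r:.2f}")
    ```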

  2. Does job burnout mediate negative effects of job demands on mental and physical health in a group of teachers? Testing the energetic process of Job Demands-Resources model.

    PubMed

    Baka, Łukasz

    2015-01-01

    The aim of the study was to investigate the direct and indirect - mediated by job burnout - effects of job demands on mental and physical health problems. The Job Demands-Resources model was the theoretical framework of the study. Three job demands were taken into account - interpersonal conflicts at work, organizational constraints and workload. Indicators of mental and physical health problems included depression and physical symptoms, respectively. Three hundred and sixteen Polish teachers from 8 schools participated in the study. The hypotheses were tested with the use of tools measuring job demands (Interpersonal Conflicts at Work, Organizational Constraints, Quantitative Workload), job burnout (the Oldenburg Burnout Inventory), depression (the Beck Hopelessness Scale), and physical symptoms (the Physical Symptoms Inventory). Regression analysis with bootstrapping, using Hayes's PROCESS macro, was applied. The results support the hypotheses partially. The indirect effect and, to some extent, the direct effect of job demands turned out to be statistically significant. The negative impact of the 3 job demands on mental (hypothesis 1 - H1) and physical (hypothesis 2 - H2) health was mediated by increasing job burnout. Only organizational constraints were directly associated with mental (and not physical) health. The results partially support the notion of the Job Demands-Resources model and provide further insight into processes leading to the low well-being of teachers in the workplace. This work is available in the Open Access model and licensed under a CC BY-NC 3.0 PL license.
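
    [Editor's note] A minimal sketch of the bootstrapped indirect effect underlying this kind of mediation analysis (the study itself used Hayes's PROCESS macro; the variables and helper names here are hypothetical stand-ins).

        import numpy as np

        def indirect_effect(x, m, y):
            # a-path: slope of mediator on predictor; b-path: slope of outcome
            # on mediator, adjusting for the predictor. Indirect effect = a * b.
            a = np.polyfit(x, m, 1)[0]
            design = np.column_stack([np.ones_like(x), x, m])
            b = np.linalg.lstsq(design, y, rcond=None)[0][2]
            return a * b

        def bootstrap_indirect_ci(x, m, y, n_boot=5000, alpha=0.05, seed=0):
            # Percentile bootstrap confidence interval for the indirect effect.
            rng = np.random.default_rng(seed)
            n = len(x)
            estimates = np.empty(n_boot)
            for i in range(n_boot):
                idx = rng.integers(0, n, n)
                estimates[i] = indirect_effect(x[idx], m[idx], y[idx])
            return np.quantile(estimates, [alpha / 2, 1 - alpha / 2])

    Mediation is supported when the bootstrap interval for a * b excludes zero.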

  3. Optimality and stability of intentional and unintentional actions: I. Origins of drifts in performance.

    PubMed

    Parsa, Behnoosh; Terekhov, Alexander; Zatsiorsky, Vladimir M; Latash, Mark L

    2017-02-01

    We address the nature of unintentional changes in performance in two papers. This first paper tested a hypothesis that unintentional changes in performance variables during continuous tasks without visual feedback are due to two processes. First, there is a drift of the referent coordinate for the salient performance variable toward the actual coordinate of the effector. Second, there is a drift toward minimum of a cost function. We tested this hypothesis in four-finger isometric pressing tasks that required the accurate production of a combination of total moment and total force with natural and modified finger involvement. Subjects performed accurate force-moment production tasks under visual feedback, and then visual feedback was removed for some or all of the salient variables. Analytical inverse optimization was used to compute a cost function. Without visual feedback, both force and moment drifted slowly toward lower absolute magnitudes. Over 15 s, the force drop could reach 20% of its initial magnitude while moment drop could reach 30% of its initial magnitude. Individual finger forces could show drifts toward both higher and lower forces. The cost function estimated using the analytical inverse optimization reduced its value as a consequence of the drift. We interpret the results within the framework of hierarchical control with referent spatial coordinates for salient variables at each level of the hierarchy combined with synergic control of salient variables. The force drift is discussed as a natural relaxation process toward states with lower potential energy in the physical (physiological) system involved in the task.

  4. Optimality and stability of intentional and unintentional actions: I. Origins of drifts in performance

    PubMed Central

    Parsa, Behnoosh; Terekhov, Alexander; Zatsiorsky, Vladimir M.; Latash, Mark L.

    2016-01-01

    We address the nature of unintentional changes in performance in two papers. This first paper tested a hypothesis that unintentional changes in performance variables during continuous tasks without visual feedback are due to two processes. First, there is a drift of the referent coordinate for the salient performance variable toward the actual coordinate of the effector. Second, there is a drift toward minimum of a cost function. We tested this hypothesis in four-finger isometric pressing tasks that required the accurate production of a combination of total moment and total force with natural and modified finger involvement. Subjects performed accurate force/moment production tasks under visual feedback, and then visual feedback was removed for some or all of the salient variables. Analytical inverse optimization was used to compute a cost function. Without visual feedback, both force and moment drifted slowly toward lower absolute magnitudes. Over 15 s, the force drop could reach 20% of its initial magnitude while moment drop could reach 30% of its initial magnitude. Individual finger forces could show drifts toward both higher and lower forces. The cost function estimated using the analytical inverse optimization reduced its value as a consequence of the drift. We interpret the results within the framework of hierarchical control with referent spatial coordinates for salient variables at each level of the hierarchy combined with synergic control of salient variables. The force drift is discussed as a natural relaxation process toward states with lower potential energy in the physical (physiological) system involved in the task. PMID:27785549

  5. The Impact of Economic Factors and Acquisition Reforms on the Cost of Defense Weapon Systems

    DTIC Science & Technology

    2006-03-01

    To test for homoskedasticity, the Breusch-Pagan test is employed. The null hypothesis of the Breusch-Pagan test is that the error variance is constant... Using the Breusch-Pagan test shown in Table 19, prob > chi2 is greater than α = .05; therefore, we fail to reject the null hypothesis... Breusch-Pagan Test (Ho = constant variance), estimated results: variance and standard deviation for overrunpercentfp100 and overrunpercent100.
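
    [Editor's note] For readers unfamiliar with the test, a minimal reproduction of a Breusch-Pagan check on simulated data using statsmodels (the report's own variables are not available here, so the data below are synthetic):

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.diagnostic import het_breuschpagan

        rng = np.random.default_rng(0)
        X = sm.add_constant(rng.normal(size=(200, 2)))
        y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=200)  # homoskedastic errors

        resid = sm.OLS(y, X).fit().resid
        lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(resid, X)
        # As in the report: if lm_pvalue > 0.05, fail to reject constant variance.
        print(lm_stat, lm_pvalue)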

  6. An Inferential Confidence Interval Method of Establishing Statistical Equivalence that Corrects Tryon's (2001) Reduction Factor

    ERIC Educational Resources Information Center

    Tryon, Warren W.; Lewis, Charles

    2008-01-01

    Evidence of group matching frequently takes the form of a nonsignificant test of statistical difference. Theoretical hypotheses of no difference are also tested in this way. These practices are flawed in that null hypothesis statistical testing provides evidence against the null hypothesis and failing to reject H₀ is not evidence…

  7. Effects of Item Exposure for Conventional Examinations in a Continuous Testing Environment.

    ERIC Educational Resources Information Center

    Hertz, Norman R.; Chinn, Roberta N.

    This study explored the effect of item exposure on two conventional examinations administered as computer-based tests. A principal hypothesis was that item exposure would have little or no effect on average difficulty of the items over the course of an administrative cycle. This hypothesis was tested by exploring conventional item statistics and…

  8. Directional and Non-directional Hypothesis Testing: A Survey of SIG Members, Journals, and Textbooks.

    ERIC Educational Resources Information Center

    McNeil, Keith

    The use of directional and nondirectional hypothesis testing was examined from the perspectives of textbooks, journal articles, and members of editorial boards. Three widely used statistical texts were reviewed in terms of how directional and nondirectional tests of significance were presented. Texts reviewed were written by: (1) D. E. Hinkle, W.…

  9. The Feminization of School Hypothesis Called into Question among Junior and High School Students

    ERIC Educational Resources Information Center

    Verniers, Catherine; Martinot, Delphine; Dompnier, Benoît

    2016-01-01

    Background: The feminization of school hypothesis suggests that boys underachieve in school compared to girls because school rewards feminine characteristics that are at odds with boys' masculine features. Aims: The feminization of school hypothesis lacks empirical evidence. The aim of this study was to test this hypothesis by examining the extent…

  10. Marital Conflict and Early Adolescents' Self-Evaluation: The Role of Parenting Quality and Early Adolescents' Appraisals

    ERIC Educational Resources Information Center

    Siffert, Andrea; Schwarz, Beate; Stutz, Melanie

    2012-01-01

    Cognitive appraisals and family dynamics have been identified as mediators of the relationship between marital conflict and children's adjustment. Surprisingly little research has investigated both mediational processes in the same study. Guided by the cognitive-contextual framework and the spillover hypothesis, the present study integrated…

  11. Multi-Tiered System of Support: Best Differentiation Practices for English Language Learners in Tier 1

    ERIC Educational Resources Information Center

    Izaguirre, Cecilia

    2017-01-01

    Purpose: This qualitative case study explored the best practices of differentiation of Tier 1 instruction within a multi-tiered system of support for English Language Learners who were predominantly Spanish-speaking. Theoretical Framework: The zone of proximal development theory, cognitive theory, and the affective filter hypothesis guided this…

  12. Soilborne fungi have host affinity and host-specific effects on seed germination and survival in a lowland tropical forest

    USDA-ARS?s Scientific Manuscript database

    The Janzen-Connell (JC) hypothesis provides a powerful framework for explaining the maintenance of tree diversity in tropical forests. Its central tenet -- that recruits experience high mortality near conspecifics and at high densities -- assumes a degree of host specialization in interactions betwe...

  13. Substituted-Letter and Transposed-Letter Effects in a Masked Priming Paradigm with French Developing Readers and Dyslexics

    ERIC Educational Resources Information Center

    Lete, Bernard; Fayol, Michel

    2013-01-01

    The aim of the study was to undertake a behavioral investigation of the development of automatic orthographic processing during reading acquisition in French. Following Castles and colleagues' 2007 study ("Journal of Experimental Child Psychology," 97, 165-182) and their lexical tuning hypothesis framework, substituted-letter and…

  14. Negotiation of Meaning in Synchronous Computer-Mediated Communication in Relation to Task Types

    ERIC Educational Resources Information Center

    Cho, Hye-jin

    2011-01-01

    The present study explored how negotiation of meaning occurred in a task-based synchronous computer-mediated communication (SCMC) environment among college English learners. Based on the theoretical framework of the interaction hypothesis and negotiation of meaning, four research questions arose: (1) how negotiation of meaning occurs in non-native…

  15. Personality Traits, Vocational Interests, and Career Exploration: A Cross-Cultural Comparison between American and Hong Kong Students

    ERIC Educational Resources Information Center

    Fan, Weiqiao; Cheung, Fanny M.; Leong, Frederick T. L.; Cheung, Shu Fai

    2012-01-01

    This study compared the pattern of relationships among personality, vocational interests, and career exploration within an integrated framework between 369 American and 392 Hong Kong university students. The first hypothesis predicted differential contributions of the universal and indigenous personality dimensions based on the Cross-cultural…

  16. The Narrative Worlds of "What Is" and "What if"

    ERIC Educational Resources Information Center

    Engel, Susan

    2005-01-01

    This paper advances the hypothesis that young children use narrative play and stories to construct two types of fiction, the worlds of "what is" and "what if." Heinz Werner's conceptualization of children's spheres of reality, in which actions, symbols, and events are constructed in particular ways, is used as a theoretical framework for…

  17. Minnows as a Classroom Model for Human Environmental Health

    ERIC Educational Resources Information Center

    Weber, Daniel N.; Hesselbach, Renee; Kane, Andrew S.; Petering, David H.; Petering, Louise; Berg, Craig A.

    2013-01-01

    Understanding human environmental health is difficult for high school students, as is the process of scientific investigation. This module provides a framework to address both concerns through an inquiry-based approach using a hypothesis-driven set of experiments that draws upon a real-life concern, environmental exposures to lead (Pb2+). Students…

  18. The limits to pride: A test of the pro-anorexia hypothesis.

    PubMed

    Cornelius, Talea; Blanton, Hart

    2016-01-01

    Many social psychological models propose that positive self-conceptions promote self-esteem. An extreme version of this hypothesis is advanced in "pro-anorexia" communities: identifying with anorexia, in conjunction with disordered eating, can lead to higher self-esteem. The current study empirically tested this hypothesis. Results challenge the pro-anorexia hypothesis. Although those with higher levels of pro-anorexia identification trended towards higher self-esteem with increased disordered eating, this did not overcome the strong negative main effect of pro-anorexia identification. These data suggest a more effective strategy for promoting self-esteem is to encourage rejection of disordered eating and an anorexic identity.

  19. Decision making and sequential sampling from memory

    PubMed Central

    Shadlen, Michael N.; Shohamy, Daphna

    2016-01-01

    Decisions take time, and as a rule more difficult decisions take more time. But this only raises the question of what consumes the time. For decisions informed by a sequence of samples of evidence, the answer is straightforward: more samples are available with more time. Indeed the speed and accuracy of such decisions are explained by the accumulation of evidence to a threshold or bound. However, the same framework seems to apply to decisions that are not obviously informed by sequences of evidence samples. Here we proffer the hypothesis that the sequential character of such tasks involves retrieval of evidence from memory. We explore this hypothesis by focusing on value-based decisions and argue that mnemonic processes can account for regularities in choice and decision time. We speculate on the neural mechanisms that link sampling of evidence from memory to circuits that represent the accumulated evidence bearing on a choice. We propose that memory processes may contribute to a wider class of decisions that conform to the regularities of choice-reaction time predicted by the sequential sampling framework. PMID:27253447
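
    [Editor's note] A toy simulation of the evidence-accumulation-to-bound mechanism the authors invoke (parameters are arbitrary illustrations, not fitted values from the paper):

        import numpy as np

        def diffusion_trial(drift, bound, dt=0.001, noise=1.0, rng=None):
            # Accumulate noisy evidence until it crosses +bound or -bound;
            # harder decisions (smaller drift) take longer on average.
            rng = rng or np.random.default_rng()
            x, t = 0.0, 0.0
            while abs(x) < bound:
                x += drift * dt + noise * np.sqrt(dt) * rng.normal()
                t += dt
            return (1 if x > 0 else -1), t

        rng = np.random.default_rng(1)
        trials = [diffusion_trial(drift=0.5, bound=1.0, rng=rng) for _ in range(500)]
        accuracy = np.mean([choice == 1 for choice, _ in trials])
        mean_rt = np.mean([t for _, t in trials])
        print(accuracy, mean_rt)

    Under the authors' hypothesis, the successive increments would come from memory retrieval rather than from external sensory samples.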

  20. FBK Optical Data Association in a Multi-Hypothesis Framework with Maneuvers

    NASA Astrophysics Data System (ADS)

    Faber, W. R.; Hussein, I. I.; Kent, J. T.; Bhattacharjee, S.; Jah, M. K.

    In Space Situational Awareness (SSA), one may encounter scenarios where the measurements received at a certain time do not correlate to a known Resident Space Object (RSO). Without information that uniquely assigns the measurement to a particular RSO, there can be no certainty about the identity of the object. It could be that the measurement was produced by clutter or perhaps a newly birthed RSO. It is also a possibility that the measurement came from a previously known object that maneuvered away from its predicted location. Typically, tracking methods associate uncorrelated measurements with new objects and wait for more information to determine the true RSO population. This can lead to the loss of object custody. The goal of this paper is to utilize a multiple hypothesis framework, coupled with some knowledge of RSO maneuvers, that allows the user to maintain object custody in scenarios with uncorrelated optical measurement returns. This is achieved by fitting a Fisher-Bingham-Kent type distribution to the hypothesized maneuvers for accurate data association using directional discriminant analysis.

  1. Does the Slow-Growth, High-Mortality Hypothesis Apply Below Ground?

    PubMed

    Hourston, James E; Bennett, Alison E; Johnson, Scott N; Gange, Alan C

    2016-01-01

    Belowground tri-trophic study systems present a challenging environment in which to study plant-herbivore-natural enemy interactions. For this reason, belowground examples are rarely available for testing general ecological theories. To redress this imbalance, we present, for the first time, data on a belowground tri-trophic system to test the slow-growth, high-mortality hypothesis. We investigated whether the differing performance of entomopathogenic nematodes (EPNs) in controlling the common pest black vine weevil Otiorhynchus sulcatus could be linked to differently resistant cultivars of the red raspberry Rubus idaeus. The O. sulcatus larvae recovered from R. idaeus plants showed significantly slower growth and higher mortality on the Glen Rosa cultivar, relative to the more commercially favored Glen Ample cultivar, creating a convenient system for testing this hypothesis. Heterorhabditis megidis was found to be less effective at controlling O. sulcatus than Steinernema kraussei, but conformed to the hypothesis. However, S. kraussei maintained high levels of O. sulcatus mortality regardless of how larval growth was influenced by R. idaeus cultivar. We link this to direct effects that S. kraussei had on reducing O. sulcatus larval mass, indicating potential sub-lethal effects of S. kraussei, which the slow-growth, high-mortality hypothesis does not account for. Possible origins of these sub-lethal effects of EPN infection and how they may impact on a hypothesis designed and tested with aboveground predator and parasitoid systems are discussed.

  2. The epistemological function of Hill's criteria.

    PubMed

    Bird, Alexander

    2011-10-01

    This article outlines an epistemological framework for understanding how Hill's criteria may aid us in establishing a causal hypothesis (A causes B) in an observational study. We consider Hill's criteria in turn with respect to their ability or otherwise to exclude alternative hypotheses (B causes A; there is a common cause of A and B; there is no causal connection between A and B). We may classify Hill's criteria according to which of the alternative hypotheses they are able to exclude, and also on the basis of whether they relate to (a) evidence from within the observational study or (b) evidence independent of that study. It is noted that no criterion is able to exclude the common cause hypothesis in a systematic way. Observational studies are typically weaker than experimental studies, since the latter can systematically exclude competing hypotheses, whereas observational studies lack a systematic way of ruling out the common cause hypothesis. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Mirror neurons and the social nature of language: the neural exploitation hypothesis.

    PubMed

    Gallese, Vittorio

    2008-01-01

    This paper discusses the relevance of the discovery of mirror neurons in monkeys and of the mirror neuron system in humans to a neuroscientific account of primates' social cognition and its evolution. It is proposed that mirror neurons and the functional mechanism they underpin, embodied simulation, can ground important aspects of human social cognition within a unitary neurophysiological explanatory framework. In particular, the main focus is on language, here conceived according to a neurophenomenological perspective, grounding meaning on the social experience of action. A neurophysiological hypothesis--the "neural exploitation hypothesis"--is introduced to explain how key aspects of human social cognition are underpinned by brain mechanisms originally evolved for sensorimotor integration. It is proposed that these mechanisms were later adapted as a new neurofunctional architecture for thought and language, while retaining their original functions as well. By neural exploitation, social cognition and language can be linked to the experiential domain of action.

  4. Integrating prior knowledge in multiple testing under dependence with applications to detecting differential DNA methylation.

    PubMed

    Kuan, Pei Fen; Chiang, Derek Y

    2012-09-01

    DNA methylation has emerged as an important hallmark of epigenetics. Numerous platforms, including tiling arrays and next-generation sequencing, and experimental protocols are available for profiling DNA methylation. Similar to other tiling array data, DNA methylation data shares the characteristic of an inherent correlation structure among nearby probes. However, unlike gene expression or protein-DNA binding data, the varying CpG density, which gives rise to the CpG island, shore, and shelf definitions, provides exogenous information in detecting differential methylation. This article aims to introduce a robust testing and probe ranking procedure based on a nonhomogeneous hidden Markov model that incorporates the above-mentioned features for detecting differential methylation. We revisit the seminal work of Sun and Cai (2009, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 71, 393-424) and propose modeling the nonnull using a nonparametric symmetric distribution in two-sided hypothesis testing. We show that this model improves probe ranking and is robust to model misspecification based on extensive simulation studies. We further illustrate that our proposed framework achieves good operating characteristics as compared to commonly used methods in real DNA methylation data that aims to detect differential methylation sites. © 2012, The International Biometric Society.
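
    [Editor's note] The paper's model is richer than can be shown here, but its core machinery, forward-backward smoothing with position-specific transition matrices (e.g., indexed by local CpG density), might look like the following hedged sketch:

        import numpy as np
        from scipy.special import logsumexp

        def forward_backward(log_emis, trans, init):
            # log_emis: (T, K) log-likelihoods of each probe under each hidden state
            # trans:    (T-1, K, K) position-specific transition matrices
            # init:     (K,) initial state distribution
            # Returns (T, K) posterior state probabilities per probe.
            T, K = log_emis.shape
            alpha = np.zeros((T, K))
            beta = np.zeros((T, K))
            alpha[0] = np.log(init) + log_emis[0]
            for t in range(1, T):
                alpha[t] = log_emis[t] + logsumexp(
                    alpha[t - 1][:, None] + np.log(trans[t - 1]), axis=0)
            for t in range(T - 2, -1, -1):
                beta[t] = logsumexp(
                    np.log(trans[t]) + log_emis[t + 1] + beta[t + 1], axis=1)
            log_post = alpha + beta
            log_post -= logsumexp(log_post, axis=1, keepdims=True)
            return np.exp(log_post)

    Probes would then be ranked by their posterior probability of occupying the "differentially methylated" state.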

  5. Psychosocial work conditions, perceived stress, perceived muscular tension, and neck/shoulder symptoms among medical secretaries.

    PubMed

    Larsman, Pernilla; Kadefors, Roland; Sandsjö, Leif

    2013-01-01

    Unfavorable psychosocial working conditions are hypothesized to lead to perceived stress, which, in turn, can be related to an increased risk of development of neck/shoulder symptoms through increased and sustained muscle activation. The aim of the present study was to test this hypothesized process model among medical secretaries, a female-dominated profession characterized by a high amount of visual display unit use and a high prevalence of neck/shoulder symptoms. In this cross-sectional study, a questionnaire survey was conducted among medical secretaries (n = 200). The proposed process model was tested using a path model framework. The results indicate that high work demands were related to high perceived stress, which in turn was related to a high perceived muscle tension and neck/shoulder symptoms. Low influence at work was not related to perceived stress, but was directly related to a high perceived muscle tension. In general, these cross-sectional results lend tentative support for the hypothesis that adverse psychosocial work conditions (high work demands) may contribute to the development of neck/shoulder symptoms through the mechanism of stress-induced sustained muscular activation. This process model needs to be further tested in longitudinal studies.

  6. Goodness of fit of probability distributions for sightings as species approach extinction.

    PubMed

    Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael

    2009-04-01

    Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
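
    [Editor's note] A minimal version of a PPCC goodness-of-fit check for the uniform model (the plotting positions and Monte Carlo calibration below are one common choice, not necessarily the authors'; the sighting years are invented):

        import numpy as np

        def ppcc_uniform(x):
            # Correlation between the ordered sightings and uniform plotting positions.
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            positions = (np.arange(1, n + 1) - 0.5) / n
            return np.corrcoef(x, positions)[0, 1]

        def ppcc_critical_value(n, alpha=0.05, n_sim=10000, seed=0):
            # Monte Carlo null distribution: PPCC of samples truly drawn from uniform.
            rng = np.random.default_rng(seed)
            sims = np.array([ppcc_uniform(rng.uniform(size=n)) for _ in range(n_sim)])
            return np.quantile(sims, alpha)

        sightings = [1843, 1850, 1861, 1874, 1880, 1901]   # hypothetical sighting years
        r = ppcc_uniform(sightings)
        reject = r < ppcc_critical_value(len(sightings))   # reject uniformity if PPCC too low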

  7. A critique of statistical hypothesis testing in clinical research

    PubMed Central

    Raha, Somik

    2011-01-01

    Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs for all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from a practical and philosophical perspective. In the philosophical critique, the two main worldviews of probability are the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview, requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning with different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability on an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of Aspirin on heart attacks in a sample population of doctors. Because a major reason for the prevalence of RCTs in academia is legislation requiring it, the ethics of legislating the use of statistical methods for clinical research is also examined. PMID:22022152
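
    [Editor's note] A sketch of the kind of Bayesian alternative the author describes: a beta-binomial comparison of event rates between two trial arms. The counts below are illustrative approximations in the spirit of the aspirin example, not the trial's reported data.

        import numpy as np

        rng = np.random.default_rng(0)
        # Illustrative counts: heart attacks / subjects per arm.
        events_aspirin, n_aspirin = 104, 11037
        events_placebo, n_placebo = 189, 11034

        # Beta(1, 1) priors updated by the observed counts; sample the posteriors.
        post_aspirin = rng.beta(1 + events_aspirin, 1 + n_aspirin - events_aspirin, 200000)
        post_placebo = rng.beta(1 + events_placebo, 1 + n_placebo - events_placebo, 200000)

        p_benefit = np.mean(post_aspirin < post_placebo)   # P(aspirin rate is lower)
        risk_reduction = np.mean(post_placebo - post_aspirin)  # expected absolute risk reduction
        print(p_benefit, risk_reduction)

    A decision would then weigh the posterior risk reduction against costs and side effects, rather than thresholding a p-value.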

  8. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
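
    [Editor's note] A compact sketch of the minimum p-value approach with permutation calibration; the choice of candidate statistics here is illustrative, not the authors' prespecified set.

        import numpy as np
        from scipy import stats

        def stat_vector(a, b):
            # Three prespecified two-sample statistics; larger means more evidence.
            return np.array([
                abs(stats.ttest_ind(a, b, equal_var=False).statistic),
                abs(stats.mannwhitneyu(a, b).statistic - len(a) * len(b) / 2.0),
                abs(np.median(a) - np.median(b)),
            ])

        def minp_permutation_test(x, y, n_perm=2000, seed=0):
            rng = np.random.default_rng(seed)
            data = np.concatenate([x, y])
            n_x = len(x)
            observed = stat_vector(x, y)
            perms = np.array([
                stat_vector(*np.split(data[rng.permutation(data.size)], [n_x]))
                for _ in range(n_perm)])
            # Per-statistic permutation p-values for the observed data ...
            p_obs = (1 + (perms >= observed).sum(axis=0)) / (n_perm + 1)
            # ... and for every permutation (rank from the top within each column).
            p_null = (n_perm - perms.argsort(axis=0).argsort(axis=0)) / n_perm
            # Overall p-value: how often a permutation's min-p beats the observed min-p.
            return np.mean(p_null.min(axis=1) <= p_obs.min())

    Because every candidate statistic is recomputed on each permutation, the min-p null distribution automatically accounts for the multiplicity and the correlation among the statistics.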

  9. Bayesian population receptive field modelling.

    PubMed

    Zeidman, Peter; Silson, Edward Harry; Schwarzkopf, Dietrich Samuel; Baker, Chris Ian; Penny, Will

    2017-09-08

    We introduce a probabilistic (Bayesian) framework and associated software toolbox for mapping population receptive fields (pRFs) based on fMRI data. This generic approach is intended to work with stimuli of any dimension and is demonstrated and validated in the context of 2D retinotopic mapping. The framework enables the experimenter to specify generative (encoding) models of fMRI timeseries, in which experimental stimuli enter a pRF model of neural activity, which in turn drives a nonlinear model of neurovascular coupling and Blood Oxygenation Level Dependent (BOLD) response. The neuronal and haemodynamic parameters are estimated together on a voxel-by-voxel or region-of-interest basis using a Bayesian estimation algorithm (variational Laplace). This offers several novel contributions to receptive field modelling. The variances and covariances of the parameters are estimated, enabling receptive fields to be plotted while properly representing uncertainty about pRF size and location. Variability in the haemodynamic response across the brain is accounted for. Furthermore, the framework introduces formal hypothesis testing to pRF analysis, enabling competing models to be evaluated based on their log model evidence (approximated by the variational free energy), which represents the optimal tradeoff between accuracy and complexity. Using simulations and empirical data, we found that parameters typically used to represent pRF size and neuronal scaling are strongly correlated, which is taken into account by the Bayesian methods we describe when making inferences. We used the framework to compare the evidence for six variants of pRF model using 7 T functional MRI data and we found a circular Difference of Gaussians (DoG) model to be the best explanation for our data overall. We hope this framework will prove useful for mapping stimulus spaces with any number of dimensions onto the anatomy of the brain. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Phylogenetic classification of Aureobasidium pullulans strains for production of pullulan and xylanase

    USDA-ARS?s Scientific Manuscript database

    This study tests the hypothesis that phylogenetic classification can predict whether A. pullulans strains will produce useful levels of the commercial polysaccharide, pullulan, or the valuable enzyme, xylanase. To test this hypothesis, 19 strains of A. pullulans with previously described phenotypes...

  11. The potential for increased power from combining P-values testing the same hypothesis.

    PubMed

    Ganju, Jitendra; Julie Ma, Guoguang

    2017-02-01

    The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
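
    [Editor's note] For reference, Fisher's combination is available directly in SciPy; the minimum p-value, by contrast, must be calibrated (for example by permutation, as in the randomization-based approach the authors describe). The p-values below are hypothetical.

        from scipy import stats

        p_values = [0.08, 0.021, 0.14]   # hypothetical p-values from prespecified tests

        fisher_stat, fisher_p = stats.combine_pvalues(p_values, method="fisher")
        min_p = min(p_values)   # needs permutation calibration; not itself a p-value
        print(fisher_stat, fisher_p, min_p)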

  12. Music Training Increases Phonological Awareness and Reading Skills in Developmental Dyslexia: A Randomized Control Trial

    PubMed Central

    Flaugnacco, Elena; Lopez, Luisa; Terribili, Chiara; Montico, Marcella; Zoia, Stefania; Schön, Daniele

    2015-01-01

    There is some evidence for a role of music training in boosting phonological awareness, word segmentation, working memory, as well as reading abilities in children with typical development. Poor performance in tasks requiring temporal processing, rhythm perception and sensorimotor synchronization seems to be a crucial factor underlying dyslexia in children. Interestingly, children with dyslexia show deficits in temporal processing, both in language and in music. Within this framework, we test the hypothesis that music training, by improving temporal processing and rhythm abilities, improves phonological awareness and reading skills in children with dyslexia. The study is a prospective, multicenter, open randomized controlled trial, consisting of test, rehabilitation and re-test (ID NCT02316873). After rehabilitation, the music group (N = 24) performed better than the control group (N = 22) in tasks assessing rhythmic abilities, phonological awareness and reading skills. This is the first randomized control trial testing the effect of music training in enhancing phonological and reading abilities in children with dyslexia. The findings show that music training can modify reading and phonological abilities even when these skills are severely impaired. Through the enhancement of temporal processing and rhythmic skills, music might become an important tool in both remediation and early intervention programs. Trial Registration ClinicalTrials.gov NCT02316873 PMID:26407242

  13. Music Training Increases Phonological Awareness and Reading Skills in Developmental Dyslexia: A Randomized Control Trial.

    PubMed

    Flaugnacco, Elena; Lopez, Luisa; Terribili, Chiara; Montico, Marcella; Zoia, Stefania; Schön, Daniele

    2015-01-01

    There is some evidence for a role of music training in boosting phonological awareness, word segmentation, working memory, as well as reading abilities in children with typical development. Poor performance in tasks requiring temporal processing, rhythm perception and sensorimotor synchronization seems to be a crucial factor underlying dyslexia in children. Interestingly, children with dyslexia show deficits in temporal processing, both in language and in music. Within this framework, we test the hypothesis that music training, by improving temporal processing and rhythm abilities, improves phonological awareness and reading skills in children with dyslexia. The study is a prospective, multicenter, open randomized controlled trial, consisting of test, rehabilitation and re-test (ID NCT02316873). After rehabilitation, the music group (N = 24) performed better than the control group (N = 22) in tasks assessing rhythmic abilities, phonological awareness and reading skills. This is the first randomized control trial testing the effect of music training in enhancing phonological and reading abilities in children with dyslexia. The findings show that music training can modify reading and phonological abilities even when these skills are severely impaired. Through the enhancement of temporal processing and rhythmic skills, music might become an important tool in both remediation and early intervention programs. Trial Registration: ClinicalTrials.gov NCT02316873

  14. A test of the orthographic recoding hypothesis

    NASA Astrophysics Data System (ADS)

    Gaygen, Daniel E.

    2003-04-01

    The Orthographic Recoding Hypothesis [D. E. Gaygen and P. A. Luce, Percept. Psychophys. 60, 465-483 (1998)] was tested. According to this hypothesis, listeners recognize spoken words heard for the first time by mapping them onto stored representations of the orthographic forms of the words. Listeners have a stable orthographic representation of words, but no phonological representation, when those words have been read frequently but never heard or spoken. Such may be the case for low frequency words such as jargon. Three experiments using visually and auditorily presented nonword stimuli tested this hypothesis. The first two experiments were explicit tests of memory (old-new tests) for words presented visually. In the first experiment, the recognition of auditorily presented nonwords was facilitated when they previously appeared on a visually presented list. The second experiment was similar, but included a concurrent articulation task during a visual word list presentation, thus preventing covert rehearsal of the nonwords. The results were similar to the first experiment. The third experiment was an indirect test of memory (auditory lexical decision task) for visually presented nonwords. Auditorily presented nonwords were identified as nonwords significantly more slowly if they had previously appeared on the visually presented list accompanied by a concurrent articulation task.

  15. The picture superiority effect in conceptual implicit memory: a conceptual distinctiveness hypothesis.

    PubMed

    Hamilton, Maryellen; Geraci, Lisa

    2006-01-01

    According to leading theories, the picture superiority effect is driven by conceptual processing, yet this effect has been difficult to obtain using conceptual implicit memory tests. We hypothesized that the picture superiority effect results from conceptual processing of a picture's distinctive features rather than a picture's semantic features. To test this hypothesis, we used 2 conceptual implicit general knowledge tests; one cued conceptually distinctive features (e.g., "What animal has large eyes?") and the other cued semantic features (e.g., "What animal is the figurehead of Tootsie Roll?"). Results showed a picture superiority effect only on the conceptual test using distinctive cues, supporting our hypothesis that this effect is mediated by conceptual processing of a picture's distinctive features.

  16. Hypothesis testing for band size detection of high-dimensional banded precision matrices.

    PubMed

    An, Baiguo; Guo, Jianhua; Liu, Yufeng

    2014-06-01

    Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.
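
    [Editor's note] A sketch of the Cholesky-based construction that the band-size question applies to: each variable is regressed on at most its k immediate predecessors. This is a standard modified-Cholesky estimator under that assumption; the paper's actual test for choosing k is more involved.

        import numpy as np

        def banded_cholesky_precision(X, k):
            # Modified Cholesky estimate of a k-banded precision matrix.
            n, p = X.shape
            Xc = X - X.mean(axis=0)
            A = np.zeros((p, p))   # regression coefficients (strictly lower triangular)
            d = np.zeros(p)        # innovation variances
            d[0] = Xc[:, 0].var()
            for j in range(1, p):
                lo = max(0, j - k)
                Z = Xc[:, lo:j]
                coef = np.linalg.lstsq(Z, Xc[:, j], rcond=None)[0]
                A[j, lo:j] = coef
                d[j] = (Xc[:, j] - Z @ coef).var()
            L = np.eye(p) - A
            return L.T @ np.diag(1.0 / d) @ L   # banded with bandwidth k

    Choosing k by crossvalidation would refit this estimator many times, which is what motivates the authors' testing-based alternative.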

  17. Why do mothers favor girls and fathers, boys? A hypothesis and a test of investment disparity.

    PubMed

    Godoy, Ricardo; Reyes-García, Victoria; McDade, Thomas; Tanner, Susan; Leonard, William R; Huanca, Tomás; Vadez, Vincent; Patel, Karishma

    2006-06-01

    Growing evidence suggests mothers invest more in girls than boys and fathers more in boys than girls. We develop a hypothesis that predicts preference for girls by the parent facing more resource constraints and preference for boys by the parent facing less constraint. We test the hypothesis with panel data from the Tsimane', a foraging-farming society in the Bolivian Amazon. Tsimane' mothers face more resource constraints than fathers. As predicted, mothers' wealth protected girls' BMI, but fathers' wealth had weak effects on boys' BMI. Numerous tests yielded robust results, including those that controlled for fixed effects of child and household.

  18. Addendum to the article: Misuse of null hypothesis significance testing: Would estimation of positive and negative predictive values improve certainty of chemical risk assessment?

    PubMed

    Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf

    2015-03-01

    We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: They inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results or more extreme results if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.
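
    [Editor's note] The relationship the authors describe can be written down directly. Given a significance level, power, and the a priori odds R that a tested effect is real, the helper below is a straightforward application of Bayes' rule, not code from the paper:

        def ppv_npv(alpha, power, R):
            # R: pre-study odds that the tested effect is real (true:null ratio).
            prior = R / (1.0 + R)
            ppv = power * prior / (power * prior + alpha * (1.0 - prior))
            npv = ((1.0 - alpha) * (1.0 - prior)
                   / ((1.0 - alpha) * (1.0 - prior) + (1.0 - power) * prior))
            return ppv, npv

        # E.g., one real effect per four nulls tested:
        print(ppv_npv(alpha=0.05, power=0.8, R=0.25))

    The challenge the addendum discusses is precisely that R must be estimated before these quantities are informative.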

  19. Functional imaging of brain responses to different outcomes of hypothesis testing: revealed in a category induction task.

    PubMed

    Li, Fuhong; Cao, Bihua; Luo, Yuejia; Lei, Yi; Li, Hong

    2013-02-01

    Functional magnetic resonance imaging (fMRI) was used to examine differences in brain activation that occur when a person receives the different outcomes of hypothesis testing (HT). Participants were provided with a series of images of batteries and were asked to learn a rule governing what kinds of batteries were charged. Within each trial, the first two charged batteries were sequentially displayed, and participants would generate a preliminary hypothesis based on the perceptual comparison. Next, a third battery that served to strengthen, reject, or was irrelevant to the preliminary hypothesis was displayed. The fMRI results revealed that (1) no significant differences in brain activation were found between the 2 hypothesis-maintain conditions (i.e., strengthen and irrelevant conditions); and (2) compared with the hypothesis-maintain conditions, the hypothesis-reject condition activated the left medial frontal cortex, bilateral putamen, left parietal cortex, and right cerebellum. These findings are discussed in terms of the neural correlates of the subcomponents of HT and working memory manipulation. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Animal Models for Testing the DOHaD Hypothesis

    EPA Science Inventory

    Since the seminal work in human populations by David Barker and colleagues, several species of animals have been used in the laboratory to test the Developmental Origins of Health and Disease (DOHaD) hypothesis. Rats, mice, guinea pigs, sheep, pigs and non-human primates have bee...

  1. A "Projective" Test of the Golden Section Hypothesis.

    ERIC Educational Resources Information Center

    Lee, Chris; Adams-Webber, Jack

    1987-01-01

    In a projective test of the golden section hypothesis, 24 high school students rated themselves and 10 comic strip characters on the basis of 12 bipolar constructs. Overall proportion of cartoon figures which subjects assigned to positive poles of constructs was very close to the golden section. (Author/NB)

  2. Metacognitive evaluation in the avoidance of demand.

    PubMed

    Dunn, Timothy L; Lutes, David J C; Risko, Evan F

    2016-09-01

    In the current set of experiments our goal was to test the hypothesis that individuals avoid courses of action based on a kind of metacognitive evaluation of demand in a Demand Selection Task (DST). Individuals in Experiment 1 completed a DST utilizing visual stimuli known to yield a dissociation between performance and perceived demand. Patterns of demand avoidance followed that of perceived demand. Experiment 2 provided a replication of the aforementioned results, in addition to demonstrating a second dissociation between a peripheral physiological measure of demand (i.e., blink rates) and demand avoidance. Experiment 3 directly tested the assumption that individuals make use of a general metacognitive evaluation of task demand during selections. A DST was utilized in a forced-choice paradigm that required individuals to either select the most effortful, time demanding, or least accurate of 2 choices. Patterns of selections were similar across all rating dimensions, lending credit to this notion. Findings are discussed within a metacognitive framework of demand avoidance and contrasted to current theories. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  3. Are stock prices too volatile to be justified by the dividend discount model?

    NASA Astrophysics Data System (ADS)

    Akdeniz, Levent; Salih, Aslıhan Altay; Ok, Süleyman Tuluğ

    2007-03-01

    This study investigates excess stock price volatility using the variance bound framework of LeRoy and Porter [The present-value relation: tests based on implied variance bounds, Econometrica 49 (1981) 555-574] and of Shiller [Do stock prices move too much to be justified by subsequent changes in dividends? Am. Econ. Rev. 71 (1981) 421-436]. The conditional variance bound relationship is examined using cross-sectional data simulated from the general equilibrium asset pricing model of Brock [Asset prices in a production economy, in: J.J. McCall (Ed.), The Economics of Information and Uncertainty, University of Chicago Press, Chicago (for N.B.E.R.), 1982]. Results show that the conditional variance bounds hold; hence, our hypothesis of the validity of the dividend discount model cannot be rejected. Moreover, in our setting, markets are efficient and stock prices are neither affected by herd psychology nor by the outcome of noise trading by naive investors; thus, we are able to control for market efficiency. Consequently, we show that one cannot infer any conclusions about market efficiency from the unconditional variance bounds tests.
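
    [Editor's note] A toy simulation of the variance-bound logic with AR(1) dividends (parameters chosen for illustration, not taken from the paper): the rational price is the conditional expectation of the ex-post rational price p*, so its variance cannot exceed var(p*).

        import numpy as np

        rng = np.random.default_rng(0)
        T, r, phi = 5000, 0.05, 0.9
        delta = 1.0 / (1.0 + r)

        d = np.zeros(T)                    # AR(1) dividend stream
        for t in range(1, T):
            d[t] = phi * d[t - 1] + rng.standard_normal()

        # Rational price: discounted expected future dividends given d_t.
        p = d * (phi * delta) / (1.0 - phi * delta)

        # Ex-post rational price: discounted realized future dividends (finite horizon).
        horizon = 500
        weights = delta ** np.arange(1, horizon + 1)
        p_star = np.array([weights @ d[t + 1:t + 1 + horizon]
                           for t in range(T - horizon)])

        print(p[:T - horizon].var(), "<=", p_star.var())   # bound holds in expectation

    Excess-volatility tests ask whether observed prices violate this inequality.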

  4. Pasture succession in the Neotropics: extending the nucleation hypothesis into a matrix discontinuity hypothesis.

    PubMed

    Peterson, Chris J; Dosch, Jerald J; Carson, Walter P

    2014-08-01

    The nucleation hypothesis appears to explain widespread patterns of succession in tropical pastures, specifically the tendency for isolated trees to promote woody species recruitment. Still, the nucleation hypothesis has usually been tested explicitly for only short durations and in some cases isolated trees fail to promote woody recruitment. Moreover, at times, nucleation occurs in other key habitat patches. Thus, we propose an extension, the matrix discontinuity hypothesis: woody colonization will occur in focal patches that function to mitigate the herbaceous vegetation effects, thus providing safe sites or regeneration niches. We tested predictions of the classical nucleation hypothesis, the matrix discontinuity hypothesis, and a distance from forest edge hypothesis, in five abandoned pastures in Costa Rica, across the first 11 years of succession. Our findings confirmed the matrix discontinuity hypothesis: specifically, rotting logs and steep slopes significantly enhanced woody colonization. Surprisingly, isolated trees did not consistently significantly enhance recruitment; only larger trees did so. Finally, woody recruitment consistently decreased with distance from forest. Our results as well as results from others suggest that the nucleation hypothesis needs to be broadened beyond its historical focus on isolated trees or patches; the matrix discontinuity hypothesis focuses attention on a suite of key patch types or microsites that promote woody species recruitment. We argue that any habitat discontinuities that ameliorate the inhibition by dense graminoid layers will be foci for recruitment. Such patches could easily be manipulated to speed the transition of pastures to closed canopy forests.

  5. Humans have evolved specialized skills of social cognition: the cultural intelligence hypothesis.

    PubMed

    Herrmann, Esther; Call, Josep; Hernández-Lloreda, María Victoria; Hare, Brian; Tomasello, Michael

    2007-09-07

    Humans have many cognitive skills not possessed by their nearest primate relatives. The cultural intelligence hypothesis argues that this is mainly due to a species-specific set of social-cognitive skills, emerging early in ontogeny, for participating and exchanging knowledge in cultural groups. We tested this hypothesis by giving a comprehensive battery of cognitive tests to large numbers of two of humans' closest primate relatives, chimpanzees and orangutans, as well as to 2.5-year-old human children before literacy and schooling. Supporting the cultural intelligence hypothesis and contradicting the hypothesis that humans simply have more "general intelligence," we found that the children and chimpanzees had very similar cognitive skills for dealing with the physical world but that the children had more sophisticated cognitive skills than either of the ape species for dealing with the social world.

  6. Hypothesis-driven research for G × E interactions: the relationship between oxytocin, parental divorce during adolescence, and depression in young adulthood.

    PubMed

    Windle, Michael; Mrug, Sylvie

    2015-01-01

    Research in molecular genetics has generally focused on genome-wide association studies (GWAS) and exploratory candidate gene and candidate gene-environment (G × E) studies. In this article it is proposed that hypothesis-driven and biologically informed research provides a complementary approach to GWAS to advance pressing research questions about G × E relations that are of public health relevance. Prior research studies and developmental and evolutionary theory were used to guide hypothesis testing of G × E relationships in this study. The study investigated whether the oxytocin polymorphism, rs53576, moderated the relationship between parental divorce during adolescence and depression symptoms in young adulthood. Oxytocin is a neuropeptide that has been related to the regulation of complex social cognition and behaviors such as empathy, attachment, and nurturance. We hypothesized that the GG polymorphism would be associated with more depressive symptoms following parental divorce, and that this effect would be stronger in females than males. The sample consisted of 340 individuals who participated in a longitudinal study with data used both from adolescence and young adulthood. Findings using prospective follow-up and autoregressive change models supported the hypothesized relationships. Young adult females who had experienced parental divorce during adolescence and had the GG oxytocin genotype reported almost twice as many depressive symptoms relative to young adult females who also experienced parental divorce during adolescence but had the AA or AG genotype. This pattern was not indicated among males. Findings were discussed with regard to how molecular genetic factors in combination with environmental stressors, such as parental divorce, framed within a developmental framework may facilitate the future study of G × E relationships in the parental divorce-child adjustment literature and contribute to a prevention science perspective.

  7. Hypothesis-driven research for G × E interactions: the relationship between oxytocin, parental divorce during adolescence, and depression in young adulthood

    PubMed Central

    Windle, Michael; Mrug, Sylvie

    2015-01-01

    Research in molecular genetics has generally focused on genome-wide association studies (GWAS) and exploratory candidate gene and candidate gene–environment (G × E) studies. In this article it is proposed that hypothesis-driven and biologically informed research provides a complementary approach to GWAS to advance pressing research questions about G × E relations that are of public health relevance. Prior research studies and developmental and evolutionary theory were used to guide hypothesis testing of G × E relationships in this study. The study investigated whether the oxytocin polymorphism, rs53576, moderated the relationship between parental divorce during adolescence and depression symptoms in young adulthood. Oxytocin is a neuropeptide that has been related to the regulation of complex social cognition and behaviors such as empathy, attachment, and nurturance. We hypothesized that the GG polymorphism would be associated with more depressive symptoms following parental divorce, and that this effect would be stronger in females than males. The sample consisted of 340 individuals who participated in a longitudinal study with data used both from adolescence and young adulthood. Findings using prospective follow-up and autoregressive change models supported the hypothesized relationships. Young adult females who had experienced parental divorce during adolescence and had the GG oxytocin genotype reported almost twice as many depressive symptoms relative to young adult females who also experienced parental divorce during adolescence but had the AA or AG genotype. This pattern was not indicated among males. Findings were discussed with regard to how molecular genetic factors in combination with environmental stressors, such as parental divorce, framed within a developmental framework may facilitate the future study of G × E relationships in the parental divorce-child adjustment literature and contribute to a prevention science perspective. PMID:26441708

  8. The Nutritional Balancing Act of a Large Herbivore: An Experiment with Captive Moose (Alces alces L)

    PubMed Central

    Felton, Annika M.; Felton, Adam; Raubenheimer, David; Simpson, Stephen J.; Krizsan, Sophie J.; Hedwall, Per-Ola; Stolter, Caroline

    2016-01-01

    The nutrient balancing hypothesis proposes that, when sufficient food is available, the primary goal of animal diet selection is to obtain a nutritionally balanced diet. This hypothesis can be tested using the Geometric Framework for nutrition (GF). The GF enables researchers to study patterns of nutrient intake (e.g. macronutrients; protein, carbohydrates, fat), interactions between the different nutrients, and how an animal resolves the potential conflict between over-eating one or more nutrients and under-eating others during periods of dietary imbalance. Using the moose (Alces alces L.), a model species in the development of herbivore foraging theory, we conducted a feeding experiment guided by the GF, combining continuous observations of six captive moose with analysis of the macronutritional composition of foods. We identified the moose’s self-selected macronutrient target by allowing them to compose a diet by mixing two nutritionally complementary pellet types plus limited access to Salix browse. Such periods of free choice were intermixed with periods when they were restricted to one of the two pellet types plus Salix browse. Our observations of food intake by moose given free choice lend support to the nutrient balancing hypothesis, as the moose combined the foods in specific proportions that provided a particular ratio and amount of macronutrients. When restricted to either of two diets comprising a single pellet type, the moose i) maintained a relatively stable intake of non-protein energy while allowing protein intakes to vary with food composition, and ii) increased their intake of the food item that most closely resembled the self-selected macronutrient intake from the free choice periods, namely Salix browse. We place our results in the context of the nutritional strategy of the moose, ruminant physiology and the categorization of food quality. PMID:26986618

  9. A Statistical Analysis of the Relationship between Harmonic Surprise and Preference in Popular Music.

    PubMed

    Miles, Scott A; Rosen, David S; Grzywacz, Norberto M

    2017-01-01

    Studies have shown that some musical pieces may preferentially activate reward centers in the brain. Less is known, however, about the structural aspects of music that are associated with this activation. Based on the music cognition literature, we propose two hypotheses for why some musical pieces are preferred over others. The first, the Absolute-Surprise Hypothesis, states that unexpected events in music directly lead to pleasure. The second, the Contrastive-Surprise Hypothesis, proposes that the juxtaposition of unexpected events and subsequent expected events leads to an overall rewarding response. We tested these hypotheses within the framework of information theory, using the measure of "surprise." This information-theoretic variable mathematically describes how improbable an event is given a known distribution. We performed a statistical investigation of surprise in the harmonic structure of songs within a representative corpus of Western popular music, namely, the McGill Billboard Project corpus. We found that chords of songs in the top quartile of the Billboard chart showed greater average surprise than those in the bottom quartile. We also found that the different sections within top-quartile songs varied more in their average surprise than the sections within bottom-quartile songs. The results of this study are consistent with both the Absolute- and Contrastive-Surprise Hypotheses. Although these hypotheses seem contradictory to one another, we cannot yet discard the possibility that both absolute and contrastive types of surprise play roles in the enjoyment of popular music. We call this possibility the Hybrid-Surprise Hypothesis. The results of this statistical investigation have implications for both music cognition and the human neural mechanisms of esthetic judgments.
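
    [Editor's note] A minimal illustration of the surprise measure used here: the surprise of an event is -log2 of its probability under a reference distribution. The corpus counts and progressions below are made up, not the Billboard data.

        import numpy as np
        from collections import Counter

        def mean_surprise(chords, corpus_counts):
            # A section's score is the average surprise of its chords
            # under a unigram distribution estimated from the corpus.
            total = sum(corpus_counts.values())
            return np.mean([-np.log2(corpus_counts[c] / total) for c in chords])

        corpus = Counter({"I": 400, "IV": 250, "V": 250, "vi": 80, "bVII": 20})
        print(mean_surprise(["I", "IV", "V", "I"], corpus))         # familiar progression
        print(mean_surprise(["I", "bVII", "vi", "bVII"], corpus))   # rarer chords, higher surprise

    Comparing average surprise across songs (Absolute-Surprise) versus across sections within a song (Contrastive-Surprise) mirrors the two hypotheses tested in the paper.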

  11. A Search for Factors Causing Training Costs to Rise by Examining the U. S. Navy’s AT, AW, and AX Ratings during their First Enlistment Period

    DTIC Science & Technology

    1986-09-01

    Table-of-contents excerpt (extraction residue from the report): III. Time to Get Rated, two-factor ANOVA results; IV-V. Time to Get Rated, Tukey's paired comparison test results A and B; VI. Single-factor ANOVA hypothesis test #1; VII. AT: Time to Get Rated ANOVA test results.

  12. Sensor Compromise Detection in Multiple-Target Tracking Systems

    PubMed Central

    Doucette, Emily A.; Curtis, Jess W.

    2018-01-01

    Tracking multiple targets using a single estimator is a problem that is commonly approached within a trusted framework. There are many weaknesses that an adversary can exploit if it gains control over the sensors. Because the number of targets that the estimator has to track is not known in advance, an adversary could cause a loss of information or a degradation in the tracking precision. Other concerns include the introduction of false targets, which would result in a waste of computational and material resources, depending on the application. In this work, we study the problem of detecting compromised or faulty sensors in a multiple-target tracker, starting with the single-sensor case and then considering the multiple-sensor scenario. We propose an algorithm to detect a variety of attacks in the multiple-sensor case, via the application of finite set statistics (FISST), one-class classifiers and hypothesis testing using nonparametric techniques. PMID:29466314
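
    The paper's full pipeline (FISST-based tracking plus one-class classifiers) is beyond a short sketch, but the nonparametric hypothesis-testing ingredient can be illustrated with a two-sample Kolmogorov-Smirnov test comparing a suspect sensor's residuals against a trusted baseline; the data and threshold below are hypothetical:

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    baseline = rng.normal(0.0, 1.0, 500)   # residuals from a trusted sensor
    suspect = rng.normal(0.6, 1.0, 500)    # biased residuals, e.g. a spoofed sensor

    # KS test makes no distributional assumption about the residuals.
    stat, p = ks_2samp(baseline, suspect)
    alpha = 0.01
    print(f"KS statistic={stat:.3f}, p={p:.3g}",
          "-> flag sensor" if p < alpha else "-> consistent with baseline")
    ```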

  13. Spatial association between dissection density and environmental factors over the entire conterminous United States

    NASA Astrophysics Data System (ADS)

    Luo, Wei; Jasiewicz, Jaroslaw; Stepinski, Tomasz; Wang, Jinfeng; Xu, Chengdong; Cang, Xuezhi

    2016-01-01

    Previous studies of land dissection density (D) often find contradictory results regarding factors controlling its spatial variation. We hypothesize that the dominant controlling factors (and the interactions between them) vary from region to region due to differences in each region's local characteristics and geologic history. We test this hypothesis by applying a geographical detector method to eight physiographic divisions of the conterminous United States and identify the dominant factor(s) in each. The geographical detector method computes the power of determinant (q) that quantitatively measures the affinity between the factor considered and D. Results show that the factor (or factor combination) with the largest q value is different for physiographic regions with different characteristics and geologic histories. For example, lithology dominates in mountainous regions, curvature dominates in plains, and glaciation dominates in previously glaciated areas. The geographical detector method offers an objective framework for revealing factors controlling Earth surface processes.
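
    The power of determinant has the standard geographical-detector form q = 1 - (sum over strata h of N_h * var_h) / (N * var), i.e. one minus the ratio of pooled within-stratum variance to the overall variance of D. A minimal sketch with hypothetical dissection-density values stratified by a lithology factor:

    ```python
    import numpy as np

    def geographical_detector_q(values, strata):
        """Power of determinant: q = 1 - sum_h(N_h * var_h) / (N * var).
        q near 1 means the factor's strata explain most of the variance in D."""
        values = np.asarray(values, dtype=float)
        strata = np.asarray(strata)
        overall = len(values) * values.var()
        within = sum(len(values[strata == h]) * values[strata == h].var()
                     for h in np.unique(strata))
        return 1.0 - within / overall

    # Hypothetical dissection-density samples stratified by lithology.
    d = [0.2, 0.3, 0.25, 0.8, 0.9, 0.85]
    lith = ["shale", "shale", "shale", "granite", "granite", "granite"]
    print(f"q = {geographical_detector_q(d, lith):.3f}")
    ```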

  14. The seemingly quixotic pursuit of a cumulative psychological science: introduction to the special issue.

    PubMed

    Curran, Patrick J

    2009-06-01

    The goal of any empirical science is to pursue the construction of a cumulative base of knowledge upon which the future of the science may be built. However, there is mixed evidence that the science of psychology can accurately be characterized by such a cumulative progression. Indeed, some argue that the development of a truly cumulative psychological science is not possible with the current paradigms of hypothesis testing in single-study designs. The author explores this controversy as a framework to introduce the 6 articles that make up this special issue on the integration of data and empirical findings across multiple studies. The author proposes that the methods and techniques described in this set of articles can significantly propel researchers forward in their ongoing quest to build a cumulative psychological science.

  15. Learning from Animal Models of Obsessive-Compulsive Disorder

    PubMed Central

    Monteiro, Patricia; Feng, Guoping

    2015-01-01

    Obsessive-Compulsive Disorder (OCD) affects 2–3% of the worldwide population and can cause significant distress and disability to its sufferers. Substantial challenges remain in the field of OCD research and therapeutics. Approved interventions only partially alleviate symptoms, with 30–40% of patients being resistant to treatment. Research evidence points towards the involvement of cortico-striato-thalamo-cortical circuitry (CSTC), although OCD’s etiology is still unknown. This review focuses on the most recent behavioral, genetic and neurophysiological findings from animal models of OCD. Based on evidence from these models and parallels with human studies, we discuss the circuit hyperactivity hypothesis for OCD, a potential circuitry dysfunction of action termination, and the involvement of candidate genes. Adding a more biologically valid framework to OCD will help us define and test new hypotheses and facilitate the development of targeted therapies based on disease-specific mechanisms. PMID:26037910

  16. Spike-Timing of Orbitofrontal Neurons Is Synchronized With Breathing.

    PubMed

    Kőszeghy, Áron; Lasztóczi, Bálint; Forro, Thomas; Klausberger, Thomas

    2018-01-01

    The orbitofrontal cortex (OFC) has been implicated in a multiplicity of complex brain functions, including representations of expected outcome properties, post-decision confidence, momentary food-reward values, and complex flavors and odors. As breathing rhythm has an influence on odor processing in primary olfactory areas, we tested the hypothesis that it may also influence neuronal activity in the OFC, a prefrontal area also involved in higher-order processing of odors. We recorded spike timing of orbitofrontal neurons as well as local field potentials (LFPs) in awake, head-fixed mice, together with the breathing rhythm. We observed that a large majority of orbitofrontal neurons showed robust phase-coupling to breathing during immobility and running. The phase coupling of action potentials to breathing was significantly stronger in orbitofrontal neurons compared to cells in the medial prefrontal cortex. The characteristic synchronization of orbitofrontal neurons with breathing might provide a temporal framework for multi-variable processing of olfactory, gustatory and reward-value relationships.
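
    One common way to quantify such spike-breathing phase coupling (not necessarily the authors' exact pipeline) is to read off each spike's instantaneous breathing phase from the Hilbert transform of the respiration trace and test the phase distribution for non-uniformity with a Rayleigh test. Everything below, including the 3 Hz rhythm and the synthetic spike times, is hypothetical:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    rng = np.random.default_rng(1)

    # Hypothetical respiration trace (~3 Hz, mouse-like) and spike times.
    fs, dur = 1000.0, 30.0                         # Hz, seconds
    t = np.arange(0, dur, 1 / fs)
    breath = np.sin(2 * np.pi * 3.0 * t) + 0.1 * rng.standard_normal(t.size)
    phase = np.angle(hilbert(breath))              # instantaneous breathing phase

    spikes = rng.uniform(0, dur, 200)              # stand-in for recorded spike times
    spike_phases = phase[(spikes * fs).astype(int)]

    # Rayleigh test: mean resultant length R measures phase concentration;
    # p = exp(-n R^2) is the large-n approximation for testing uniformity.
    n = spike_phases.size
    R = np.abs(np.mean(np.exp(1j * spike_phases)))
    p = np.exp(-n * R**2)
    print(f"R = {R:.3f}, Rayleigh p ~ {p:.3g}")
    ```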

  17. Subjective Social Status, Mental and Psychosocial Health, and Birth Weight Differences in Mexican-American and Mexican Immigrant Women.

    PubMed

    Fleuriet, K Jill; Sunil, T S

    2015-12-01

    Recent Mexican immigrant women on average have an unexpectedly low incidence of low birth weight (LBW). Birth weights decline and LBW incidence increases in post-immigrant generations. This pilot project tested the hypothesis that the subjective social status (SSS) of pregnant women predicts variation in birth weight between Mexican immigrant and Mexican-American women. 300 low-income pregnant Mexican immigrant and Mexican-American women in South Texas were surveyed for SSS, depression, pregnancy-related anxiety, perceived social stress and self-esteem, and subsequent birth weight was recorded. No significant difference in SSS levels between pregnant Mexican immigrant and Mexican-American women was found. However, SSS better predicted variation in birth weight across both groups than the mental and psychosocial health variables. Results suggest distinct relationships among SSS, mental and psychosocial health that could impact birth weight. They underscore the relevance of a multilevel, biopsychosocial analytical framework for studying LBW.

  18. A new variant of a scaling hypothesis and a fundamental equation of state based on it

    NASA Astrophysics Data System (ADS)

    Kudryavtseva, I. V.; Rykov, V. A.; Rykov, S. V.; Ustyuzhanin, E. E.

    2018-01-01

    This paper deals with a fundamental equation of state (FEOS) for substances. We suggest a new method for constructing an FEOS that is based on the scaling theory of critical phenomena and describes the thermodynamic properties of the liquid and gas phases of a substance over a wide range of pressures and temperatures. Within this methodological approach, the FEOS (i) reduces to a virial equation of state in the low-density region and (ii) reduces to a Widom equation of state in the critical region. The method has been tested on the FEOS of R218. The FEOS is applicable for densities 0 < ρ/ρc < 3.2 and temperatures 133 K < T < 440 K. We compare the FEOS with several other equations of state and discuss the results.

  19. Neural syntax: cell assemblies, synapsembles and readers

    PubMed Central

    Buzsáki, György

    2010-01-01

    Summary A widely discussed hypothesis in neuroscience is that transiently active ensembles of neurons, known as ‘cell assemblies’, underlie numerous operations of the brain, from encoding memories to reasoning. However, the mechanisms responsible for the formation and disbanding of cell assemblies and temporal evolution of cell assembly sequences are not well understood. I introduce and review three interconnected topics, which could facilitate progress in defining cell assemblies, identifying their neuronal organization and revealing causal relationships between assembly organization and behavior. First, I hypothesize that cell assemblies are best understood in light of their output product, as detected by ‘reader-actuator’ mechanisms. Second, I suggest that the hierarchical organization of cell assemblies may be regarded as a neural syntax. Third, constituents of the neural syntax are linked together by dynamically changing constellations of synaptic weights (‘synapsembles’). Existing support for this tripartite framework is reviewed and strategies for experimental testing of its predictions are discussed. PMID:21040841

  20. Genetic Factors That Increase Male Facial Masculinity Decrease Facial Attractiveness of Female Relatives

    PubMed Central

    Lee, Anthony J.; Mitchem, Dorian G.; Wright, Margaret J.; Martin, Nicholas G.; Keller, Matthew C.; Zietsch, Brendan P.

    2014-01-01

    For women, choosing a facially masculine man as a mate is thought to confer genetic benefits to offspring. Crucial assumptions of this hypothesis have not been adequately tested. It has been assumed that variation in facial masculinity is due to genetic variation and that genetic factors that increase male facial masculinity do not increase facial masculinity in female relatives. We objectively quantified the facial masculinity in photos of identical (n = 411) and nonidentical (n = 782) twins and their siblings (n = 106). Using biometrical modeling, we found that much of the variation in male and female facial masculinity is genetic. However, we also found that masculinity of male faces is unrelated to their attractiveness and that facially masculine men tend to have facially masculine, less-attractive sisters. These findings challenge the idea that facially masculine men provide net genetic benefits to offspring and call into question this popular theoretical framework. PMID:24379153

  2. Cumulative early life adversity predicts longevity in wild baboons

    PubMed Central

    Tung, Jenny; Archie, Elizabeth A.; Altmann, Jeanne; Alberts, Susan C.

    2016-01-01

    In humans and other animals, harsh circumstances in early life predict morbidity and mortality in adulthood. Multiple adverse conditions are thought to be especially toxic, but this hypothesis has rarely been tested in a prospective, longitudinal framework, especially in long-lived mammals. Here we use prospective data on 196 wild female baboons to show that cumulative early adversity predicts natural adult lifespan. Females who experience ≥3 sources of early adversity die a median of 10 years earlier than females who experience ≤1 adverse circumstance (median lifespan is 18.5 years). Females who experience the most adversity are also socially isolated in adulthood, suggesting that social processes partially explain the link between early adversity and adult survival. Our results provide powerful evidence for the developmental origins of health and disease and indicate that close ties between early adversity and survival arise even in the absence of health habit- and health care-related explanations. PMID:27091302

  3. B-physics anomalies: a guide to combined explanations

    NASA Astrophysics Data System (ADS)

    Buttazzo, Dario; Greljo, Admir; Isidori, Gino; Marzocca, David

    2017-11-01

    Motivated by additional experimental hints of Lepton Flavour Universality violation in B decays, both in charged- and in neutral-current processes, we analyse the ingredients necessary to provide a combined description of these phenomena. By means of an Effective Field Theory (EFT) approach, based on the hypothesis of New Physics coupled predominantly to the third generation of left-handed quarks and leptons, we show how this is possible. We demonstrate, in particular, how to solve the problems posed by electroweak precision tests and direct searches with a rather natural choice of model parameters, within the context of a U(2)_q × U(2)_ℓ flavour symmetry. We further exemplify the general EFT findings by means of simplified models with explicit mediators in the TeV range: coloured scalar or vector leptoquarks and colourless vectors. Among these, the case of an SU(2)_L-singlet vector leptoquark emerges as a particularly simple and successful framework.

  4. Multiple Geographical Origins of Environmental Sex Determination enhanced the diversification of Darwin's Favourite Orchids.

    PubMed

    Pérez-Escobar, Oscar Alejandro; Chomicki, Guillaume; Condamine, Fabien L; de Vos, Jurriaan M; Martins, Aline C; Smidt, Eric C; Klitgård, Bente; Gerlach, Günter; Heinrichs, Jochen

    2017-10-10

    Environmental sex determination (ESD) - a change in sexual function during an individual life span driven by environmental cues - is an exceedingly rare sexual system among angiosperms. Because ESD can directly affect reproductive success, it could influence diversification rate as compared with lineages that have alternative reproductive systems. Here we test this hypothesis using a solid phylogenetic framework of Neotropical Catasetinae, the angiosperm lineage richest in taxa with ESD. We assess whether gains of ESD are associated with higher diversification rates compared to lineages with alternative systems, while considering additional traits known to positively affect diversification rates in orchids. We found that ESD has evolved asynchronously three times during the last ~5 Myr. Lineages with ESD have consistently higher diversification rates than related lineages with other sexual systems. Habitat fragmentation due to mega-wetland extinction and climate instability are suggested as the driving forces behind ESD evolution.

  5. Parental conflict resolution styles and children's adjustment: children's appraisals and emotion regulation as mediators.

    PubMed

    Siffert, Andrea; Schwarz, Beate

    2011-01-01

    Guided by the emotional security hypothesis and the cognitive-contextual framework, the authors investigated whether the associations between negative parental conflict resolution styles and children's internalizing and externalizing problems were mediated by children's appraisals of threat and self-blame and their emotion regulation. Participants were 192 Swiss 2-parent families with children aged 9-12 years (M age = 10.62 years, SD = 0.41 years). Structural equation modeling was used to test the empirical validity of the theoretical model. Results indicated that children's maladaptive emotion regulation mediated the association between negative parental conflict resolution styles and children's internalizing as well as externalizing problems. Whereas perceived threat was related only to children's internalizing problems, self-blame did not mediate the links between negative parental conflict resolution styles and children's adjustment. Implications for understanding the mechanisms by which exposure to interparental conflict could lead to children's maladjustment and limitations of the study are discussed.

  6. Geological Implications of a Physical Libration on Enceladus

    NASA Technical Reports Server (NTRS)

    Hurford, T. A.; Bills, B. G.; Helfenstein, P.; Greenberg, R.; Hoppa, G. V.; Hamilton, D. P.

    2008-01-01

    Given the non-spherical shape of Enceladus (Thomas et al., 2007), the satellite will experience gravitational torques that cause it to physically librate as it orbits Saturn. Physical libration would produce a diurnal oscillation in the longitude of Enceladus' tidal bulge, which could have a profound effect on the diurnal stresses experienced by the surface of the satellite. Although Cassini ISS has placed an observational upper limit of 1.5° on Enceladus' libration amplitude (Porco et al., 2006), smaller amplitudes can still have geologically significant consequences. Here we present the first detailed description of how physical libration affects tidal stresses and how those stresses then might affect geological processes including crack formation and propagation, south polar eruption activity, and tidal heating. Our goal is to provide a framework for testing the hypothesis that geologic features on Enceladus are produced by tidal stresses from diurnal physical and optical librations of the satellite.

  7. A multi-informant longitudinal study on the relationship between aggression, peer victimization, and dating status in adolescence.

    PubMed

    Arnocky, Steven; Vaillancourt, Tracy

    2012-05-25

    Adolescent peer-aggression has recently been considered from the evolutionary perspective of intrasexual competition for mates. We tested the hypothesis that peer-nominated physical aggression, indirect aggression, along with self-reported bullying behaviors at Time 1 would predict Time 2 dating status (one year later), and that Time 1 peer- and self-reported peer victimization would negatively predict Time 2 dating status. Participants were 310 adolescents who were in grades 6 through 9 (ages 11-14) at Time 1.  Results showed that for both boys and girls, peer-nominated indirect aggression was predictive of dating one year later even when controlling for age, peer-rated attractiveness, and peer-perceived popularity, as well as initial dating status. For both sexes, self-reported peer victimization was negatively related to having a dating partner at Time 2. Findings are discussed within the framework of intrasexual competition.

  8. The Uses of the Term Hypothesis and the Inquiry Emphasis Conflation in Science Teacher Education

    NASA Astrophysics Data System (ADS)

    Gyllenpalm, Jakob; Wickman, Per-Olof

    2011-09-01

    This paper examines the use and role of the term 'hypothesis' in science teacher education as described by teacher students. Data were collected through focus group interviews conducted on seven occasions with 32 students from six well-known Swedish universities. The theoretical framework is a sociocultural and pragmatist perspective on language and learning, introducing the notion of pivot terms to operationalise language use as a habit and mediated action. We describe three different customs of using the term 'hypothesis' within four cultural institutions that can be said to constitute science teacher education in Sweden. Students were found to habitually use the term hypothesis as meaning a guess about an outcome. This is contrasted with the function of this term in scientific research as a tentative explanation. We also found differences in how this term was used between the pure science courses given by the science departments of universities and the science education courses taken only by teacher students. Findings also included further support for school students' 'hypothesis fear' reported in an earlier study. It is discussed how these findings can obstruct learning and teaching about the nature of scientific inquiry. Constructivist theories of learning are suggested as a possible origin of these problems. The findings are also related to curricular reform and development.

  9. Sensory discrimination and intelligence: testing Spearman's other hypothesis.

    PubMed

    Deary, Ian J; Bell, P Joseph; Bell, Andrew J; Campbell, Mary L; Fazal, Nicola D

    2004-01-01

    At the centenary of Spearman's seminal 1904 article, his general intelligence hypothesis remains one of the most influential in psychology. Less well known is the article's other hypothesis that there is "a correspondence between what may provisionally be called 'General Discrimination' and 'General Intelligence' which works out with great approximation to one or absoluteness" (Spearman, 1904, p. 284). Studies that do not find high correlations between psychometric intelligence and single sensory discrimination tests do not falsify this hypothesis. This study is the first directly to address Spearman's general intelligence-general sensory discrimination hypothesis. It attempts to replicate his findings with a similar sample of schoolchildren. In a well-fitting structural equation model of the data, general intelligence and general discrimination correlated .92. In a reanalysis of data published by Acton and Schroeder (2001), general intelligence and general sensory ability correlated .68 in men and women. One hundred years after its conception, Spearman's other hypothesis achieves some confirmation. The association between general intelligence and general sensory ability remains to be replicated and explained.

  10. Dynamic test input generation for multiple-fault isolation

    NASA Technical Reports Server (NTRS)

    Schaefer, Phil

    1990-01-01

    Recent work in Causal Reasoning has provided practical techniques for multiple fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle. Using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.
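
    A simplified sketch of the underlying idea, choosing the test input that minimizes the expected posterior entropy over fault hypotheses, follows; this is not the MPC implementation itself, and the hypotheses, outcome likelihoods, and test names are all hypothetical:

    ```python
    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical fault hypotheses with current probabilities.
    hypotheses = {"H_ok": 0.5, "H_fault_A": 0.3, "H_fault_B": 0.2}

    # For each candidate test input, P(pass | hypothesis).
    likelihoods = {
        "test1": {"H_ok": 0.9, "H_fault_A": 0.2, "H_fault_B": 0.8},
        "test2": {"H_ok": 0.9, "H_fault_A": 0.8, "H_fault_B": 0.1},
    }

    def expected_posterior_entropy(test):
        """Average entropy of the Bayesian posterior over both test outcomes."""
        exp_h = 0.0
        for outcome in ("pass", "fail"):
            joint = {h: (likelihoods[test][h] if outcome == "pass"
                         else 1 - likelihoods[test][h]) * p
                     for h, p in hypotheses.items()}
            p_obs = sum(joint.values())
            posterior = [v / p_obs for v in joint.values()]
            exp_h += p_obs * entropy(posterior)
        return exp_h

    best = min(likelihoods, key=expected_posterior_entropy)
    print("prior entropy:", round(entropy(hypotheses.values()), 3))
    print("choose:", best)
    ```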

  11. A framework to optimize the restoration and retention of large mature forest tracts in managed boreal landscapes.

    PubMed

    Bouchard, Mathieu; Garet, Jérôme

    The decreasing abundance of mature forests and their fragmentation have been identified as major threats to the preservation of biodiversity in managed landscapes. In this study, we developed a multi-level framework to coordinate forest harvesting so as to optimize the retention or restoration of large mature forest tracts in managed forests. We used mixed-integer programming for this optimization, and integrated realistic management assumptions regarding stand yield and operational harvest constraints. The model was parameterized for eastern Canadian boreal forests, where clear-cutting is the main silvicultural system, and used to examine two hypotheses. First, we tested whether mature forest tract targets had more negative impacts on wood supplies when implemented in landscapes that are very different from targeted conditions. Second, we tested the hypothesis that using more partial cuts can attenuate the negative impacts of mature forest targets on wood supplies. The results indicate that without the integration of an explicit mature forest tract target, the optimization leads to relatively high fragmentation levels. Forcing the retention or restoration of large mature forest tracts on 40% of the landscapes had negative impacts on wood supplies in all types of landscapes, but these impacts were less important in landscapes that were initially fragmented. This counter-intuitive result is explained by the presence in the models of an operational constraint that forbids diffuse harvest patterns, which are more costly. Once this constraint is applied, the residual impact of the mature forest tract target is low. The results also indicate that partial cuts are of very limited use for attenuating the impacts of mature forest tract targets on wood supplies in highly fragmented landscapes. Partial cuts are somewhat more useful in landscapes that are less fragmented, but they have to be well coordinated with clearcut schedules in order to contribute efficiently to conservation objectives. This modeling framework could easily be adapted and parameterized to test hypotheses or to optimize restoration schedules in landscapes where forest fragmentation and the abundance of mature or old-growth forests are a concern.

  12. Incentive payments are not related to expected health gain in the pay for performance scheme for UK primary care: cross-sectional analysis

    PubMed Central

    2012-01-01

    Background The General Medical Services primary care contract for the United Kingdom financially rewards performance in 19 clinical areas, through the Quality and Outcomes Framework. Little is known about how best to determine the size of financial incentives in pay for performance schemes. Our aim was to test the hypothesis that performance indicators with larger population health benefits receive larger financial incentives. Methods We performed cross sectional analyses to quantify associations between the size of financial incentives and expected health gain in the 2004 and 2006 versions of the Quality and Outcomes Framework. We used non-parametric two-sided Spearman rank correlation tests. Health gain was measured in expected lives saved in one year and in quality adjusted life years. For each quality indicator in an average sized general practice we tested for associations first, between the marginal increase in payment and the health gain resulting from a one percent point improvement in performance and second, between total payment and the health gain at the performance threshold for maximum payment. Results Evidence for lives saved or quality adjusted life years gained was found for 28 indicators accounting for 41% of the total incentive payments. No statistically significant associations were found between the expected health gain and incentive gained from a marginal 1% increase in performance in either the 2004 or 2006 version of the Quality and Outcomes Framework. In addition no associations were found between the size of financial payment for achievement of an indicator and the expected health gain at the performance threshold for maximum payment measured in lives saved or quality adjusted life years. Conclusions In this subgroup of indicators the financial incentives were not aligned to maximise health gain. This disconnection between incentive and expected health gain risks supporting clinical activities that are only marginally effective, at the expense of more effective activities receiving lower incentives. When designing pay for performance programmes decisions about the size of the financial incentive attached to an indicator should be informed by information on the health gain to be expected from that indicator. PMID:22507660
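
    The core statistical step described, a two-sided Spearman rank correlation between incentive size and expected health gain, can be sketched as follows; the per-indicator numbers are hypothetical stand-ins for the Quality and Outcomes Framework data:

    ```python
    from scipy.stats import spearmanr

    # Hypothetical per-indicator values: marginal payment (GBP) for a one-point
    # performance gain, and expected QALYs gained at that margin.
    marginal_payment = [120, 340, 80, 410, 150, 90, 275, 60]
    qaly_gain = [0.8, 0.3, 1.2, 0.1, 0.9, 0.5, 0.2, 1.0]

    rho, p = spearmanr(marginal_payment, qaly_gain)   # two-sided by default
    print(f"Spearman rho={rho:.2f}, p={p:.3f}")
    ```

    A non-significant (or negative) rho would indicate, as the authors report, that payments are not aligned with expected health gain.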

  13. In silico experiment system for testing hypothesis on gene functions using three condition specific biological networks.

    PubMed

    Lee, Chai-Jin; Kang, Dongwon; Lee, Sangseon; Lee, Sunwon; Kang, Jaewoo; Kim, Sun

    2018-05-25

    Determining the functions of a gene requires time-consuming, expensive biological experiments. Scientists can speed up this experimental process if literature information and biological networks are adequately provided. In this paper, we present a web-based information system that can perform in silico experiments of computationally testing hypotheses on the function of a gene. A hypothesis that is specified in English by the user is converted to genes using a literature and knowledge mining system called BEST. Condition-specific TF, miRNA and PPI (protein-protein interaction) networks are automatically generated by projecting gene and miRNA expression data onto template networks. An in silico experiment then tests how well the target genes are connected to the knockout gene through the condition-specific networks. The test result visualizes paths from the knockout gene to the target genes in the three networks. Statistical and information-theoretic scores are provided on the resulting web page to help scientists either accept or reject the hypothesis being tested. Our web-based system was extensively tested using three knockout data sets (E2f1, Lrrk2, and Dicer1). We were able to reproduce gene functions reported in the original research papers. In addition, we comprehensively tested with all disease names in MalaCards as hypotheses to show the effectiveness of our system. Our in silico experiment system can be very useful in suggesting biological mechanisms which can be further tested in vivo or in vitro. http://biohealth.snu.ac.kr/software/insilico/.

  14. Killeen's (2005) p_rep Coefficient: Logical and Mathematical Problems

    ERIC Educational Resources Information Center

    Maraun, Michael; Gabriel, Stephanie

    2010-01-01

    In his article, "An Alternative to Null-Hypothesis Significance Tests," Killeen (2005) urged the discipline to abandon the practice of p_obs-based null hypothesis testing and to quantify the signal-to-noise characteristics of experimental outcomes with replication probabilities. He described the coefficient that he…

  15. Using VITA Service Learning Experiences to Teach Hypothesis Testing and P-Value Analysis

    ERIC Educational Resources Information Center

    Drougas, Anne; Harrington, Steve

    2011-01-01

    This paper describes a hypothesis testing project designed to capture student interest and stimulate classroom interaction and communication. Using an online survey instrument, the authors collected student demographic information and data regarding university service learning experiences. Introductory statistics students performed a series of…

  16. A Rational Analysis of the Selection Task as Optimal Data Selection.

    ERIC Educational Resources Information Center

    Oaksford, Mike; Chater, Nick

    1994-01-01

    Experimental data on human reasoning in hypothesis-testing tasks is reassessed in light of a Bayesian model of optimal data selection in inductive hypothesis testing. The rational analysis provided by the model suggests that reasoning in such tasks may be rational rather than subject to systematic bias. (SLD)

  17. Random Effects Structure for Confirmatory Hypothesis Testing: Keep It Maximal

    ERIC Educational Resources Information Center

    Barr, Dale J.; Levy, Roger; Scheepers, Christoph; Tily, Harry J.

    2013-01-01

    Linear mixed-effects models (LMEMs) have become increasingly prominent in psycholinguistics and related areas. However, many researchers do not seem to appreciate how random effects structures affect the generalizability of an analysis. Here, we argue that researchers using LMEMs for confirmatory hypothesis testing should minimally adhere to the…

  18. The effects of rater bias and assessment method used to estimate disease severity on hypothesis testing

    USDA-ARS?s Scientific Manuscript database

    The effects of bias (over- and underestimates) in estimates of disease severity on hypothesis testing using different assessment methods were explored. Nearest percent estimates (NPE), the Horsfall-Barratt (H-B) scale, and two different linear category scales (10% increments, with and without addition...

  19. A Multivariate Test of the Bott Hypothesis in an Urban Irish Setting

    ERIC Educational Resources Information Center

    Gordon, Michael; Downing, Helen

    1978-01-01

    Using a sample of 686 married Irish women in Cork City, the Bott hypothesis was tested, and the results of a multivariate regression analysis revealed that neither network connectedness nor the strength of the respondent's emotional ties to the network had any explanatory power. (Author)

  20. Polarization, Definition, and Selective Media Learning.

    ERIC Educational Resources Information Center

    Tichenor, P. J.; And Others

    The traditional hypothesis that extreme attitudinal positions on controversial issues are likely to produce low understanding of messages on these issues--especially when the messages represent opposing views--is tested. Data for test of the hypothesis are from two field studies, each dealing with reader attitudes and decoding of one news article…

  1. The Lasting Effects of Introductory Economics Courses.

    ERIC Educational Resources Information Center

    Sanders, Philip

    1980-01-01

    Reports research which tests the Stigler Hypothesis. The hypothesis suggests that students who have taken introductory economics courses and those who have not show little difference in test performance five years after completing college. Results of the author's research illustrate that economics students do retain some knowledge of economics…

  2. Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis

    NASA Astrophysics Data System (ADS)

    Střelec, Luboš

    2011-09-01

    The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the efficient markets hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, i.e. some tests of normality and/or some graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests on financial data, which typically feature remote data points and other deviations from normality. This study also discusses results of simulation power studies of these tests for normality against selected alternatives. Based on the outcome of the power simulation study, selected normality tests were consequently used to verify the weak form of efficiency in Central European stock markets.
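
    The zero-breakdown property is easy to demonstrate: a single outlier appended to an otherwise Gaussian return series is enough to drive the moment-based Jarque-Bera statistic to reject normality. A minimal sketch with simulated daily returns:

    ```python
    import numpy as np
    from scipy.stats import shapiro, jarque_bera

    rng = np.random.default_rng(42)
    returns = rng.normal(0, 0.01, 250)   # roughly one year of IID daily returns

    for label, x in [("clean", returns),
                     ("one outlier", np.append(returns, 0.15))]:
        w, p_sw = shapiro(x)             # omnibus test, general use
        jb, p_jb = jarque_bera(x)        # skewness/kurtosis based, zero breakdown
        print(f"{label:12s} Shapiro-Wilk p={p_sw:.3f}  Jarque-Bera p={p_jb:.3g}")
    ```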

  3. Enhancing Concept Comprehension in a Web-Based Course Using a Framework Integrating the Learning Cycle with Variation Theory

    ERIC Educational Resources Information Center

    Hsu, Chih-Chao; Wang, Tzone-I

    2014-01-01

    Concept comprehension is an important foundation for more complex thoughts. To enhance concept comprehension, teachers of traditional classrooms have been using instructional strategies and specific course designs, which have been proven effective. It initiates a hypothesis that integrating instructional strategies in the course designs of an…

  4. Dynamic heterogeneity: a framework to promote ecological integration and hypothesis generation in urban systems

    Treesearch

    S. T. A. Pickett; M. L. Cadenasso; E. J. Rosi-Marshall; Ken Belt; P. M. Groffman; Morgan Grove; E. G. Irwin; S. S. Kaushal; S. L. LaDeau; C. H. Nilon; C. M. Swan; P. S. Warren

    2016-01-01

    Urban areas are understood to be extraordinarily spatially heterogeneous. Spatial heterogeneity, and its causes, consequences, and changes, are central to ecological science. The social sciences and urban design and planning professions also include spatial heterogeneity as a key concern. However, urban ecology, as a pursuit that integrates across these disciplines,...

  5. Declining Symptom of Academic Productivity in the Japanese Research University Sector

    ERIC Educational Resources Information Center

    Arimoto, Akira

    2015-01-01

    Within the framework of this study, modern society may be explained by a paradigm transformation from ascription to achievement, and from particularism to universalism. According to this hypothesis, Japanese university society has not yet successfully developed from a closed society into an open society. This paper intends to deal with the…

  6. Evaluation of temperature history of a spherical nanosystem irradiated with various short-pulse laser sources

    NASA Astrophysics Data System (ADS)

    Lahiri, Arnab; Mondal, Pranab K.

    2018-04-01

    Spatiotemporal thermal response and characteristics of the net entropy production rate of a gold nanosphere (radius: 50-200 nm), subjected to a short-pulse, femtosecond laser are reported. In order to correctly illustrate the temperature history of laser-metal interaction(s) at picosecond transients with a comprehensive single temperature definition in macroscale, and to further understand how the thermophysical responses of the single-phase lag (SPL) and dual-phase lag (DPL) frameworks (with various lag ratios) differ, governing energy equations derived from these benchmark non-Fourier frameworks are numerically solved, and a thermodynamic assessment under both the classical irreversible thermodynamics (CIT) and extended irreversible thermodynamics (EIT) frameworks is subsequently carried out. Under the frameworks of SPL and DPL with small lag ratio, thermophysical anomalies such as temperature overshooting, characterized by an adverse temperature gradient, are observed to violate the local thermodynamic equilibrium (LTE) hypothesis. The EIT framework, however, justifies the compatibility of temperature overshooting with the second law of thermodynamics under a nonequilibrium paradigm. The DPL framework with a higher lag ratio was observed to remain free from temperature overshooting and is suitably consistent with the LTE hypothesis. In order to solve the dimensional non-Fourier governing energy equation with volumetric laser-irradiation source term(s), the lattice Boltzmann method (LBM) is extended and a three-time-level, fully implicit, second-order-accurate finite difference method (FDM) is illustrated. For all situations under observation, the LBM scheme is found to be computationally superior to the remaining FDM schemes. With detailed prediction of the maximum temperature rise and the corresponding peaking time by all the numerical schemes, the effects of the radius of the gold nanosphere, the laser fluence, and laser irradiation with multiple pulses on thermal energy transport and lagging behavior (if any) are further elucidated at different radial locations of the gold nanosphere. Last, efforts are made to address the thermophysical characteristics when effective thermal conductivity (with temporal and size effects) is considered instead of the usual bulk thermal conductivity.

  7. Concerns regarding a call for pluralism of information theory and hypothesis testing

    USGS Publications Warehouse

    Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.

    2007-01-01

    1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single-variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
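
    The multimodel inference advocated here is typically operationalized through AIC differences and Akaike weights, which quantify the relative evidence for each candidate model; a minimal sketch with hypothetical AIC values:

    ```python
    import math

    # Hypothetical AIC values for three candidate models fit to one data set.
    aic = {"habitat": 412.3, "habitat+rain": 408.1, "null": 420.7}

    best = min(aic.values())
    delta = {m: a - best for m, a in aic.items()}          # AIC differences
    rel = {m: math.exp(-d / 2) for m, d in delta.items()}  # relative likelihoods
    total = sum(rel.values())
    weights = {m: r / total for m, r in rel.items()}       # Akaike weights

    for m in sorted(weights, key=weights.get, reverse=True):
        print(f"{m:14s} dAIC={delta[m]:5.1f}  weight={weights[m]:.3f}")
    ```

    The weights sum to one and can be read as the evidence for each hypothesis given the model set, in contrast to a single p-value conditioned on a null model.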

  8. Characterizing and Addressing the Need for Statistical Adjustment of Global Climate Model Data

    NASA Astrophysics Data System (ADS)

    White, K. D.; Baker, B.; Mueller, C.; Villarini, G.; Foley, P.; Friedman, D.

    2017-12-01

    As part of its mission to research and measure the effects of the changing climate, the U. S. Army Corps of Engineers (USACE) regularly uses the World Climate Research Programme's Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model dataset. However, these data are generated at a global level and are not fine-tuned for specific watersheds. This often causes CMIP5 output to vary from locally observed patterns in the climate. Several downscaling methods have been developed to increase the resolution of the CMIP5 data and decrease systemic differences to support decision-makers as they evaluate results at the watershed scale. Evaluating preliminary comparisons of observed and projected flow frequency curves over the US revealed a simple framework for water resources decision makers to plan and design water resources management measures under changing conditions using standard tools. Using this framework as a basis, USACE has begun to explore to use of statistical adjustment to alter global climate model data to better match the locally observed patterns while preserving the general structure and behavior of the model data. When paired with careful measurement and hypothesis testing, statistical adjustment can be particularly effective at navigating the compromise between the locally observed patterns and the global climate model structures for decision makers.
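
    One common statistical-adjustment technique in this setting (one option among several, not necessarily the USACE procedure) is empirical quantile mapping, which pulls model output toward the locally observed distribution while preserving the model's relative structure; the gamma-distributed series below are synthetic:

    ```python
    import numpy as np

    def quantile_map(model_hist, obs, model_future):
        """Empirical quantile mapping: replace each model value with the observed
        value at the same quantile of the historical model distribution."""
        quantiles = np.interp(model_future,
                              np.sort(model_hist),
                              np.linspace(0, 1, len(model_hist)))
        return np.quantile(obs, quantiles)

    rng = np.random.default_rng(7)
    obs = rng.gamma(2.0, 3.0, 1000)          # observed daily flows/precipitation
    model_hist = rng.gamma(2.0, 2.4, 1000)   # model is biased low historically
    model_fut = rng.gamma(2.2, 2.4, 1000)    # projection inherits the bias

    corrected = quantile_map(model_hist, obs, model_fut)
    print(f"raw mean={model_fut.mean():.2f}, "
          f"corrected mean={corrected.mean():.2f}, obs mean={obs.mean():.2f}")
    ```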

  9. Microbial mitigation-exacerbation continuum: a novel framework for microbiome effects on hosts in the face of stress.

    PubMed

    David, Aaron S; Thapa-Magar, Khum B; Afkhami, Michelle E

    2018-03-01

    A key challenge to understanding microbiomes and their role in ecological processes is contextualizing their effects on host organisms, particularly when faced with environmental stress. One influential theory, the Stress Gradient Hypothesis, might predict that the frequency of positive interactions increases with stressful conditions such that microbial taxa would mitigate harmful effects on host performance. Yet, equally plausible is that microbial taxa could exacerbate these effects. Here, we introduce the Mitigation-Exacerbation Continuum as a novel framework to conceptualize microbial mediation of stress. We (1) use this continuum to quantify microbial mediation of stress for six plant species and (2) test the association between these continuum values and natural species' abundance. We factorially manipulated a common stress (allelopathy) and the presence of soil microbes to quantify microbial effects in benign and stressed environments for two critical early life-history metrics, seed germination and seedling biomass. Although we found evidence of both mitigation and exacerbation among the six species, exacerbation was more common. Across species, the degree of microbial-mediated effects on germination explained >80% of the variation of natural field abundances. Our results suggest a critical role of soil microbes in mediating plant stress responses, and a potential microbial mechanism underlying species abundance. © 2018 by the Ecological Society of America.

  10. Investigating Soil Moisture Feedbacks on Precipitation With Tests of Granger Causality

    NASA Astrophysics Data System (ADS)

    Salvucci, G. D.; Saleem, J. A.; Kaufmann, R.

    2002-05-01

    Granger causality (GC) is used in the econometrics literature to identify the presence of one- and two-way coupling between terms in noisy multivariate dynamical systems. Here we test for the presence of GC to identify a soil moisture (S) feedback on precipitation (P) using data from Illinois. In this framework, S is said to Granger-cause P if F(Pt | At-dt) ≠ F(Pt | (A-S)t-dt), where F denotes the conditional distribution of P at time t, At-dt represents the set of all knowledge available at time t-dt, and (A-S)t-dt represents all knowledge available at t-dt except S. Critical for land-atmosphere interaction research is that At-dt includes all past information on P as well as S. Therefore the part of the relation between past soil moisture and current precipitation which results from precipitation autocorrelation and soil water balance will be accounted for and not attributed to causality. Tests for GC usually specify all relevant variables in a coupled vector autoregressive (VAR) model and then calculate the significance level of decreased predictability as various coupling coefficients are omitted. But because the data (daily precipitation and soil moisture) are distinctly non-Gaussian, we avoid using a VAR and instead express the daily precipitation events as a Markov model. We then test whether the probability of storm occurrence, conditioned on past information on precipitation, changes with information on soil moisture. Past information on precipitation is expressed both as the occurrence of previous-day precipitation (to account for storm-scale persistence) and as a simple soil-moisture-like precipitation-wetness index derived solely from precipitation (to account for seasonal-scale persistence). In this way only those fluctuations in moisture not attributable to past fluctuations in precipitation (e.g., those due to temperature) can influence the outcome of the test. The null hypothesis (no moisture influence) is evaluated by comparing observed changes in storm probability to Monte Carlo-simulated differences generated with unconditional occurrence probabilities. The null hypothesis is not rejected (p > 0.5), suggesting that, contrary to recently published results, insufficient evidence exists to support an influence of soil moisture on precipitation in Illinois.
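
    A stripped-down analogue of the Monte Carlo test described, omitting the conditioning on past precipitation that is central to the full Granger framework, compares the observed wet-minus-dry difference in storm occurrence probability against a permutation null; all data below are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic daily records: storm occurrence (0/1) and prior-day soil moisture,
    # generated here with no feedback, so the test should not reject.
    n = 3000
    soil = rng.uniform(0, 1, n)
    rain = (rng.uniform(0, 1, n) < 0.3).astype(int)

    wet = soil > np.median(soil)
    observed = rain[wet].mean() - rain[~wet].mean()   # wet-minus-dry storm probability

    # Permutation null: break any soil-rain link by shuffling the wet/dry labels.
    null = np.empty(2000)
    for i in range(null.size):
        perm = rng.permutation(wet)
        null[i] = rain[perm].mean() - rain[~perm].mean()

    p = np.mean(np.abs(null) >= abs(observed))
    print(f"observed diff = {observed:.4f}, Monte Carlo p = {p:.3f}")
    ```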

  11. CorSig: a general framework for estimating statistical significance of correlation and its application to gene co-expression analysis.

    PubMed

    Wang, Hong-Qiang; Tsai, Chung-Jui

    2013-01-01

    With the rapid increase of omics data, correlation analysis has become an indispensable tool for inferring meaningful associations from a large number of observations. Pearson correlation coefficient (PCC) and its variants are widely used for such purposes. However, it remains challenging to test whether an observed association is reliable both statistically and biologically. We present here a new method, CorSig, for statistical inference of correlation significance. CorSig is based on a biology-informed null hypothesis, i.e., testing whether the true PCC (ρ) between two variables is statistically larger than a user-specified PCC cutoff (τ), as opposed to the simple null hypothesis of ρ = 0 in existing methods, i.e., testing whether an association can be declared without a threshold. CorSig incorporates Fisher's Z transformation of the observed PCC (r), which facilitates use of standard techniques for p-value computation and multiple testing corrections. We compared CorSig against two methods: one uses a minimum PCC cutoff while the other (Zhu's procedure) controls correlation strength and statistical significance in two discrete steps. CorSig consistently outperformed these methods in various simulation data scenarios by balancing between false positives and false negatives. When tested on real-world Populus microarray data, CorSig effectively identified co-expressed genes in the flavonoid pathway, and discriminated between closely related gene family members for their differential association with flavonoid and lignin pathways. The p-values obtained by CorSig can be used as a stand-alone parameter for stratification of co-expressed genes according to their correlation strength in lieu of an arbitrary cutoff. CorSig requires one single tunable parameter, and can be readily extended to other correlation measures. Thus, CorSig should be useful for a wide range of applications, particularly for network analysis of high-dimensional genomic data. A web server for CorSig is provided at http://202.127.200.1:8080/probeWeb. R code for CorSig is freely available for non-commercial use at http://aspendb.uga.edu/downloads.
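
    The biology-informed null hypothesis described, testing ρ > τ rather than ρ = 0, can be sketched via Fisher's Z transformation: z = (atanh r - atanh τ)·sqrt(n - 3) is approximately standard normal under ρ = τ. One plausible reading of that test, not necessarily the authors' exact implementation:

    ```python
    import math
    from scipy.stats import norm

    def corsig_pvalue(r, n, tau=0.5):
        """One-sided p-value for H0: rho <= tau vs H1: rho > tau,
        via Fisher's Z transform of the observed PCC r over n samples."""
        z = (math.atanh(r) - math.atanh(tau)) * math.sqrt(n - 3)
        return norm.sf(z)   # 1 - Phi(z)

    print(f"r=0.70, n=50, tau=0.5 -> p={corsig_pvalue(0.70, 50, 0.5):.4f}")
    print(f"r=0.55, n=50, tau=0.5 -> p={corsig_pvalue(0.55, 50, 0.5):.4f}")
    ```

    Because each correlation yields an ordinary p-value, standard multiple-testing corrections can then be applied across gene pairs.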

  12. Testing the status-legitimacy hypothesis: A multilevel modeling approach to the perception of legitimacy in income distribution in 36 nations.

    PubMed

    Caricati, Luca

    2017-01-01

    The status-legitimacy hypothesis was tested by analyzing cross-national data about social inequality. Several indicators were used as indexes of social advantage: social class, personal income, and self-position in the social hierarchy. Moreover, inequality and freedom in nations, as indexed by Gini and by the human freedom index, were considered. Results from 36 nations worldwide showed no support for the status-legitimacy hypothesis. The perception that income distribution was fair tended to increase as social advantage increased. Moreover, national context increased the difference between advantaged and disadvantaged people in the perception of social fairness: Contrary to the status-legitimacy hypothesis, disadvantaged people were more likely than advantaged people to perceive income distribution as too large, and this difference increased in nations with greater freedom and equality. The implications for the status-legitimacy hypothesis are discussed.

  13. Tests of the Giant Impact Hypothesis

    NASA Technical Reports Server (NTRS)

    Jones, J. H.

    1998-01-01

    The giant impact hypothesis has gained popularity as a means of explaining a volatile-depleted Moon that still has a chemical affinity to the Earth. As Taylor's Axiom decrees, the best models of lunar origin are testable, but this is difficult with the giant impact model. The energy associated with the impact would be sufficient to totally melt and partially vaporize the Earth, and this means that there should be no geological vestige of earlier times. Accordingly, it is important to devise tests that may be used to evaluate the giant impact hypothesis. Three such tests are discussed here. None of these is supportive of the giant impact model, but neither do they disprove it.

  14. Age Dedifferentiation Hypothesis: Evidence from the WAIS III.

    ERIC Educational Resources Information Center

    Juan-Espinosa, Manuel; Garcia, Luis F.; Escorial, Sergio; Rebollo, Irene; Colom, Roberto; Abad, Francisco J.

    2002-01-01

    Used the Spanish standardization of the Wechsler Adult Intelligence Scale III (WAIS III) (n=1,369) to test the age dedifferentiation hypothesis. Results show no changes in the percentage of variance accounted for by "g" and four group factors when restriction of range is controlled. Discusses an age indifferentiation hypothesis. (SLD)

  15. Hypothesis tests for the detection of constant speed radiation moving sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir

    2015-07-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single-channel and multichannel detection algorithms, which are inefficient at too low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on empirically estimated means and variances of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods that takes advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that, in the four relevant configurations (a pedestrian source carrier and a vehicle source carrier, each under high and low count-rate radioactive backgrounds), the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarms, while guaranteeing the stability of its optimization parameter across signal-to-noise ratio variations between 2 and 0.8.
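
    A minimal sketch of the single-window Poisson decision the abstract contrasts with empirical mean/variance tests; the network-level temporal correlation logic is omitted, and the function name, background rate, and alarm threshold below are illustrative assumptions.

        from scipy import stats

        def poisson_detection_pvalue(x, background_mean):
            """P(X >= x) under H0: X ~ Poisson(background_mean), i.e. background only."""
            return stats.poisson.sf(x - 1, background_mean)  # exact upper tail

        # Example: 12 counts observed in a window where background predicts 6
        p = poisson_detection_pvalue(12, 6.0)
        alarm = p < 1e-3  # the threshold fixes the false-alarm probability
        print(p, alarm)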

  16. Multiple Hypothesis Testing for Experimental Gingivitis Based on Wilcoxon Signed Rank Statistics

    PubMed Central

    Preisser, John S.; Sen, Pranab K.; Offenbacher, Steven

    2011-01-01

    Dental research often involves repeated multivariate outcomes on a small number of subjects, where the interest lies in identifying outcomes that change over time and in characterizing the nature of that change. In particular, periodontal research often involves the analysis of molecular mediators of inflammation, for which multivariate parametric methods are highly sensitive to outliers and deviations from Gaussian assumptions. In such settings, nonparametric methods may be favored over parametric ones. Additionally, there is a need for statistical methods that control an overall error rate for multiple hypothesis testing. We review univariate and multivariate nonparametric hypothesis tests and apply them to longitudinal data to assess changes over time in 31 biomarkers measured from the gingival crevicular fluid of 22 subjects in whom gingivitis was induced by temporarily withholding tooth brushing. To identify biomarkers that can be induced to change, multivariate Wilcoxon signed rank tests for a set of four summary measures based upon area under the curve are applied for each biomarker and compared to their univariate counterparts. Multiple hypothesis testing methods offering control of the false discovery rate or strong control of the family-wise error rate are examined. PMID:21984957
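
    As an illustration of the univariate building block, a hedged sketch: one Wilcoxon signed-rank test per biomarker on simulated paired data, followed by Benjamini-Hochberg FDR control (method="holm" would give strong FWER control instead). The dimensions mirror the study (22 subjects, 31 biomarkers), but the data and names are assumptions.

        import numpy as np
        from scipy.stats import wilcoxon
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(0)
        n_subjects, n_biomarkers = 22, 31
        baseline = rng.normal(size=(n_subjects, n_biomarkers))
        induced = baseline + rng.normal(0.3, 1.0, size=(n_subjects, n_biomarkers))

        # Paired signed-rank test per biomarker, then multiplicity correction
        pvals = [wilcoxon(induced[:, j], baseline[:, j]).pvalue
                 for j in range(n_biomarkers)]
        reject_fdr, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        print("biomarkers flagged at 5% FDR:", np.flatnonzero(reject_fdr))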

  17. Filling the gap in functional trait databases: use of ecological hypotheses to replace missing data.

    PubMed

    Taugourdeau, Simon; Villerd, Jean; Plantureux, Sylvain; Huguenin-Elie, Olivier; Amiaud, Bernard

    2014-04-01

    Functional trait databases are powerful tools in ecology, though most contain large amounts of missing values. The goal of this study was to test the effect of imputation methods on the evaluation of trait values at the species level and on the subsequent calculation of functional diversity indices at the community level. Two simple imputation methods (average and median), two methods based on ecological hypotheses, and one multiple imputation method were tested using a large plant trait database, together with the influence of the percentage of missing data and of differences between functional traits. At the community level, the complete-case approach and three functional diversity indices calculated from grassland plant communities were included. At the species level, one of the methods based on ecological hypotheses was more accurate than imputation with average or median values for all traits, but the multiple imputation method was superior for most traits. The method based on functional proximity between species was best for traits with an unbalanced distribution, while the method based on relationships between traits was best for traits with a balanced distribution. The ranking of the grassland communities by their functional diversity indices was not robust under the complete-case approach, even for low percentages of missing data. With the imputation methods based on ecological hypotheses, functional diversity indices could be computed with up to 30% missing data without affecting the ranking of the grassland communities. The multiple imputation method performed well, but no better than single imputation based on ecological hypotheses and adapted to the distribution of the trait values, for the functional identity and range of the communities. Ecological studies using functional trait databases have to deal with missing data using imputation methods that match their specific needs and make the most of the information available in the databases. Within this framework, this study indicates the possibilities and limits of single imputation methods based on ecological hypotheses and concludes that they could be useful when studying the ranking of communities by their functional diversity indices.
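
    A hedged sketch contrasting two of the strategies compared above: per-trait mean filling versus imputation that exploits relationships between traits. scikit-learn's IterativeImputer stands in here for the hypothesis-based methods; the authors' actual methods differ, and the simulated traits and 30% missingness are illustrative.

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import SimpleImputer, IterativeImputer

        rng = np.random.default_rng(1)
        traits = rng.normal(size=(200, 4))
        traits[:, 3] = 0.8 * traits[:, 0] + rng.normal(0, 0.2, 200)  # correlated traits
        mask = rng.random(traits.shape) < 0.3                        # 30% missing at random
        incomplete = np.where(mask, np.nan, traits)

        mean_filled = SimpleImputer(strategy="mean").fit_transform(incomplete)
        model_filled = IterativeImputer(random_state=0).fit_transform(incomplete)

    The relationship-based fill recovers the correlated fourth trait far better than the mean fill, which is the pattern the study reports for traits with balanced distributions.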

  19. A default Bayesian hypothesis test for mediation.

    PubMed

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
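
    A minimal sketch of the quantities under test: the mediation paths a (X → M) and b (M → Y given X), whose product a·b is the mediated effect. The paper's contribution is a default Bayesian (JZS) test of this effect; the frequentist ordinary-least-squares sketch below only fixes the model structure, and all names and data are illustrative.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        x = rng.normal(size=300)                      # e.g., classroom instruction
        m = 0.5 * x + rng.normal(size=300)            # mediator: diet knowledge
        y = 0.4 * m + 0.1 * x + rng.normal(size=300)  # outcome: fruit/vegetable intake

        a = sm.OLS(m, sm.add_constant(x)).fit().params[1]  # path a: X -> M
        b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # path b: M -> Y | X
        print("estimated mediated effect a*b:", a * b)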

  20. Knowledge Base Refinement as Improving an Incorrect and Incomplete Domain Theory

    DTIC Science & Technology

    1990-04-01

    ...Ginsberg et al., 1985), and RL (Fu and Buchanan, 1985), which perform empirical induction over a library of test cases. This chapter describes a new...state knowledge. Examples of high-level goals are: to test a hypothesis, to differentiate between several plausible hypotheses, to ask a clarifying... [remainder of excerpt is unrecoverable residue of a goal-hierarchy diagram: Group Hypotheses, Test Hypothesis, Applyrule, Findout, with associated strategy metarules]
