On Restructurable Control System Theory
NASA Technical Reports Server (NTRS)
Athans, M.
1983-01-01
The state of stochastic system and control theory as it impacts restructurable control issues is addressed. The multivariable characteristics of the control problem are addressed. The failure detection/identification problem is discussed as a multi-hypothesis testing problem. Control strategy reconfiguration, static multivariable controls, static failure hypothesis testing, dynamic multivariable controls, fault-tolerant control theory, dynamic hypothesis testing, generalized likelihood ratio (GLR) methods, and adaptive control are discussed.
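As a hedged illustration of the GLR approach named in this abstract (not the report's own formulation), the sketch below tests a known fault signature of unknown amplitude against Gaussian residuals; the signature, noise level, and data are invented.

```python
# Minimal GLR sketch for failure detection in Gaussian residuals.
# Assumptions (illustrative only): under H0 the stacked residual vector
# r ~ N(0, sigma^2 I); under H1, r ~ N(nu * g, sigma^2 I) with a known
# fault signature g and an unknown amplitude nu.
import numpy as np
from scipy import stats

def glr_statistic(r, g, sigma):
    """Twice the maximized log-likelihood ratio; chi-square(1) under H0."""
    r, g = np.asarray(r, float), np.asarray(g, float)
    return (g @ r) ** 2 / (sigma ** 2 * (g @ g))

rng = np.random.default_rng(0)
sigma, g = 1.0, np.ones(20)                    # constant-bias fault signature
r_h0 = rng.normal(0.0, sigma, 20)              # healthy residuals
r_h1 = 0.8 * g + rng.normal(0.0, sigma, 20)    # residuals with an actual fault

for label, r in [("H0", r_h0), ("H1", r_h1)]:
    t = glr_statistic(r, g, sigma)
    print(f"{label}: GLR = {t:.2f}, p = {stats.chi2.sf(t, df=1):.4f}")
```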
Haller, Moira; Chassin, Laurie
2014-09-01
The present study utilized longitudinal data from a community sample (n = 377; 166 trauma-exposed; 54% males; 73% non-Hispanic Caucasian; 22% Hispanic; 5% other ethnicity) to test whether pretrauma substance use problems increase risk for trauma exposure (high-risk hypothesis) or posttraumatic stress disorder (PTSD) symptoms (susceptibility hypothesis), whether PTSD symptoms increase risk for later alcohol/drug problems (self-medication hypothesis), and whether the association between PTSD symptoms and alcohol/drug problems is attributable to shared risk factors (shared vulnerability hypothesis). Logistic and negative binomial regressions were performed in a path analysis framework. Results provided the strongest support for the self-medication hypothesis, such that PTSD symptoms predicted higher levels of later alcohol and drug problems, over and above the influences of pretrauma family risk factors, pretrauma substance use problems, trauma exposure, and demographic variables. Results partially supported the high-risk hypothesis, such that adolescent substance use problems increased risk for assaultive violence exposure but did not influence overall risk for trauma exposure. There was no support for the susceptibility hypothesis. Finally, there was little support for the shared vulnerability hypothesis. Neither trauma exposure nor preexisting family adversity accounted for the link between PTSD symptoms and later substance use problems. Rather, PTSD symptoms mediated the effect of pretrauma family adversity on later alcohol and drug problems, thereby supporting the self-medication hypothesis. These findings make important contributions to better understanding the directions of influence among traumatic stress, PTSD symptoms, and substance use problems.
Knowledge dimensions in hypothesis test problems
NASA Astrophysics Data System (ADS)
Krishnan, Saras; Idris, Noraini
2012-05-01
The reformation in statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on the formulas and calculation procedures. Meanwhile, conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework to describe learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from more connected understanding. This study identifies the factual, procedural and conceptual knowledge dimensions in hypothesis test problems. The hypothesis test, being an important tool in making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty in understanding the underlying concepts of the hypothesis test. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale of executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems in this study, suitable instructional and assessment strategies can be developed in future to enhance students' learning of the hypothesis test as a valuable inferential tool.
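To make the contrast concrete, the procedural steps the abstract refers to can be reduced to a few lines (a one-sample t-test on invented scores); the conceptual knowledge the study emphasizes concerns why these steps are justified, not merely how to execute them.

```python
# Procedural side of a hypothesis test: a one-sample t-test worked end to end.
# The scores and the null value are invented for illustration.
import numpy as np
from scipy import stats

scores = np.array([72, 68, 75, 80, 66, 71, 77, 74, 69, 73])  # hypothetical sample
mu0 = 70                                                      # H0: population mean = 70

t_stat, p_value = stats.ttest_1samp(scores, popmean=mu0)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
print("Reject H0 at alpha = 0.05" if p_value < 0.05 else "Fail to reject H0")
```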
Bayesian inference for psychology. Part II: Example applications with JASP.
Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D
2018-02-01
Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP ( http://www.jasp-stats.org ), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
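JASP's defaults are JZS Bayes factors adapted from the BayesFactor package; as a rough sketch of the underlying idea only (not JASP's actual computation), the BIC approximation to a one-sample Bayes factor can be written in a few lines.

```python
# BIC approximation BF10 ≈ exp((BIC0 - BIC1) / 2) for a one-sample location
# test, H0: mu = 0 versus H1: mu free.  This is NOT the default JZS Bayes
# factor computed by JASP/BayesFactor; it only illustrates quantifying
# evidence for H1 over H0 from the data.
import numpy as np

def bic_bayes_factor_one_sample(x):
    x = np.asarray(x, float)
    n = len(x)
    rss0 = np.sum(x ** 2)                    # residual sum of squares under H0
    rss1 = np.sum((x - x.mean()) ** 2)       # ... under H1 (mean estimated)
    bic0 = n * np.log(rss0 / n)              # shared constants cancel in the difference
    bic1 = n * np.log(rss1 / n) + np.log(n)  # penalty for the extra mean parameter
    return np.exp((bic0 - bic1) / 2.0)       # BF10

rng = np.random.default_rng(1)
print(bic_bayes_factor_one_sample(rng.normal(0.5, 1.0, 30)))  # evidence for H1
print(bic_bayes_factor_one_sample(rng.normal(0.0, 1.0, 30)))  # little evidence
```

Values well above 1 favor the alternative; values near or below 1 favor the null.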
Hypothesis testing of scientific Monte Carlo calculations.
Wallerberger, Markus; Gull, Emanuel
2017-11-01
The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
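The kind of test the authors advocate can be sketched generically (this illustrates the idea and is not their code): compare a Monte Carlo estimate against a known exact value, scaled by the estimator's standard error, and fail the test only on a statistically significant discrepancy.

```python
# Statistical unit test for a Monte Carlo routine: estimate E[X^2] for
# X ~ Uniform(0, 1), whose exact value is 1/3, and raise an error only if
# the deviation is statistically significant at level alpha.
import numpy as np
from scipy import stats

def mc_estimate(n, rng):
    samples = rng.uniform(0.0, 1.0, n) ** 2
    return samples.mean(), samples.std(ddof=1) / np.sqrt(n)

def test_mc_routine(alpha=0.01, n=100_000, seed=42):
    est, stderr = mc_estimate(n, np.random.default_rng(seed))
    z = (est - 1.0 / 3.0) / stderr
    p = 2.0 * stats.norm.sf(abs(z))          # two-sided p-value
    assert p > alpha, f"MC estimate {est:.5f} deviates from 1/3 (p = {p:.3g})"

test_mc_routine()
print("Monte Carlo self-test passed")
```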
Assessment of Theory of Mind in Children with Communication Disorders: Role of Presentation Mode
ERIC Educational Resources Information Center
van Buijsen, Marit; Hendriks, Angelique; Ketelaars, Mieke; Verhoeven, Ludo
2011-01-01
Children with communication disorders have problems with both language and social interaction. The theory-of-mind hypothesis provides an explanation for these problems, and different tests have been developed to test this hypothesis. However, different modes of presentation are used in these tasks, which make the results difficult to compare. In…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, P.; Seth, D.L.; Ray, A.K.
A detailed and systematic study of the nature of the discretization error associated with the upwind finite-difference method is presented. A basic model problem has been identified and based upon the results for this problem, a basic hypothesis regarding the accuracy of the computational solution of the Spencer-Lewis equation is formulated. The basic hypothesis is then tested under various systematic single complexifications of the basic model problem. The results of these tests provide the framework of the refined hypothesis presented in the concluding comments. 27 refs., 3 figs., 14 tabs.
ERIC Educational Resources Information Center
Liu, Lisa L.; Lau, Anna S.; Chen, Angela Chia-Chen; Dinh, Khanh T.; Kim, Su Yeong
2009-01-01
Associations among neighborhood disadvantage, maternal acculturation, parenting and conduct problems were investigated in a sample of 444 Chinese American adolescents. Adolescents (54% female, 46% male) ranged from 12 to 15 years of age (mean age = 13.0 years). Multilevel modeling was employed to test the hypothesis that the association between…
Hypothesis testing in hydrology: Theory and practice
NASA Astrophysics Data System (ADS)
Kirchner, James; Pfister, Laurent
2017-04-01
Well-posed hypothesis tests have spurred major advances in hydrological theory. However, a random sample of recent research papers suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias - the tendency to value and trust confirmations more than refutations - among both researchers and reviewers. Hypothesis testing is not the only recipe for scientific progress, however: exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.
NASA Astrophysics Data System (ADS)
Lehmann, Rüdiger; Lösler, Michael
2017-12-01
Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the usage of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution for the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: A synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well-established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria like AIC.
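The two routes can be contrasted on a toy regression problem (unrelated to the levelling network or the Delft data set analysed in the paper): a classical test of a "no deformation" null against an added trend term, versus an AIC comparison of the same two models.

```python
# Toy contrast of hypothesis testing versus AIC for model selection:
# null model y = b0 + noise, alternative y = b0 + b1*t + noise.
# Data are simulated and purely illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 40)
y = 2.0 + 0.6 * t + rng.normal(0, 0.3, t.size)   # small simulated trend

def fit(X, y):
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2), X.shape[1]

rss0, k0 = fit(np.ones((t.size, 1)), y)                    # null model
rss1, k1 = fit(np.column_stack([np.ones_like(t), t]), y)   # alternative model
n = t.size

# Route 1: classical hypothesis test (F-test on the extra parameter)
F = ((rss0 - rss1) / (k1 - k0)) / (rss1 / (n - k1))
p = stats.f.sf(F, k1 - k0, n - k1)

# Route 2: information criterion (Gaussian AIC up to a shared constant)
aic0 = n * np.log(rss0 / n) + 2 * k0
aic1 = n * np.log(rss1 / n) + 2 * k1

print(f"F = {F:.2f}, p = {p:.4f} -> reject null: {p < 0.05}")
print(f"AIC null = {aic0:.1f}, AIC alt = {aic1:.1f} -> AIC prefers alt: {aic1 < aic0}")
```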
ERIC Educational Resources Information Center
Trafimow, David
2017-01-01
There has been much controversy over the null hypothesis significance testing procedure, with much of the criticism centered on the problem of inverse inference. Specifically, p gives the probability of the finding (or one more extreme) given the null hypothesis, whereas the null hypothesis significance testing procedure involves drawing a…
Computer-Assisted Problem Solving in School Mathematics
ERIC Educational Resources Information Center
Hatfield, Larry L.; Kieren, Thomas E.
1972-01-01
A test of the hypothesis that writing and using computer programs related to selected mathematical content positively affects performance on those topics. Results particularly support the hypothesis. (MM)
Killeen's (2005) "p[subscript rep]" Coefficient: Logical and Mathematical Problems
ERIC Educational Resources Information Center
Maraun, Michael; Gabriel, Stephanie
2010-01-01
In his article, "An Alternative to Null-Hypothesis Significance Tests," Killeen (2005) urged the discipline to abandon the practice of "p[subscript obs]"-based null hypothesis testing and to quantify the signal-to-noise characteristics of experimental outcomes with replication probabilities. He described the coefficient that he…
Testing the null hypothesis: the forgotten legacy of Karl Popper?
Wilkinson, Mick
2013-01-01
Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate the new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.
SOME EFFECTS OF DOGMATISM IN ELEMENTARY SCHOOL PRINCIPALS AND TEACHERS.
ERIC Educational Resources Information Center
BENTZEN, MARY M.
THE HYPOTHESIS THAT RATINGS ON CONGENIALITY AS A COWORKER GIVEN TO TEACHERS WILL BE IN PART A FUNCTION OF THE ORGANIZATIONAL STATUS OF THE RATER WAS TESTED. A SECONDARY PROBLEM WAS TO TEST THE HYPOTHESIS THAT DOGMATIC SUBJECTS MORE THAN NONDOGMATIC SUBJECTS WOULD EXHIBIT COGNITIVE BEHAVIOR WHICH INDICATED (1) GREATER DISTINCTION BETWEEN POSITIVE…
Thou Shalt Not Bear False Witness against Null Hypothesis Significance Testing
ERIC Educational Resources Information Center
García-Pérez, Miguel A.
2017-01-01
Null hypothesis significance testing (NHST) has been the subject of debate for decades and alternative approaches to data analysis have been proposed. This article addresses this debate from the perspective of scientific inquiry and inference. Inference is an inverse problem and application of statistical methods cannot reveal whether effects…
Quantum chi-squared and goodness of fit testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Temme, Kristan; Verstraete, Frank
2015-01-15
A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are very much related to each other, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d. quantum states is developed and a max-min characterization for the optimal measurement is introduced. We find the quantum measurement which leads both to the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.
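The classical counterpart referred to above is the ordinary chi-squared goodness-of-fit test, sketched below with invented counts; the quantum version additionally optimizes over the measurement basis, which this sketch does not capture.

```python
# Classical chi-squared goodness-of-fit test: are the observed counts
# consistent with the hypothesized distribution?  Counts are hypothetical.
import numpy as np
from scipy import stats

expected_probs = np.array([0.5, 0.3, 0.2])   # hypothesized distribution
observed = np.array([260, 140, 100])         # invented counts, n = 500

chi2, p = stats.chisquare(observed, f_exp=expected_probs * observed.sum())
print(f"chi-squared = {chi2:.2f}, p = {p:.3f}")
```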
New methods of testing nonlinear hypothesis using iterative NLLS estimator
NASA Astrophysics Data System (ADS)
Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.
2017-11-01
This research paper discusses the method of testing nonlinear hypotheses using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test the nonlinear hypothesis using the iterative NLLS estimator. An alternative method for testing nonlinear hypotheses using the iterative NLLS estimator based on nonlinear studentized residuals has also been proposed. In this research article an innovative method of testing nonlinear hypotheses using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jennrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustrations. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
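A generic Wald-type test of a nonlinear restriction on NLLS estimates, built with the delta method, conveys the flavour of the problem; this sketch uses invented data and is not one of the specific iterative NLLS test statistics derived in the paper.

```python
# Wald test of a nonlinear restriction g(beta) = 0 on nonlinear least-squares
# estimates, using the delta method for the variance of g(beta_hat).
import numpy as np
from scipy.optimize import curve_fit
from scipy import stats

def model(x, b0, b1):
    return b0 * np.exp(b1 * x)

rng = np.random.default_rng(7)
x = np.linspace(0, 2, 60)
y = model(x, 2.0, 0.5) + rng.normal(0, 0.2, x.size)   # true b0*b1 = 1

beta_hat, cov = curve_fit(model, x, y, p0=[1.0, 1.0])

# Restriction to test: g(beta) = b0 * b1 - 1 = 0
g = beta_hat[0] * beta_hat[1] - 1.0
grad = np.array([beta_hat[1], beta_hat[0]])    # gradient of g w.r.t. (b0, b1)
wald = g ** 2 / (grad @ cov @ grad)            # ~ chi-square(1) under H0
print(f"Wald statistic = {wald:.2f}, p = {stats.chi2.sf(wald, df=1):.4f}")
```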
Null but not void: considerations for hypothesis testing.
Shaw, Pamela A; Proschan, Michael A
2013-01-30
Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward.
Hypothesis Testing as an Act of Rationality
NASA Astrophysics Data System (ADS)
Nearing, Grey
2017-04-01
Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic - namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) logical calculus is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.
Debates—Hypothesis testing in hydrology: Theory and practice
NASA Astrophysics Data System (ADS)
Pfister, Laurent; Kirchner, James W.
2017-03-01
The basic structure of the scientific method—at least in its idealized form—is widely championed as a recipe for scientific progress, but the day-to-day practice may be different. Here, we explore the spectrum of current practice in hypothesis formulation and testing in hydrology, based on a random sample of recent research papers. This analysis suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias—the tendency to value and trust confirmations more than refutations—among both researchers and reviewers. Nonetheless, as several examples illustrate, hypothesis tests have played an essential role in spurring major advances in hydrological theory. Hypothesis testing is not the only recipe for scientific progress, however. Exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.
Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.
Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter
2015-12-01
Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions that are smaller than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments.
Testing jumps via false discovery rate control.
Yen, Yu-Min
2013-01-01
Many recently developed nonparametric jump tests can be viewed as multiple hypothesis testing problems. For such multiple hypothesis tests, it is well known that controlling the type I error often leads to a large proportion of erroneous rejections, and the situation becomes even worse when jump occurrence is a rare event. To obtain more reliable results, we aim to control the false discovery rate (FDR), an efficient compound error measure for erroneous rejections in multiple testing problems. We perform the test via the Barndorff-Nielsen and Shephard (BNS) test statistic, and control the FDR with the Benjamini and Hochberg (BH) procedure. We provide asymptotic results for the FDR control. From simulations, we examine relevant theoretical results and demonstrate the advantages of controlling the FDR. The hybrid approach is then applied to empirical analysis of two benchmark stock indices with high frequency data.
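The BH step applied to a vector of per-period p-values can be sketched as follows; the p-values here are simulated placeholders rather than BNS statistics computed from market data.

```python
# Benjamini-Hochberg step-up procedure for FDR control across many tests
# (e.g., one jump test per trading day).
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean array marking hypotheses rejected at FDR level q."""
    p = np.asarray(pvals, float)
    m = p.size
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])       # largest index meeting the bound
        reject[order[: k + 1]] = True
    return reject

rng = np.random.default_rng(11)
pvals = np.concatenate([rng.uniform(size=95), rng.uniform(0, 1e-3, size=5)])
print("rejections:", benjamini_hochberg(pvals, q=0.05).sum())
```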
Huang, Peng; Ou, Ai-hua; Piantadosi, Steven; Tan, Ming
2014-11-01
We discuss the problem of properly defining treatment superiority through the specification of hypotheses in clinical trials. The need to precisely define the notion of superiority in a one-sided hypothesis test problem has been well recognized by many authors. Ideally, the null and alternative hypotheses should be designed to correspond to a partition of all possible scenarios of underlying true probability models P = {P(ω): ω ∈ Ω}, such that the alternative hypothesis H_a = {P(ω): ω ∈ Ω_a} can be inferred upon the rejection of the null hypothesis H_o = {P(ω): ω ∈ Ω_o}. However, in many cases, tests are carried out and recommendations are made without a precise definition of superiority or a specification of the alternative hypothesis. Moreover, in some applications, the union of probability models specified by the chosen null and alternative hypotheses does not constitute the complete model collection P (i.e., H_o ∪ H_a is smaller than P). This not only imposes a strong non-validated assumption on the underlying true models, but also leads to different superiority claims depending on which test is used, instead of scientific plausibility. Different ways to partition P for testing treatment superiority often have different implications for sample size, power, and significance in both efficacy and comparative effectiveness trial design. Such differences are often overlooked. We provide a theoretical framework for evaluating the statistical properties of different specifications of superiority in typical hypothesis testing. This can help investigators to select proper hypotheses for treatment comparison in clinical trial design.
Revised standards for statistical evidence.
Johnson, Valen E
2013-11-26
Recent advances in Bayesian hypothesis testing have led to the development of uniformly most powerful Bayesian tests, which represent an objective, default class of Bayesian hypothesis tests that have the same rejection regions as classical significance tests. Based on the correspondence between these two classes of tests, it is possible to equate the size of classical hypothesis tests with evidence thresholds in Bayesian tests, and to equate P values with Bayes factors. An examination of these connections suggests that recent concerns over the lack of reproducibility of scientific studies can be attributed largely to the conduct of significance tests at unjustifiably high levels of significance. To correct this problem, evidence thresholds required for the declaration of a significant finding should be increased to 25-50:1, and to 100-200:1 for the declaration of a highly significant finding. In terms of classical hypothesis tests, these evidence standards mandate the conduct of tests at the 0.005 or 0.001 level of significance.
Distributed Immune Systems for Wireless Network Information Assurance
2010-04-26
ratio test (SPRT), where the goal is to optimize a hypothesis testing problem given a trade-off between the probability of errors and the...using cumulative sum (CUSUM) and Girshik-Rubin-Shiryaev (GRSh) statistics. In sequential versions of the problem the sequential probability ratio ...the more complicated problems, in particular those where no clear mean can be established. We developed algorithms based on the sequential probability
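The excerpt above is fragmentary, but the CUSUM detector it names has a standard one-sided form; the sketch below uses illustrative parameters k and h that are not taken from the report.

```python
# One-sided CUSUM detector for an upward shift in the mean of a monitored
# statistic.  Reference value k and decision threshold h are illustrative.
import numpy as np

def cusum_alarm(x, k=0.5, h=5.0):
    """Return the index of the first alarm, or None if no alarm is raised."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + xi - k)     # accumulate evidence of an upward shift
        if s > h:
            return i
    return None

rng = np.random.default_rng(5)
pre = rng.normal(0.0, 1.0, 200)      # in-control observations
post = rng.normal(1.5, 1.0, 50)      # mean shifts upward at t = 200
print("first alarm at index:", cusum_alarm(np.concatenate([pre, post])))
```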
Memory Inhibition as a Critical Factor Preventing Creative Problem Solving
ERIC Educational Resources Information Center
Gómez-Ariza, Carlos J.; del Prete, Francesco; Prieto del Val, Laura; Valle, Tania; Bajo, M. Teresa; Fernandez, Angel
2017-01-01
The hypothesis that reduced accessibility to relevant information can negatively affect problem solving in a remote associate test (RAT) was tested by using, immediately before the RAT, a retrieval practice procedure to hinder access to target solutions. The results of 2 experiments clearly showed that, relative to baseline, target words that had…
Problem Solving Ability of Disadvantaged Children Under Four Test Modes.
ERIC Educational Resources Information Center
Houtz, John C.; And Others
A study was conducted to test the hypothesis that Ss from disadvantaged homes have poorly developed "abstract" thinking skills and that their thought can be characterized as more "concrete" or relational. Four forms of a problem-solving inventory were developed which differed in mode of presentation. The original form consisted of real-life…
Bayesian models based on test statistics for multiple hypothesis testing problems.
Ji, Yuan; Lu, Yiling; Mills, Gordon B
2008-04-01
We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
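A stripped-down two-group version of modeling test statistics directly can be sketched as follows; the mixture parameters are fixed for illustration, whereas the paper estimates the null and alternative models and checks them with a graphical assessment tool.

```python
# Two-group sketch: z-statistics modeled as pi0*N(0,1) + (1-pi0)*N(3,1).
# Posterior null probabilities are used to select rejections while keeping
# the estimated Bayesian FDR below a target.  Parameters are illustrative.
import numpy as np
from scipy import stats

pi0, mu1 = 0.9, 3.0
rng = np.random.default_rng(2)
z = np.concatenate([rng.normal(0, 1, 900), rng.normal(mu1, 1, 100)])

f0 = stats.norm.pdf(z, 0.0, 1.0)
f1 = stats.norm.pdf(z, mu1, 1.0)
post_null = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)     # P(null | z)

# Reject the smallest posterior-null hypotheses while their running average
# (the estimated Bayesian FDR) stays below 0.05.
order = np.argsort(post_null)
running_fdr = np.cumsum(post_null[order]) / np.arange(1, z.size + 1)
print("rejections:", int(np.sum(running_fdr <= 0.05)))
```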
Asymmetrically dominated choice problems, the isolation hypothesis and random incentive mechanisms.
Cox, James C; Sadiraj, Vjollca; Schmidt, Ulrich
2014-01-01
This paper presents an experimental study of the random incentive mechanisms which are a standard procedure in economic and psychological experiments. Random incentive mechanisms have several advantages but are incentive-compatible only if responses to the single tasks are independent. This is true if either the independence axiom of expected utility theory or the isolation hypothesis of prospect theory holds. We present a simple test of this in the context of choice under risk. In the baseline (one task) treatment we observe risk behavior in a given choice problem. We show that by integrating a second, asymmetrically dominated choice problem in a random incentive mechanism risk behavior can be manipulated systematically. This implies that the isolation hypothesis is violated and the random incentive mechanism does not elicit true preferences in our example.
Do job demands and job control affect problem-solving?
Bergman, Peter N; Ahlberg, Gunnel; Johansson, Gun; Stoetzer, Ulrich; Aborg, Carl; Hallsten, Lennart; Lundberg, Ingvar
2012-01-01
The Job Demand Control model presents combinations of working conditions that may facilitate learning, the active learning hypothesis, or have detrimental effects on health, the strain hypothesis. To test the active learning hypothesis, this study analysed the effects of job demands and job control on general problem-solving strategies. A population-based sample of 4,636 individuals (55% women, 45% men) with the same job characteristics measured at two times with a three year time lag was used. Main effects of demands, skill discretion, task authority and control, and the combined effects of demands and control were analysed in logistic regressions, on four outcomes representing general problem-solving strategies. Those reporting high on skill discretion, task authority and control, as well as those reporting high demand/high control and low demand/high control job characteristics were more likely to state using problem solving strategies. Results suggest that working conditions including high levels of control may affect how individuals cope with problems and that workplace characteristics may affect behaviour in the non-work domain.
Does Problem Behavior Elicit Poor Parenting?: A Prospective Study of Adolescent Girls
ERIC Educational Resources Information Center
Huh, David; Tristan, Jennifer; Wade, Emily; Stice, Eric
2006-01-01
This study tested the hypothesis that perceived parenting would show reciprocal relations with adolescents' problem behavior using longitudinal data from 496 adolescent girls. Results provided support for the assertion that female problem behavior has an adverse effect on parenting; elevated externalizing symptoms and substance abuse symptoms…
ERIC Educational Resources Information Center
Lord, Frederic M.; Stocking, Martha
A general computer program is described that will compute asymptotic standard errors and carry out significance tests for an endless variety of (standard and) nonstandard large-sample statistical problems, without requiring the statistician to derive asymptotic standard error formulas. The program assumes that the observations have a multinormal…
ERIC Educational Resources Information Center
Hambrick, David Z.; Libarkin, Julie C.; Petcovic, Heather L.; Baker, Kathleen M.; Elkins, Joe; Callahan, Caitlin N.; Turner, Sheldon P.; Rench, Tara A.; LaDue, Nicole D.
2012-01-01
Sources of individual differences in scientific problem solving were investigated. Participants representing a wide range of experience in geology completed tests of visuospatial ability and geological knowledge, and performed a geological bedrock mapping task, in which they attempted to infer the geological structure of an area in the Tobacco…
Time out of Mind: Temporal Perspective in Adults with ADHD
ERIC Educational Resources Information Center
Carelli, Maria G.; Wiberg, Britt
2012-01-01
Objective: ADHD is often associated with difficulties in planning and time management. In this study, the authors examined the hypothesis that these functional problems in ADHD reflect systematic biases in temporal orientation. Method: To test this hypothesis, adults with ADHD (n = 30) and healthy controls (n = 60) completed the Swedish version of…
Learning Process and Vocational Experience Attainments.
ERIC Educational Resources Information Center
Colardyn, Danielle; White, Kathleen M.
From a search of (mostly French) literature, a hypothesis was formulated that students with both academic training and work experience would solve a practical learning problem more easily than students with academic learning only. A study was conducted at the Conservatoire National des Arts et Metiers in Paris to test this hypothesis. Two groups,…
ERIC Educational Resources Information Center
Erickson, Martha Farrell; And Others
1985-01-01
Tests hypothesis that young children who were anxiously attached would be more likely than securely attached children to have behavior problems in preschool. Examines particular patterns of anxious attachment in relation to specific problem behaviors. Studies child, parental, interactional, and environmental factors that account for behavior…
Community Differences in the Association between Parenting: Practices and Child Conduct Problems.
ERIC Educational Resources Information Center
Simons, Ronald L.; Lin, Kuei-Hsiu; Gordon, Leslie C.; Brody, Gene H.; Murry, Velma; Conger, Rand D.
2002-01-01
Surveys African American families (N=841) to test hypothesis that community context might influence the association between parent control and punishment on child conduct problems. Survey found the deterrent effect of caretaker control on conduct problems became smaller as deviant behavior became more widespread. Results suggest that a particular…
ERIC Educational Resources Information Center
Kohn, Nicholas; Smith, Steven M.
2009-01-01
Incubation has long been proposed as a mechanism in creative problem solving (Wallas, 1926). A new trial-by-trial method for observing incubation effects was used to compare the forgetting fixation hypothesis with the conscious work hypothesis. Two experiments examined the effects of incubation on initially unsolved Remote Associates Test (RAT)…
Statistical Smoothing Methods and Image Analysis
1988-12-01
83 - 111. Rosenfeld, A. and Kak, A.C. (1982). Digital Picture Processing. Academic Press, Orlando. Serra, J. (1982). Image Analysis and Mathematical ...hypothesis testing. IEEE Trans. Med. Imaging, MI-6, 313-319. Wicksell, S.D. (1925) The corpuscle problem. A mathematical study of a biometric problem
The continuum fusion theory of signal detection applied to a bi-modal fusion problem
NASA Astrophysics Data System (ADS)
Schaum, A.
2011-05-01
A new formalism has been developed that produces detection algorithms for model-based problems in which one or more parameter values are unknown. Continuum Fusion can be used to generate different flavors of algorithm for any composite hypothesis testing problem. The methodology is defined by a fusion logic that can be translated into max/min conditions. Here it is applied to a simple sensor fusion model, but one for which the generalized likelihood ratio test is intractable. By contrast, a fusion-based response to the same problem can be devised that is solvable in closed form and represents a good approximation to the GLR test.
NASA Technical Reports Server (NTRS)
Taylor, S. R.
1984-01-01
The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.
Problem Solving in Biology: A Methodology
ERIC Educational Resources Information Center
Wisehart, Gary; Mandell, Mark
2008-01-01
A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…
Is it better to select or to receive? Learning via active and passive hypothesis testing.
Markant, Douglas B; Gureckis, Todd M
2014-02-01
People can test hypotheses through either selection or reception. In a selection task, the learner actively chooses observations to test his or her beliefs, whereas in reception tasks data are passively encountered. People routinely use both forms of testing in everyday life, but the critical psychological differences between selection and reception learning remain poorly understood. One hypothesis is that selection learning improves learning performance by enhancing generic cognitive processes related to motivation, attention, and engagement. Alternatively, we suggest that differences between these 2 learning modes derive from a hypothesis-dependent sampling bias that is introduced when a person collects data to test his or her own individual hypothesis. Drawing on influential models of sequential hypothesis-testing behavior, we show that such a bias (a) can lead to the collection of data that facilitates learning compared with reception learning and (b) can be more effective than observing the selections of another person. We then report a novel experiment based on a popular category learning paradigm that compares reception and selection learning. We additionally compare selection learners to a set of "yoked" participants who viewed the exact same sequence of observations under reception conditions. The results revealed systematic differences in performance that depended on the learner's role in collecting information and the abstract structure of the problem.
Schulz-Heik, R Jay; Rhee, Soo Hyun; Silvern, Louise E; Haberstick, Brett C; Hopfer, Christian; Lessem, Jeffrey M; Hewitt, John K
2010-05-01
It is often assumed that childhood maltreatment causes conduct problems via an environmentally mediated process. However, the association may be due alternatively to either a nonpassive gene-environment correlation, in which parents react to children's genetically-influenced conduct problems by maltreating them, or a passive gene-environment correlation, in which parents' tendency to engage in maltreatment and children's conduct problems are both influenced by a hereditary vulnerability to antisocial behavior (i.e. genetic mediation). The present study estimated the contribution of these processes to the association between maltreatment and conduct problems. Bivariate behavior genetic analyses were conducted on approximately 1,650 twin and sibling pairs drawn from a large longitudinal study of adolescent health (Add Health). The correlation between maltreatment and conduct problems was small; much of the association between maltreatment and conduct problems was due to a nonpassive gene-environment correlation. Results were more consistent with the hypothesis that parents respond to children's genetically-influenced conduct problems by maltreating them than the hypothesis that maltreatment causes conduct problems.
The Role of Oral Output in Noticing and Promoting the Acquisition of Linguistic Forms
ERIC Educational Resources Information Center
Liu, Dan
2013-01-01
Many empirical studies carried out to test the three major functions of the Comprehensible Output Hypothesis proposed by Swain lend some support to the Hypothesis in one way or another. This study aims to investigate whether giving the Chinese EFL learners an opportunity for oral output encourages them to notice their linguistic problems in oral…
Increasing arousal enhances inhibitory control in calm but not excitable dogs
Bray, Emily E.; MacLean, Evan L.; Hare, Brian A.
2015-01-01
The emotional-reactivity hypothesis proposes that problem-solving abilities can be constrained by temperament, within and across species. One way to test this hypothesis is with the predictions of the Yerkes-Dodson law. The law posits that arousal level, a component of temperament, affects problem solving in an inverted U-shaped relationship: optimal performance is reached at intermediate levels of arousal and impeded by high and low levels. Thus, a powerful test of the emotional-reactivity hypothesis is to compare cognitive performance in dog populations that have been bred and trained based in part on their arousal levels. We therefore compared a group of pet dogs to a group of assistance dogs bred and trained for low arousal (N = 106) on a task of inhibitory control involving a detour response. Consistent with the Yerkes-Dodson law, assistance dogs, which began the test with lower levels of baseline arousal, showed improvements when arousal was artificially increased. In contrast, pet dogs, which began the test with higher levels of baseline arousal, were negatively affected when their arousal was increased. Furthermore, the dogs’ baseline levels of arousal, as measured in their rate of tail wagging, differed by population in the expected directions. Low-arousal assistance dogs showed the most inhibition in a detour task when humans eagerly encouraged them while more highly aroused pet dogs performed worst on the same task with strong encouragement. Our findings support the hypothesis that selection on temperament can have important implications for cognitive performance. PMID:26169659
Controlling Uncertainty: A Review of Human Behavior in Complex Dynamic Environments
ERIC Educational Resources Information Center
Osman, Magda
2010-01-01
Complex dynamic control (CDC) tasks are a type of problem-solving environment used for examining many cognitive activities (e.g., attention, control, decision making, hypothesis testing, implicit learning, memory, monitoring, planning, and problem solving). Because of their popularity, there have been many findings from diverse domains of research…
Conduct Problems, IQ, and Household Chaos: A Longitudinal Multi-Informant Study
ERIC Educational Resources Information Center
Deater-Deckard, Kirby; Mullineaux, Paula Y.; Beekman, Charles; Petrill, Stephen A.; Schatschneider, Chris; Thompson, Lee A.
2009-01-01
Background: We tested the hypothesis that household chaos would be associated with lower child IQ and more child conduct problems concurrently and longitudinally over two years while controlling for housing conditions, parent education/IQ, literacy environment, parental warmth/negativity, and stressful events. Methods: The sample included 302…
ERIC Educational Resources Information Center
Davies, Patrick T.; Martin, Meredith J.; Cummings, E. Mark
2018-01-01
Although social difficulties have been identified as sequelae of children's experiences with interparental conflict and insecurity, little is known about the specific mechanisms underlying their vulnerability to social problems. Guided by emotional security theory, this study tested the hypothesis that children's emotional insecurity mediates…
ERIC Educational Resources Information Center
Ngu, Bing Hiong; Yeung, Alexander Seeshing
2012-01-01
Holyoak and Koh (1987) and Holyoak (1984) propose four critical tasks for analogical transfer to occur in problem solving. A study was conducted to test this hypothesis by comparing a multiple components (MC) approach against worked examples (WE) in helping students to solve algebra word problems in chemistry classes. The MC approach incorporated…
Petschner, Péter; Bagdy, György; Tóthfalusi, Laszló
2015-03-01
One of the characteristics of many methods used in neuropsychopharmacology is that a large number of parameters (P) are measured in relatively few subjects (n). Functional magnetic resonance imaging, electroencephalography (EEG) and genomic studies are typical examples. For example, one microarray chip can contain thousands of probes. Therefore, in studies using microarray chips, P may be several thousand-fold larger than n. Statistical analysis of such studies is a challenging task, and in the statistical literature they are referred to as the small "n", big "P" problem. The problem has many facets, including the controversies associated with multiple hypothesis testing. A typical scenario in this context is when two or more groups are compared by the individual attributes. If the increased classification error due to the multiple testing is neglected, then several highly significant differences will be discovered. But in reality, some of these significant differences are coincidental, not reproducible findings. Several methods were proposed to solve this problem. In this review we discuss two of the proposed solutions, algorithms to compare sets and statistical hypothesis tests controlling the false discovery rate.
2014-01-01
Background In complex large-scale experiments, in addition to simultaneously considering a large number of features, multiple hypotheses are often being tested for each feature. This leads to a problem of multi-dimensional multiple testing. For example, in gene expression studies over ordered categories (such as time-course or dose-response experiments), interest is often in testing differential expression across several categories for each gene. In this paper, we consider a framework for testing multiple sets of hypothesis, which can be applied to a wide range of problems. Results We adopt the concept of the overall false discovery rate (OFDR) for controlling false discoveries on the hypothesis set level. Based on an existing procedure for identifying differentially expressed gene sets, we discuss a general two-step hierarchical hypothesis set testing procedure, which controls the overall false discovery rate under independence across hypothesis sets. In addition, we discuss the concept of the mixed-directional false discovery rate (mdFDR), and extend the general procedure to enable directional decisions for two-sided alternatives. We applied the framework to the case of microarray time-course/dose-response experiments, and proposed three procedures for testing differential expression and making multiple directional decisions for each gene. Simulation studies confirm the control of the OFDR and mdFDR by the proposed procedures under independence and positive correlations across genes. Simulation results also show that two of our new procedures achieve higher power than previous methods. Finally, the proposed methodology is applied to a microarray dose-response study, to identify 17 β-estradiol sensitive genes in breast cancer cells that are induced at low concentrations. Conclusions The framework we discuss provides a platform for multiple testing procedures covering situations involving two (or potentially more) sources of multiplicity. The framework is easy to use and adaptable to various practical settings that frequently occur in large-scale experiments. Procedures generated from the framework are shown to maintain control of the OFDR and mdFDR, quantities that are especially relevant in the case of multiple hypothesis set testing. The procedures work well in both simulations and real datasets, and are shown to have better power than existing methods. PMID:24731138
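A generic two-step sketch in the spirit of hierarchical hypothesis-set testing (screen the sets, then test within the selected sets at a scaled level) is shown below; it illustrates the structure only and is not the OFDR/mdFDR procedure proposed in the paper.

```python
# Two-step hierarchical testing sketch: combine p-values within each set
# (Bonferroni), screen sets with Benjamini-Hochberg, then test individual
# hypotheses only inside selected sets at a level scaled by the selection
# fraction.  All numbers are simulated placeholders.
import numpy as np

def bh_reject(p, q):
    m = p.size
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if below.any():
        reject[order[: np.max(np.nonzero(below)[0]) + 1]] = True
    return reject

rng = np.random.default_rng(8)
pvals = rng.uniform(size=(200, 3))            # 200 sets, 3 hypotheses per set
pvals[:20] = rng.uniform(0, 1e-3, (20, 3))    # 20 truly active sets

set_p = np.minimum(pvals.min(axis=1) * pvals.shape[1], 1.0)   # Bonferroni combination
selected = bh_reject(set_p, q=0.05)                           # step 1: screen sets
within_level = 0.05 * selected.sum() / pvals.shape[0]         # step 2: scaled level
within_reject = pvals[selected] <= within_level
print("selected sets:", selected.sum(), "| within-set rejections:", within_reject.sum())
```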
Spread of cattle led to the loss of matrilineal descent in Africa: a coevolutionary analysis.
Holden, Clare Janaki; Mace, Ruth
2003-01-01
Matrilineal descent is rare in human societies that keep large livestock. However, this negative correlation does not provide reliable evidence that livestock and descent rules are functionally related, because human cultures are not statistically independent owing to their historical relationships (Galton's problem). We tested the hypothesis that when matrilineal cultures acquire cattle they become patrilineal using a sample of 68 Bantu- and Bantoid-speaking populations from sub-Saharan Africa. We used a phylogenetic comparative method to control for Galton's problem, and a maximum-parsimony Bantu language tree as a model of population history. We tested for coevolution between cattle and descent. We also tested the direction of cultural evolution--were cattle acquired before matriliny was lost? The results support the hypothesis that acquiring cattle led formerly matrilineal Bantu-speaking cultures to change to patrilineal or mixed descent. We discuss possible reasons for matriliny's association with horticulture and its rarity in pastoralist societies. We outline the daughter-biased parental investment hypothesis for matriliny, which is supported by data on sex, wealth and reproductive success from two African societies, the matrilineal Chewa in Malawi and the patrilineal Gabbra in Kenya. PMID:14667331
NASA Technical Reports Server (NTRS)
Paine, D. A.; Zack, J. W.; Kaplan, M. L.
1979-01-01
The progress and problems associated with the dynamical forecast system which was developed to predict severe storms are examined. The meteorological problem of severe convective storm forecasting is reviewed. The cascade hypothesis which forms the theoretical core of the nested grid dynamical numerical modelling system is described. The dynamical and numerical structure of the model used during the 1978 test period is presented and a preliminary description of a proposed multigrid system for future experiments and tests is provided. Six cases from the spring of 1978 are discussed to illustrate the model's performance and its problems. Potential solutions to the problems are examined.
Hall, Matthew L.; Eigsti, Inge-Marie; Bortfeld, Heather; Lillo-Martin, Diane
2017-01-01
Deaf children are often described as having difficulty with executive function (EF), often manifesting in behavioral problems. Some researchers view these problems as a consequence of auditory deprivation; however, the behavioral problems observed in previous studies may not be due to deafness but to some other factor, such as lack of early language exposure. Here, we distinguish these accounts by using the BRIEF EF parent report questionnaire to test for behavioral problems in a group of Deaf children from Deaf families, who have a history of auditory but not language deprivation. For these children, the auditory deprivation hypothesis predicts behavioral impairments; the language deprivation hypothesis predicts no group differences in behavioral control. Results indicated that scores among the Deaf native signers (n = 42) were age-appropriate and similar to scores among the typically developing hearing sample (n = 45). These findings are most consistent with the language deprivation hypothesis, and provide a foundation for continued research on outcomes of children with early exposure to sign language. PMID:27624307
Reflectiveness/Impulsiveness and Mathematics Achievement
ERIC Educational Resources Information Center
Cathcart, W. George; Liedtke, Werner
1969-01-01
Report of research to test the hypothesis that reflective students would be higher achievers in mathematics than impulsive pupils. An achievement test was developed to measure understanding of mathematical concepts and applications, ability to solve verbal problems and recall basic facts. Data suggest that reflective students obtain better…
Goodnight, Jackson A.; Lahey, Benjamin B.; Van Hulle, Carol A.; Rodgers, Joseph L.; Rathouz, Paul J.; Waldman, Irwin D.; D’Onofrio, Brian M.
2012-01-01
A quasi-experimental comparison of cousins differentially exposed to levels of neighborhood disadvantage (ND) was used with extensive measured covariates to test the hypothesis that neighborhood risk has independent effects on youth conduct problems (CPs). Multilevel analyses were based on mother-rated ND and both mother-reported CPs across 4–13 years (n = 7,077) and youth-reported CPs across 10–13 years (n = 4,524) from the Children of the National Longitudinal Survey of Youth. ND was robustly related to CPs reported by both informants when controlling for both measured risk factors that are correlated with ND and unmeasured confounds. These findings are consistent with the hypothesis that ND has influence on conduct problems. PMID:21942334
How did you guess? Or, what do multiple-choice questions measure?
Cox, K R
1976-06-05
Multiple-choice questions classified as requiring problem-solving skills have been interpreted as measuring problem-solving skills within students, with the implicit hypothesis that questions needing an increasingly complex intellectual process should present increasing difficulty to the student. This hypothesis was tested in a 150-question paper taken by 721 students in seven Australian medical schools. No correlation was observed between difficulty and assigned process. Consequently, the question-answering process was explored with a group of final-year students. Anecdotal recall by students gave heavy weight to knowledge rather than problem solving in answering these questions. Assignment of the 150 questions to the classification by three teachers and six students showed their congruence to be a little above random probability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Audenaert, Koenraad M. R., E-mail: koenraad.audenaert@rhul.ac.uk; Department of Physics and Astronomy, University of Ghent, S9, Krijgslaan 281, B-9000 Ghent; Mosonyi, Milán, E-mail: milan.mosonyi@gmail.com
2014-10-01
We consider the multiple hypothesis testing problem for symmetric quantum state discrimination between r given states σ₁, …, σᵣ. By splitting up the overall test into multiple binary tests in various ways we obtain a number of upper bounds on the optimal error probability in terms of the binary error probabilities. These upper bounds allow us to deduce various bounds on the asymptotic error rate, for which it has been hypothesized that it is given by the multi-hypothesis quantum Chernoff bound (or Chernoff divergence) C(σ₁, …, σᵣ), as recently introduced by Nussbaum and Szkoła in analogy with Salikhov's classical multi-hypothesis Chernoff bound. This quantity is defined as the minimum of the pairwise binary Chernoff divergences minⱼ…
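As an illustrative numerical aside (not taken from the paper), the binary quantum Chernoff divergence C(ρ, σ) = -log min over 0<s<1 of Tr(ρ^s σ^(1-s)) can be approximated by a grid search for small density matrices, and a multi-hypothesis quantity then read off as the minimum over pairs; the toy qubit states below are assumptions for demonstration only.

```python
import numpy as np

def _mat_power(rho, s):
    """Fractional power of a positive semidefinite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)          # guard against tiny negative eigenvalues
    return (v * w**s) @ v.conj().T

def binary_chernoff_divergence(rho, sigma, grid=99):
    """Approximate -log min_{0<s<1} Tr(rho^s sigma^(1-s)) by a grid search over s."""
    ss = np.linspace(0.01, 0.99, grid)
    vals = [np.real(np.trace(_mat_power(rho, s) @ _mat_power(sigma, 1 - s))) for s in ss]
    return -np.log(min(vals))

def multi_chernoff_bound(states):
    """Minimum pairwise binary Chernoff divergence over all pairs of states."""
    pairs = [(i, j) for i in range(len(states)) for j in range(i + 1, len(states))]
    return min(binary_chernoff_divergence(states[i], states[j]) for i, j in pairs)

# toy qubit density matrices (hypothetical example data)
rho1 = np.array([[0.9, 0.0], [0.0, 0.1]])
rho2 = np.array([[0.6, 0.2], [0.2, 0.4]])
rho3 = np.eye(2) / 2
print(multi_chernoff_bound([rho1, rho2, rho3]))
```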
ERIC Educational Resources Information Center
Lin, Wei-Lun; Lien, Yunn-Wen
2013-01-01
This study examined how working memory plays different roles in open-ended versus closed-ended creative problem-solving processes, as represented by divergent thinking tests and insight problem-solving tasks. With respect to the analysis of different task demands and the framework of dual-process theories, the hypothesis was that the idea…
Evolution of Protein Lipograms: A Bioinformatics Problem
ERIC Educational Resources Information Center
White, Harold B., III; Dhurjati, Prasad
2006-01-01
A protein lacking one of the 20 common amino acids is a protein lipogram. This open-ended problem-based learning assignment deals with the evolution of proteins with biased amino acid composition. It has students query protein and metabolic databases to test the hypothesis that natural selection has reduced the frequency of each amino acid…
ERIC Educational Resources Information Center
Omemu, Felix
2017-01-01
This study examined the relationship between administrative strategy by principals and their effectiveness in tackling disciplinary problems. Three research questions were asked and one hypothesis was raised and tested at 0.05 level of significance. Ninety-five randomly selected principals from Bayelsa State constitute the sample. The instrument…
Lau, Anna S.; Chen, Angela Chia-Chen; Dinh, Khanh T.; Kim, Su Yeong
2009-01-01
Associations among neighborhood disadvantage, maternal acculturation, parenting and conduct problems were investigated in a sample of 444 Chinese American adolescents. Adolescents (54% female, 46% male) ranged from 12 to 15 years of age (mean age = 13.0 years). Multilevel modeling was employed to test the hypothesis that the association between maternal acculturation and adolescents’ conduct problems could be explained by differences in mothers’ reliance on monitoring and harsh discipline. In addition, guided by segmented assimilation theory, measures of neighborhood disadvantage were expected not only to be related to differences in parenting, but also to moderate the effects of maternal acculturation on parenting. Results indicated that increased maternal acculturation was related to higher levels of maternal monitoring and lower levels of harsh discipline, which, in turn, were related to lower levels of adolescents’ conduct problems. Hierarchical linear modeling results revealed that neighborhood disadvantage was related to lower levels of maternal monitoring. However, neighborhood disadvantage did not moderate the link between maternal acculturation and parenting practices. PMID:19636764
Human performance on the traveling salesman problem.
MacGregor, J N; Ormerod, T
1996-05-01
Two experiments on performance on the traveling salesman problem (TSP) are reported. The TSP consists of finding the shortest path through a set of points, returning to the origin. It appears to be an intransigent mathematical problem, and heuristics have been developed to find approximate solutions. The first experiment used 10-point, the second, 20-point problems. The experiments tested the hypothesis that complexity of TSPs is a function of number of nonboundary points, not total number of points. Both experiments supported the hypothesis. The experiments provided information on the quality of subjects' solutions. Their solutions clustered close to the best known solutions, were an order of magnitude better than solutions produced by three well-known heuristics, and on average fell beyond the 99.9th percentile in the distribution of random solutions. The solution process appeared to be perceptually based.
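As a side illustration of the kind of heuristic the abstract alludes to, the sketch below implements the simple nearest-neighbour construction heuristic on a hypothetical 10-point instance; it is not one of the specific heuristics used in the study.

```python
import math

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    n = len(order)
    return sum(math.dist(points[order[i]], points[order[(i + 1) % n]]) for i in range(n))

def nearest_neighbour_tour(points, start=0):
    """Greedy heuristic: repeatedly hop to the closest unvisited point, then close the tour."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(points[last], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# hypothetical 10-point instance
pts = [(0, 0), (2, 1), (5, 2), (6, 6), (3, 7), (1, 5), (8, 3), (7, 8), (4, 4), (2, 8)]
order = nearest_neighbour_tour(pts)
print(order, round(tour_length(pts, order), 2))
```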
Ricarte Trives, Jorge Javier; Navarro Bravo, Beatriz; Latorre Postigo, José Miguel; Ros Segura, Laura; Watkins, Ed
2016-07-18
Our study tested the hypothesis that older adults and men use more adaptive emotion regulatory strategies but fewer negative emotion regulatory strategies than younger adults and women. In addition, we tested the hypothesis that rumination acts as a mediator variable for the effect of age and gender on depression scores. Differences in rumination, problem solving, distraction, autobiographical recall and depression were assessed in a group of young adults (18-29 years) compared to a group of older adults (50-76 years). The older group used more problem solving and distraction strategies when in a depressed state than their younger counterparts (ps .06). Ordinary least squares regression analyses with bootstrapping showed that rumination mediated the association between age, gender and depression scores. These results suggest that older adults and men select more adaptive strategies to regulate emotions than young adults and women with rumination acting as a significant mediator variable in the association between age, gender, and depression.
NASA Astrophysics Data System (ADS)
Mushlihuddin, R.; Nurafifah; Irvan
2018-01-01
Students' low ability in mathematical problem solving points to a less effective learning process in the classroom. Effective learning is learning that improves students' mathematical skills, one of which is problem-solving ability. Problem-solving ability consists of several stages: understanding the problem, planning the solution, carrying out the plan, and re-examining the procedure and the outcome. The purpose of this research was to determine: (1) whether the PBL model has an influence in improving students' mathematical problem-solving ability in a vector analysis course; and (2) whether the PBL model is effective in improving students' mathematical problem-solving skills in a vector analysis course. This research was a quasi-experiment. Data analysis proceeded from descriptive statistics through a normality test as a prerequisite to hypothesis testing using the ANCOVA and gain tests. The results showed that: (1) the PBL model had an influence in improving students' mathematical problem-solving abilities in the vector analysis course; and (2) the PBL model was effective in improving students' problem-solving skills in the vector analysis course, with the gain in the medium category.
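The ANCOVA and gain tests mentioned above can be illustrated with a short sketch; the data, column names (pre, post, group) and effect sizes below are hypothetical assumptions, not the study's data, and statsmodels stands in for whatever software the authors used.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# hypothetical pretest/posttest scores (0-100) for a PBL group and a control group
rng = np.random.default_rng(0)
n = 30
df = pd.DataFrame({
    "group": ["PBL"] * n + ["control"] * n,
    "pre": np.r_[rng.normal(55, 10, n), rng.normal(55, 10, n)],
})
df["post"] = df["pre"] + np.where(df["group"] == "PBL", 15, 5) + rng.normal(0, 8, 2 * n)

# ANCOVA: posttest as outcome, group as factor, pretest as covariate
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(anova_lm(model, typ=2))

# normalized gain g = (post - pre) / (max_score - pre); 0.3-0.7 is often labelled "medium"
df["gain"] = (df["post"] - df["pre"]) / (100 - df["pre"])
print(df.groupby("group")["gain"].mean())
```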
[Reflections on the cause and consequences of exposure to violent presentations in children].
Rauchfleisch, U
1997-01-01
The article deals with the questions of why children and adolescents nowadays make such intensive use of violent videos and what the consequences of this use are. The motives for this use are compensation for and escape from daily problems and from inner psychic emptiness, identification with grandiose heroes, the experience of anxiety-joy, protest against the parents' generation, and tests of courage to obtain social status in the peer group. The consequences of violent presentations can be understood through different psychological concepts such as social cognitive learning theory, the suggestion hypothesis, the justification hypothesis, the habituation hypothesis, catharsis theory, the priming effect and arousal theory. These findings lead to the following consequences: media-critical instruction at school, governmental protection of children against harmful products, and solution of the underlying social problems which lead to resignation and hopelessness in children and adolescents.
Burrows, Catherine A.; Usher, Lauren V.; Schwartz, Caley B.; Mundy, Peter C.; Henderson, Heather A.
2015-01-01
This study tested the spectrum hypothesis, which posits that children and adolescents with high functioning autism (HFA) differ quantitatively but not qualitatively from typically developing peers on self-reported temperament. Temperament refers to early-appearing, relatively stable behavioral and emotional tendencies, which relate to maladaptive behaviors across clinical populations. Quantitatively, participants with HFA (N=104, aged 10–16) self-reported less Surgency and more Negative Affect but did not differ from comparison participants (N=94, aged 10–16) on Effortful Control or Affiliation. Qualitatively, groups demonstrated comparable reliability of self-reported temperament and associations between temperament and parent-reported behavior problems. These findings support the spectrum hypothesis, highlighting the utility of self-report temperament measures for understanding individual differences in comorbid behavior problems among children and adolescents with HFA. PMID:26589536
Patrick H. Brose; Daniel C. Dey; Ross J. Phillips; Thomas A. Waldrop
2013-01-01
The fire-oak hypothesis asserts that the current lack of fire is a reason behind the widespread oak (Quercus spp.) regeneration difficulties of eastern North America, and use of prescribed burning can help solve this problem. We performed a meta-analysis on the data from 32 prescribed fire studies conducted in mixed-oak forests to test whether they...
Experimental design, power and sample size for animal reproduction experiments.
Chapman, Phillip L; Seidel, George E
2008-01-01
The present paper concerns statistical issues in the design of animal reproduction experiments, with emphasis on the problems of sample size determination and power calculations. We include examples and non-technical discussions aimed at helping researchers avoid serious errors that may invalidate or seriously impair the validity of conclusions from experiments. Screen shots from interactive power calculation programs and basic SAS power calculation programs are presented to aid in understanding statistical power and computing power in some common experimental situations. Practical issues that are common to most statistical design problems are briefly discussed. These include one-sided hypothesis tests, power level criteria, equality of within-group variances, transformations of response variables to achieve variance equality, optimal specification of treatment group sizes, 'post hoc' power analysis and arguments for the increased use of confidence intervals in place of hypothesis tests.
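The kinds of power and sample-size calculations described can also be reproduced outside SAS; the following minimal sketch uses statsmodels, with an assumed effect size and error rates chosen purely for illustration.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# sample size per group for a two-sided, two-sample t-test
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                    ratio=1.0, alternative="two-sided")
print(f"n per group: {n_per_group:.1f}")

# power achieved with 20 animals per group and a one-sided hypothesis
power = analysis.solve_power(effect_size=0.5, nobs1=20, alpha=0.05,
                             ratio=1.0, alternative="larger")
print(f"power with n = 20 per group (one-sided): {power:.2f}")
```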
Is "g" an Entity? A Japanese Twin Study Using Syllogisms and Intelligence Tests
ERIC Educational Resources Information Center
Shikishima, Chizuru; Hiraishi, Kai; Yamagata, Shinji; Sugimoto, Yutaro; Takemura, Ryo; Ozaki, Koken; Okada, Mitsuhiro; Toda, Tatsushi; Ando, Juko
2009-01-01
Using a behavioral genetic approach, we examined the validity of the hypothesis concerning the singularity of human general intelligence, the "g" theory, by analyzing data from two tests: the first consisted of 100 syllogism problems and the second a full-scale intelligence test. The participants were 448 Japanese young adult twins (167…
Sample Size Calculation for Estimating or Testing a Nonzero Squared Multiple Correlation Coefficient
ERIC Educational Resources Information Center
Krishnamoorthy, K.; Xia, Yanping
2008-01-01
The problems of hypothesis testing and interval estimation of the squared multiple correlation coefficient of a multivariate normal distribution are considered. It is shown that available one-sided tests are uniformly most powerful, and the one-sided confidence intervals are uniformly most accurate. An exact method of calculating sample size to…
Science Fairs and Observational Science: A Case History from Earth Orbit
NASA Technical Reports Server (NTRS)
Lowman, Paul D., Jr.; Smith, David E. (Technical Monitor)
2002-01-01
Having judged dozens of science fairs over the years, I am repeatedly disturbed by the ground rules under which students must prepare their entries. They are almost invariably required to follow the "scientific method," involving formulating a hypothesis, a test of the hypothesis, and then a project in which this test is carried out. As a research scientist for over 40 years, I consider this approach to science fairs fundamentally unsound. It is not only too restrictive, but actually avoids the most important (and difficult) part of scientific research: recognizing a scientific problem in the first place. A well-known example is one of the problems that, by his own account, stimulated Einstein's theory of special relativity: the obvious fact that when an electric current is induced in a conductor by a magnetic field, it makes no difference whether the field or the conductor is actually (so to speak) moving. There is, in other words, no such thing as absolute motion. Physics was transformed by Einstein's recognition of a problem. Most competent scientists can solve problems after they have been recognized and a hypothesis properly formulated, but the ability to find problems in the first place is much rarer. Getting down to specifics, the "scientific method" under which almost all students must operate is actually the experimental method, involving controlled variables, one of which, ideally, is changed at a time. However, there is another type of science that can be called observational science. As it happens, almost all the space research I have carried out since 1959 has been this type, not experimental science.
Chi-Square Statistics, Tests of Hypothesis and Technology.
ERIC Educational Resources Information Center
Rochowicz, John A.
The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chisquare statistics and p-values for statistical…
Within-Intervention Change: Mediators of Intervention Effects during Multisystemic Therapy
ERIC Educational Resources Information Center
Dekovic, Maja; Asscher, Jessica J.; Manders, Willeke A.; Prins, Pier J. M.; van der Laan, Peter
2012-01-01
Objective: The present study tested the hypothesis that improvements in parental sense of competence during multisystemic therapy (MST) lead to positive changes in parenting, which in turn lead to a decrease of adolescent externalizing problems. Mediational models were tested separately for 3 dimensions of parenting (positive discipline, inept…
ERIC Educational Resources Information Center
Topolinski, Sascha; Reber, Rolf
2010-01-01
A temporal contiguity hypothesis for the experience of veracity is tested which states that a solution candidate to a cognitive problem is more likely to be experienced as correct the faster it succeeds the problem. Experiment 1 varied the onset time of the appearance of proposed solutions to anagrams (50 ms vs. 150 ms) and found for both correct…
Attentional and Executive Function Behaviours in Children with Poor Working Memory
ERIC Educational Resources Information Center
Gathercole, Susan E.; Alloway, Tracy P.; Kirkwood, Hannah J.; Elliott, Julian G.; Holmes, Joni; Hilton, Kerry A.
2008-01-01
The purpose of this study was to explore the profiles of classroom behaviour relating to attention and executive functions in children with very poor working memory, and to test the hypothesis that inattentive behaviour and working memory problems co-occur. Teachers rated problem behaviours of 52 children with low working memory scores aged 5/6…
ERIC Educational Resources Information Center
Spires, Hiller A.; Rowe, Jonathan P.; Mott, Bradford W.; Lester, James C.
2011-01-01
Targeted as a highly desired skill for contemporary work and life, problem solving is central to game-based learning research. In this study, middle grade students achieved significant learning gains from gameplay interactions that required solving a science mystery based on microbiology content. Student trace data results indicated that effective…
ERIC Educational Resources Information Center
Vasilyeva, Marina; Laski, Elida V.; Shen, Chen
2015-01-01
The present study tested the hypothesis that children's fluency with basic number facts and knowledge of computational strategies, derived from early arithmetic experience, predicts their performance on complex arithmetic problems. First-grade students from United States and Taiwan (N = 152, mean age: 7.3 years) were presented with problems that…
ERIC Educational Resources Information Center
Streibel, Michael; And Others
1987-01-01
Describes an advice-giving computer system being developed for genetics education called MENDEL that is based on research in learning, genetics problem solving, and expert systems. The value of MENDEL as a design tool and the tutorial function are stressed. Hypothesis testing, graphics, and experiential learning are also discussed. (Author/LRW)
Keers, Robert; Coleman, Jonathan R.I.; Lester, Kathryn J.; Roberts, Susanna; Breen, Gerome; Thastum, Mikael; Bögels, Susan; Schneider, Silvia; Heiervang, Einar; Meiser-Stedman, Richard; Nauta, Maaike; Creswell, Cathy; Thirlwall, Kerstin; Rapee, Ronald M.; Hudson, Jennifer L.; Lewis, Cathryn; Plomin, Robert; Eley, Thalia C.
2016-01-01
Background The differential susceptibility hypothesis suggests that certain genetic variants moderate the effects of both negative and positive environments on mental health and may therefore be important predictors of response to psychological treatments. Nevertheless, the identification of such variants has so far been limited to preselected candidate genes. In this study we extended the differential susceptibility hypothesis from a candidate gene to a genome-wide approach to test whether a polygenic score of environmental sensitivity predicted response to cognitive behavioural therapy (CBT) in children with anxiety disorders. Methods We identified variants associated with environmental sensitivity using a novel method in which within-pair variability in emotional problems in 1,026 monozygotic twin pairs was examined as a function of the pairs' genotype. We created a polygenic score of environmental sensitivity based on the whole-genome findings and tested the score as a moderator of parenting on emotional problems in 1,406 children and response to individual, group and brief parent-led CBT in 973 children with anxiety disorders. Results The polygenic score significantly moderated the effects of parenting on emotional problems and the effects of treatment. Individuals with a high score responded significantly better to individual CBT than group CBT or brief parent-led CBT (remission rates: 70.9, 55.5 and 41.6%, respectively). Conclusions Pending successful replication, our results should be considered exploratory. Nevertheless, if replicated, they suggest that individuals with the greatest environmental sensitivity may be more likely to develop emotional problems in adverse environments but also benefit more from the most intensive types of treatment. PMID:27043157
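The moderation analysis described (a polygenic score interacting with treatment type) can be sketched generically as a score-by-treatment interaction in a logistic regression; the variable names and simulated data below are hypothetical and do not reproduce the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical data: a standardized sensitivity polygenic score, a binary indicator for
# individual CBT (1) versus group/parent-led CBT (0), and remission status
rng = np.random.default_rng(8)
n = 900
df = pd.DataFrame({"score": rng.normal(0, 1, n), "individual_cbt": rng.integers(0, 2, n)})
logit_p = -0.2 + 0.3 * df["individual_cbt"] + 0.4 * df["score"] * df["individual_cbt"]
df["remission"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# moderation test: does the polygenic score interact with treatment type?
model = smf.logit("remission ~ score * individual_cbt", data=df).fit(disp=False)
print(model.summary().tables[1])
```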
Visual working memory and number sense: Testing the double deficit hypothesis in mathematics.
Toll, Sylke W M; Kroesbergen, Evelyn H; Van Luit, Johannes E H
2016-09-01
Evidence exists that there are two main underlying cognitive factors in mathematical difficulties: working memory and number sense. It is suggested that real math difficulties appear when both working memory and number sense are weak, here referred to as the double deficit (DD) hypothesis. The aim of this study was to test the DD hypothesis within a longitudinal time span of 2 years. A total of 670 children participated. The mean age was 4.96 years at the start of the study and 7.02 years at the end of the study. At the end of the first year of kindergarten, both visual-spatial working memory and number sense were measured by two different tasks. At the end of first grade, mathematical performance was measured with two tasks, one for math facts and one for math problems. Multiple regressions revealed that both visual working memory and symbolic number sense are predictors of mathematical performance in first grade. Symbolic number sense appears to be the strongest predictor for both math areas (math facts and math problems). Non-symbolic number sense only predicts performance in math problems. Multivariate analyses of variance showed that a combination of visual working memory and number sense deficits (NSDs) leads to the lowest performance on mathematics. Our DD hypothesis was confirmed. Both visual working memory and symbolic number sense in kindergarten are related to mathematical performance 2 years later, and a combination of visual working memory and NSDs leads to low mathematical performance. © 2016 The British Psychological Society.
A Case Study to Explore Rigorous Teaching and Testing Practices to Narrow the Achievement Gap
ERIC Educational Resources Information Center
Isler, Tesha
2012-01-01
The problem examined in this study: Does the majority of teachers use rigorous teaching and testing practices? The purpose of this qualitative exploratory case study was to explore the classroom techniques of six effective teachers who use rigorous teaching and testing practices. The hypothesis for this study is that the examination of the…
Cognitive differences between orang-utan species: a test of the cultural intelligence hypothesis
Forss, Sofia I. F.; Willems, Erik; Call, Josep; van Schaik, Carel P.
2016-01-01
Cultural species can - or even prefer to - learn their skills from conspecifics. According to the cultural intelligence hypothesis, selection on underlying mechanisms not only improves this social learning ability but also the asocial (individual) learning ability. Thus, species with systematically richer opportunities to socially acquire knowledge and skills should over time evolve to become more intelligent. We experimentally compared the problem-solving ability of Sumatran orang-utans (Pongo abelii), which are sociable in the wild, with that of the closely related, but more solitary Bornean orang-utans (P. pygmaeus), under the homogeneous environmental conditions provided by zoos. Our results revealed that Sumatrans showed superior innate problem-solving skills to Borneans, and also showed greater inhibition and a more cautious and less rough exploration style. This pattern is consistent with the cultural intelligence hypothesis, which predicts that the more sociable of two sister species experienced stronger selection on cognitive mechanisms underlying learning. PMID:27466052
Firing the Executive: When an Analytic Approach to Problem Solving Helps and Hurts
ERIC Educational Resources Information Center
Aiello, Daniel A.; Jarosz, Andrew F.; Cushen, Patrick J.; Wiley, Jennifer
2012-01-01
There is a general assumption that a more controlled or more focused attentional state is beneficial for most cognitive tasks. However, there has been a growing realization that creative problem solving tasks, such as the Remote Associates Task (RAT), may benefit from a less controlled solution approach. To test this hypothesis, in a 2x2 design,…
ERIC Educational Resources Information Center
Painter, Jon; Hastings, Richard; Ingham, Barry; Trevithick, Liam; Roy, Ashok
2018-01-01
Introduction: Current research findings in the field of intellectual disabilities (ID) regarding the relationship between mental health problems and challenging behavior are inconclusive and/or contradictory. The aim of this study was to further investigate the putative association between these two highly prevalent phenomena in people with ID,…
Spatial Abilities in an Elective Course of Applied Anatomy after a Problem-Based Learning Curriculum
ERIC Educational Resources Information Center
Langlois, Jean; Wells, George A.; Lecourtois, Marc; Bergeron, Germain; Yetisir, Elizabeth; Martin, Marcel
2009-01-01
A concern on the level of anatomy knowledge reached after a problem-based learning curriculum has been documented in the literature. Spatial anatomy, arguably the highest level in anatomy knowledge, has been related to spatial abilities. Our first objective was to test the hypothesis that residents are interested in a course of applied anatomy…
Using Shelterwood Harvests and Prescribed Fire to Regenerate Oak Stands on Productive Upland Sites
Patrick H. Brose; David H. van Lear; Roderick Cooper
1999-01-01
Regenerating oak stands on productive upland sites in the Piedmont region is a major problem because of intense competition from yellow-poplar. As a potential solution to this problem, we tested the hypothesis that a shelterwood harvest of an oak-dominated stand, followed several years later by a prescribed fire, would adequately regenerate the stand. Three oak-...
ERIC Educational Resources Information Center
Lorber, Michael F.; Egeland, Byron
2011-01-01
The prediction of conduct problems (CPs) from infant difficulty and parenting measured in the first 6 months of life was studied in a sample of 267 high-risk mother-child dyads. Stable, cross-situational CPs at school entry (5-6 years) were predicted by negative infancy parenting, mediated by mutually angry and hostile mother-toddler interactions…
ERIC Educational Resources Information Center
Bulotsky-Shearer, Rebecca J.; Bell, Elizabeth R.; Romero, Sandy L.; Carter, Tracy M.
2014-01-01
Given theoretical and empirical support for the importance of peer play within the preschool classroom to early learning, the present study tested the hypothesis that associations between teacher-reported problem behavior and academic skills were mediated by difficulties in peer play (disruptive and disconnected play), for a representative sample…
ERIC Educational Resources Information Center
Skalická, Vera; Belsky, Jay; Stenseng, Frode; Wichstrøm, Lars
2015-01-01
The hypothesis was tested that the new open-group Norwegian day-care centers would more than traditionally organized centers negatively affect (a) current and (b) future teacher-child relationships, and (c) the developmental legacy of preschool problem behavior. The focus was on eight hundred and fifty 4-year-olds from 153 centers who were…
Crooks, Noelle M.; Alibali, Martha W.
2013-01-01
This study investigated whether activating elements of prior knowledge can influence how problem solvers encode and solve simple mathematical equivalence problems (e.g., 3 + 4 + 5 = 3 + __). Past work has shown that such problems are difficult for elementary school students (McNeil and Alibali, 2000). One possible reason is that children's experiences in math classes may encourage them to think about equations in ways that are ultimately detrimental. Specifically, children learn a set of patterns that are potentially problematic (McNeil and Alibali, 2005a): the perceptual pattern that all equations follow an “operations = answer” format, the conceptual pattern that the equal sign means “calculate the total”, and the procedural pattern that the correct way to solve an equation is to perform all of the given operations on all of the given numbers. Upon viewing an equivalence problem, knowledge of these patterns may be reactivated, leading to incorrect problem solving. We hypothesized that these patterns may negatively affect problem solving by influencing what people encode about a problem. To test this hypothesis in children would require strengthening their misconceptions, and this could be detrimental to their mathematical development. Therefore, we tested this hypothesis in undergraduate participants. Participants completed either control tasks or tasks that activated their knowledge of the three patterns, and were then asked to reconstruct and solve a set of equivalence problems. Participants in the knowledge activation condition encoded the problems less well than control participants. They also made more errors in solving the problems, and their errors resembled the errors children make when solving equivalence problems. Moreover, encoding performance mediated the effect of knowledge activation on equivalence problem solving. Thus, one way in which experience may affect equivalence problem solving is by influencing what students encode about the equations. PMID:24324454
Mason, K O; Palan, V T
1981-11-01
Multivariate analysis of the 1974 Malaysian Fertility and Family Survey tests the hypothesis that an inverse relationship between women's work and fertility occurs only when there are serious conflicts between working and caring for children. The results are only partly consistent with the hypothesis and suggest that normative conflicts between working and mothering affect the employment-fertility relationship in Malaysia more than spatio-temporal conflicts do. The lack of consistent evidence for the hypothesis, as well as some conceptual problems, leads us to propose an alternative framework for understanding variation in the employment-fertility relationship, both in Malaysia and elsewhere. This framework incorporates ideas from the role incompatibility hypothesis but views the employment-fertility relationship as dependent not just on role conflicts but more generally on the structure of the household's socioeconomic opportunities.
Decentralized Hypothesis Testing in Energy Harvesting Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Tarighati, Alla; Gross, James; Jalden, Joakim
2017-09-01
We consider the problem of decentralized hypothesis testing in a network of energy harvesting sensors, where sensors make noisy observations of a phenomenon and send quantized information about the phenomenon towards a fusion center. The fusion center makes a decision about the present hypothesis using the aggregate received data during a time interval. We explicitly consider a scenario under which the messages are sent through parallel access channels towards the fusion center. To avoid limited lifetime issues, we assume each sensor is capable of harvesting all the energy it needs for the communication from the environment. Each sensor has an energy buffer (battery) to save its harvested energy for use in other time intervals. Our key contribution is to formulate the problem of decentralized detection in a sensor network with energy harvesting devices. Our analysis is based on a queuing-theoretic model for the battery and we propose a sensor decision design method by considering long term energy management at the sensors. We show how the performance of the system changes for different battery capacities. We then numerically show how our findings can be used in the design of sensor networks with energy harvesting sensors.
Using a personal digital assistant to document clinical pharmacy services in an intensive care unit.
Lau, A; Balen, R M; Lam, R; Malyuk, D L
2001-07-01
Management Case Studies describe approaches to real-life management problems in health systems. Each installment is a brief description of a problem and how it was dealt with. The cases are intended to help readers deal with similar experiences in their own work sites. Problem solving, not hypothesis testing, is emphasized. Successful resolution of the management issue is not a criterion for publication--important lessons can be learned from failures, too.
Biostatistics Series Module 2: Overview of Hypothesis Testing.
Hazra, Avijit; Gogtay, Nithya
2016-01-01
Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. While this may be of utility in highlighting different aspects of the problem, merely reapplying different tests to the same issue in the hope of finding a P < 0.05 is a wrong use of statistics. Finally, it is becoming the norm that an estimate of the size of any effect, expressed with its 95% confidence interval, is required for meaningful interpretation of results. A large study is likely to have a small (and therefore "statistically significant") P value, but a "real" estimate of the effect would be provided by the 95% confidence interval. If the intervals overlap between two interventions, then the difference between them is not so clear-cut even if P < 0.05. The two approaches are now considered complementary to one another.
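A minimal sketch of the complementary P value and confidence-interval approaches described above, for a two-group comparison with simulated (hypothetical) data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
treated = rng.normal(5.4, 1.2, 40)   # hypothetical outcome values, treated group
control = rng.normal(4.8, 1.2, 40)   # hypothetical outcome values, control group

# hypothesis test: P value for the null hypothesis of equal means
t, p = stats.ttest_ind(treated, control)

# effect estimate with a 95% confidence interval (pooled-variance standard error)
diff = treated.mean() - control.mean()
dof = len(treated) + len(control) - 2
sp2 = ((len(treated) - 1) * treated.var(ddof=1) +
       (len(control) - 1) * control.var(ddof=1)) / dof
se = np.sqrt(sp2 * (1 / len(treated) + 1 / len(control)))
half_width = stats.t.ppf(0.975, dof) * se
print(f"t = {t:.2f}, P = {p:.3f}, difference = {diff:.2f}, "
      f"95% CI ({diff - half_width:.2f}, {diff + half_width:.2f})")
```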
Exploring High School Students Beginning Reasoning about Significance Tests with Technology
ERIC Educational Resources Information Center
García, Víctor N.; Sánchez, Ernesto
2017-01-01
In the present study we analyze how students reason about or make inferences given a particular hypothesis testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most students' reasoning relies on…
A New Test of Linear Hypotheses in OLS Regression under Heteroscedasticity of Unknown Form
ERIC Educational Resources Information Center
Cai, Li; Hayes, Andrew F.
2008-01-01
When the errors in an ordinary least squares (OLS) regression model are heteroscedastic, hypothesis tests involving the regression coefficients can have Type I error rates that are far from the nominal significance level. Asymptotically, this problem can be rectified with the use of a heteroscedasticity-consistent covariance matrix (HCCM)…
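Although the record above is truncated, the general remedy it refers to, heteroscedasticity-consistent covariance matrices, is available in standard software; a minimal sketch using statsmodels and simulated heteroscedastic data (the HC3 estimator is chosen here only as an example):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(1, 10, n)
y = 1.0 + 0.5 * x + rng.normal(0, 0.3 * x, n)   # error variance grows with x

X = sm.add_constant(x)
ols_fit = sm.OLS(y, X).fit()                 # classical (homoscedastic) standard errors
hc3_fit = sm.OLS(y, X).fit(cov_type="HC3")   # heteroscedasticity-consistent covariance

print("classical SEs:", ols_fit.bse)
print("HC3 robust SEs:", hc3_fit.bse)
print("robust p-values:", hc3_fit.pvalues)   # hypothesis tests on the coefficients
```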
Concerns regarding a call for pluralism of information theory and hypothesis testing
Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.
2007-01-01
1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
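A minimal sketch of the I-T bookkeeping referred to above: given AIC values for a set of candidate models, the ΔAIC values and Akaike weights give a direct measure of relative evidence. The models, variable names and data below are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 120
df = pd.DataFrame({"rainfall": rng.uniform(200, 800, n), "grazing": rng.integers(0, 2, n)})
df["abundance"] = 2 + 0.01 * df["rainfall"] - 0.8 * df["grazing"] + rng.normal(0, 1, n)

# candidate models, each encoding a different science hypothesis
formulas = {"rainfall": "abundance ~ rainfall",
            "grazing": "abundance ~ grazing",
            "rainfall + grazing": "abundance ~ rainfall + grazing"}
aic = {name: smf.ols(f, data=df).fit().aic for name, f in formulas.items()}

# delta-AIC and Akaike weights: relative evidence for each model given the data
values = np.array(list(aic.values()))
delta = values - values.min()
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()
for name, d, w in zip(aic, delta, weights):
    print(f"{name:20s} dAIC = {d:6.2f}  weight = {w:.3f}")
```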
NASA Technical Reports Server (NTRS)
Willsky, A. S.; Deyst, J. J.; Crawford, B. S.
1975-01-01
The paper describes two self-test procedures applied to the problem of estimating the biases in accelerometers and gyroscopes on an inertial platform. The first technique is the weighted sum-squared residual (WSSR) test, with which accelerometer bias jumps are easily isolated, but gyro bias jumps are difficult to isolate. The WSSR method does not take full advantage of the knowledge of system dynamics. The other technique is a multiple hypothesis method developed by Buxbaum and Haddad (1969). It has the advantage of directly providing jump isolation information, but suffers from computational problems. It might be possible to use the WSSR to detect state jumps and then switch to the Buxbaum-Haddad method for jump isolation and estimate compensation.
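A generic sketch of the WSSR idea, not the paper's implementation: normalized squared innovations from an existing filter are summed over a sliding window and compared with a chi-square threshold; the innovation sequence below is simulated.

```python
import numpy as np
from scipy.stats import chi2

def wssr_alarm(innovations, covariances, window=10, alpha=0.01):
    """Weighted sum-squared residual test over a sliding window.

    innovations : (T, m) array of filter residuals r_k
    covariances : (T, m, m) array of innovation covariances S_k
    Returns a boolean array: True where the windowed WSSR exceeds the
    chi-square threshold with window*m degrees of freedom.
    """
    T, m = innovations.shape
    quad = np.array([r @ np.linalg.solve(S, r) for r, S in zip(innovations, covariances)])
    wssr = np.array([quad[max(0, k - window + 1): k + 1].sum() for k in range(T)])
    threshold = chi2.ppf(1 - alpha, df=window * m)
    return wssr > threshold

# hypothetical 1-D innovation sequence with a bias jump halfway through
rng = np.random.default_rng(4)
r = rng.normal(0, 1, (200, 1))
r[100:] += 1.5                        # simulated bias jump
S = np.tile(np.eye(1), (200, 1, 1))
print(np.nonzero(wssr_alarm(r, S))[0][:5])   # first alarm indices
```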
Youth Education and Unemployment Problems. An International Perspective.
ERIC Educational Resources Information Center
Gordon, Margaret S.; Trow, Martin
Essays focusing on issues concerning youth education and unemployment problems are presented in this document. It is divided into three general areas. The first, Youth Unemployment in Western Industrial Countries, reviews general dimensions of the problem, the cyclical hypothesis, the demand hypothesis, the supply hypothesis, disaggregating…
Frömke, Cornelia; Hothorn, Ludwig A; Kropf, Siegfried
2008-01-27
In many research areas it is necessary to find differences between treatment groups with several variables. For example, studies of microarray data seek to find a significant difference of location parameters from zero (or from one, in the case of ratios) for each variable. However, in some studies a significant deviation of the difference in locations from zero (or 1 in terms of the ratio) is biologically meaningless. A relevant difference or ratio is sought in such cases. This article addresses the use of relevance-shifted tests on ratios for a multivariate parallel two-sample group design. Two empirical procedures are proposed which embed the relevance-shifted test on ratios. As both procedures test a hypothesis for each variable, the resulting multiple testing problem has to be considered. Hence, the procedures include a multiplicity correction. Both procedures are extensions of available procedures for point null hypotheses achieving exact control of the familywise error rate. Whereas the shift of the null hypothesis alone would give straightforward solutions, the problems that motivate the empirical considerations discussed here arise from the fact that the shift is considered in both directions and the whole parameter space in between these two limits has to be accepted as the null hypothesis. The first algorithm to be discussed uses a permutation algorithm, and is appropriate for designs with a moderately large number of observations. However, many experiments have limited sample sizes. Then the second procedure might be more appropriate, where multiplicity is corrected according to a concept of data-driven order of hypotheses.
Suner, Aslı; Karakülah, Gökhan; Dicle, Oğuz
2014-01-01
Statistical hypothesis testing is an essential component of biological and medical studies for making inferences and estimations from the collected data in the study; however, the misuse of statistical tests is widespread. In order to prevent possible errors in statistical test selection, it is currently possible to consult available test selection algorithms developed for various purposes. However, the lack of an algorithm presenting the most common statistical tests used in biomedical research in a single flowchart causes several problems, such as users shifting among the algorithms, poor decision support in test selection, and dissatisfaction among potential users. Herein, we present a unified flowchart covering the statistical tests most commonly used in the biomedical domain, to provide decision support to non-statistician users in choosing the appropriate statistical test for their hypothesis. We also discuss some of the findings from integrating the existing flowcharts into a single, more comprehensive decision algorithm.
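A toy sketch of how such a flowchart might be encoded programmatically; the branching below covers only a few common cases and is an assumption for illustration, not the authors' algorithm.

```python
def suggest_test(outcome, groups, paired=False, parametric=True):
    """Very small decision rule for comparing two or more groups.

    outcome    : "numerical" or "categorical"
    groups     : number of groups being compared
    paired     : whether observations are paired/repeated
    parametric : whether normality/variance assumptions are considered tenable
    """
    if outcome == "categorical":
        return "McNemar test" if paired else "Chi-square / Fisher exact test"
    if groups == 2:
        if paired:
            return "Paired t-test" if parametric else "Wilcoxon signed-rank test"
        return "Two-sample t-test" if parametric else "Mann-Whitney U test"
    if paired:
        return "Repeated-measures ANOVA" if parametric else "Friedman test"
    return "One-way ANOVA" if parametric else "Kruskal-Wallis test"

print(suggest_test("numerical", groups=2, paired=False, parametric=False))  # Mann-Whitney U test
```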
Causal Learning with Local Computations
ERIC Educational Resources Information Center
Fernbach, Philip M.; Sloman, Steven A.
2009-01-01
The authors proposed and tested a psychological theory of causal structure learning based on local computations. Local computations simplify complex learning problems via cues available on individual trials to update a single causal structure hypothesis. Structural inferences from local computations make minimal demands on memory, require…
Monocular precrash vehicle detection: features and classifiers.
Sun, Zehang; Bebis, George; Miller, Ronald
2006-07-01
Robust and reliable vehicle detection from images acquired by a moving vehicle (i.e., on-road vehicle detection) is an important problem with applications to driver assistance systems and autonomous, self-guided vehicles. The focus of this work is on the issues of feature extraction and classification for rear-view vehicle detection. Specifically, by treating the problem of vehicle detection as a two-class classification problem, we have investigated several different feature extraction methods such as principal component analysis, wavelets, and Gabor filters. To evaluate the extracted features, we have experimented with two popular classifiers, neural networks and support vector machines (SVMs). Based on our evaluation results, we have developed an on-board real-time monocular vehicle detection system that is capable of acquiring grey-scale images, using Ford's proprietary low-light camera, achieving an average detection rate of 10 Hz. Our vehicle detection algorithm consists of two main steps: a multiscale driven hypothesis generation step and an appearance-based hypothesis verification step. During the hypothesis generation step, image locations where vehicles might be present are extracted. This step uses multiscale techniques not only to speed up detection, but also to improve system robustness. The appearance-based hypothesis verification step verifies the hypotheses using Gabor features and SVMs. The system has been tested in Ford's concept vehicle under different traffic conditions (e.g., structured highway, complex urban streets, and varying weather conditions), illustrating good performance.
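A minimal sketch of the appearance-based verification stage treated as two-class classification; synthetic patches and a PCA-plus-SVM pipeline stand in for the Gabor features and proprietary camera data of the real system.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# synthetic stand-in data: 32x32 grey-scale patches, label 1 = "vehicle", 0 = "non-vehicle"
rng = np.random.default_rng(5)
X = rng.normal(0, 1, (400, 32 * 32))
y = rng.integers(0, 2, 400)
X[y == 1] += 0.5                      # crude class separation so the demo learns something

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# hypothesis verification stage as a two-class classifier: PCA features + RBF SVM
clf = make_pipeline(StandardScaler(), PCA(n_components=50), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```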
Do Hyperactive Children Have Manifestations of Hyperactivity in Their Eye Movements?
ERIC Educational Resources Information Center
Cohen, Bernard; And Others
A study involving 18 hyperkinetic children (from 3- to 12-years old) was conducted to test the hypothesis that hyperactive children manifest the same type of hypermotility in their eyes as in the rest of their body. Ss were observed under a series of test conditions (including manual problem solving) which elicit short and long periods of…
Martín H., José Antonio
2013-01-01
Many practical problems in almost all scientific and technological disciplines have been classified as computationally hard (NP-hard or even NP-complete). In life sciences, combinatorial optimization problems frequently arise in molecular biology, e.g., genome sequencing; global alignment of multiple genomes; identifying siblings or discovery of dysregulated pathways. In almost all of these problems, there is the need for proving a hypothesis about a certain property of an object that can be present if and only if it adopts some particular admissible structure (an NP-certificate) or be absent (no admissible structure); however, none of the standard approaches can discard the hypothesis when no solution can be found, since none can provide a proof that there is no admissible structure. This article presents an algorithm that introduces a novel type of solution method to "efficiently" solve the graph 3-coloring problem, an NP-complete problem. The proposed method provides certificates (proofs) in both cases, present or absent, so it is possible to accept or reject the hypothesis on the basis of a rigorous proof. It provides exact solutions and is polynomial-time (i.e., efficient), albeit parametric. The only requirement is sufficient computational power, which is controlled by a parameter. Nevertheless, it is proved that the probability of requiring a given value of this parameter to obtain a solution for a random graph decreases exponentially, making almost all problem instances tractable. Thorough experimental analyses were performed. The algorithm was tested on random graphs, planar graphs and 4-regular planar graphs. The obtained experimental results are in accordance with the theoretical expected results. PMID:23349711
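The article's parametric algorithm is not reproduced here; as a baseline illustration of what a certificate in both cases means for 3-coloring, the exhaustive backtracking search below returns either a valid coloring (a certificate of presence) or None after exploring the whole search space (a certificate of absence), at exponential worst-case cost.

```python
def three_coloring(n, edges):
    """Backtracking search for a proper 3-coloring of an undirected graph.

    n     : number of vertices, labelled 0..n-1
    edges : iterable of (u, v) pairs
    Returns a list of colors (0, 1, 2) per vertex, or None if no 3-coloring exists.
    """
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    colors = [None] * n

    def extend(v):
        if v == n:
            return True
        for c in range(3):
            if all(colors[w] != c for w in adj[v]):
                colors[v] = c
                if extend(v + 1):
                    return True
                colors[v] = None
        return False

    return colors if extend(0) else None

# K4 is not 3-colorable; a 5-cycle is
print(three_coloring(4, [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]))  # None
print(three_coloring(5, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]))          # e.g. [0, 1, 0, 1, 2]
```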
UK Alcohol Treatment Trial: client-treatment matching effects.
2008-02-01
To test a priori hypotheses concerning client-treatment matching in the treatment of alcohol problems and to evaluate the more general hypothesis that client-treatment matching adds to the overall effectiveness of treatment. Pragmatic, multi-centre, randomized controlled trial (the UK Alcohol Treatment Trial: UKATT) with open follow-up at 3 months after entry and blind follow-up at 12 months. Five treatment centres, comprising seven treatment sites, including National Health Service (NHS), social services and joint NHS/non-statutory facilities. Motivational enhancement therapy and social behaviour and network therapy. Matching hypotheses were tested by examining interactions between client attributes and treatment types at both 3 and 12 months follow-up using the outcome variables of percentage days abstinent, drinks per drinking day and scores on the Alcohol Problems Questionnaire and Leeds Dependence Questionnaire. None of five matching hypotheses was confirmed at either follow-up point on any outcome variable. The findings strongly support the conclusion reached in Project MATCH in the United States that client-treatment matching, at least of the kind examined, is unlikely to result in substantial improvements to the effectiveness of treatment for alcohol problems. Possible reasons for this failure to support the general matching hypothesis are discussed, as are the implications of UKATT findings for the provision of treatment for alcohol problems in the United Kingdom.
Genome-wide detection of intervals of genetic heterogeneity associated with complex traits
Llinares-López, Felipe; Grimm, Dominik G.; Bodenham, Dean A.; Gieraths, Udo; Sugiyama, Mahito; Rowan, Beth; Borgwardt, Karsten
2015-01-01
Motivation: Genetic heterogeneity, the fact that several sequence variants give rise to the same phenotype, is a phenomenon that is of the utmost interest in the analysis of complex phenotypes. Current approaches for finding regions in the genome that exhibit genetic heterogeneity suffer from at least one of two shortcomings: (i) they require the definition of an exact interval in the genome that is to be tested for genetic heterogeneity, potentially missing intervals of high relevance, or (ii) they suffer from an enormous multiple hypothesis testing problem due to the large number of potential candidate intervals being tested, which results in either many false positives or a lack of power to detect true intervals. Results: Here, we present an approach that overcomes both problems: it allows one to automatically find all contiguous sequences of single nucleotide polymorphisms in the genome that are jointly associated with the phenotype. It also solves both the inherent computational efficiency problem and the statistical problem of multiple hypothesis testing, which are both caused by the huge number of candidate intervals. We demonstrate on Arabidopsis thaliana genome-wide association study data that our approach can discover regions that exhibit genetic heterogeneity and would be missed by single-locus mapping. Conclusions: Our novel approach can contribute to the genome-wide discovery of intervals that are involved in the genetic heterogeneity underlying complex phenotypes. Availability and implementation: The code can be obtained at: http://www.bsse.ethz.ch/mlcb/research/bioinformatics-and-computational-biology/sis.html. Contact: felipe.llinares@bsse.ethz.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26072488
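A naive baseline, not the authors' method (which handles the multiple-testing burden far more efficiently): test every contiguous SNP interval up to a chosen length for association with a binary phenotype and Bonferroni-correct over all intervals tested. The data and the carrier encoding below are hypothetical.

```python
import numpy as np
from scipy.stats import fisher_exact

def interval_association_scan(genotypes, phenotype, max_len=5, alpha=0.05):
    """Brute-force scan of contiguous SNP intervals for joint association.

    genotypes : (n_samples, n_snps) 0/1 matrix of minor-allele carrier status
    phenotype : (n_samples,) binary vector
    An interval is encoded as "carries a minor allele anywhere in the interval".
    """
    n, p = genotypes.shape
    intervals = [(s, e) for s in range(p) for e in range(s + 1, min(s + max_len, p) + 1)]
    threshold = alpha / len(intervals)          # Bonferroni over all candidate intervals
    hits = []
    for s, e in intervals:
        carrier = genotypes[:, s:e].any(axis=1).astype(int)
        table = [[np.sum((carrier == 1) & (phenotype == 1)), np.sum((carrier == 1) & (phenotype == 0))],
                 [np.sum((carrier == 0) & (phenotype == 1)), np.sum((carrier == 0) & (phenotype == 0))]]
        _, pval = fisher_exact(table)
        if pval < threshold:
            hits.append(((s, e), pval))
    return hits

# toy data: SNPs 10-12 jointly associated with the phenotype
rng = np.random.default_rng(6)
G = rng.integers(0, 2, (300, 40))
y = (G[:, 10:13].any(axis=1) & (rng.random(300) < 0.7)).astype(int)
print(interval_association_scan(G, y))
```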
Diagnosing and alleviating the impact of performance pressure on mathematical problem solving.
DeCaro, Marci S; Rotar, Kristin E; Kendra, Matthew S; Beilock, Sian L
2010-08-01
High-pressure academic testing situations can lead people to perform below their actual ability levels by co-opting working memory (WM) resources needed for the task at hand (Beilock, 2008). In the current work we examine how performance pressure impacts WM and design an intervention to alleviate pressure's negative impact. Specifically, we explore the hypothesis that high-pressure situations trigger distracting thoughts and worries that rely heavily on verbal WM. Individuals performed verbally based and spatially based mathematics problems in a low-pressure or high-pressure testing situation. Results demonstrated that performance on problems that rely heavily on verbal WM resources was less accurate under high-pressure than under low-pressure tests. Performance on spatially based problems that do not rely heavily on verbal WM was not affected by pressure. Moreover, the more people reported worrying during test performance, the worse they performed on the verbally based (but not spatially based) maths problems. Asking some individuals to focus on the problem steps by talking aloud helped to keep pressure-induced worries at bay and eliminated pressure's negative impact on performance.
NASA Astrophysics Data System (ADS)
Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati
2017-09-01
One of the important skills required of any student learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This enables them to arrive at a conclusion and make meaningful contributions and decisions for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems on hypothesis testing that required them to solve the problems using the confidence interval, traditional and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems, and it is listed as one of the concepts students find difficult to grasp. The objectives of this study were to explore students' perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students' perceived and actual ability were measured using instruments developed from the respective topics. Rasch measurement tools such as the Wright map and item-fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches, even though their actual performance showed otherwise. Item-fit statistics also showed that the maximum estimated measures were found on two problems, indicating that none of the students attempted these problems correctly, for reasons that include their lack of understanding of confidence intervals and probability values.
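As a concrete companion to the three solution approaches mentioned above, the sketch below works a one-sample t test via the traditional critical-value route, the p-value route, and the confidence-interval route; the sample values and hypothesised mean are made up, and the three decision rules necessarily agree.

```python
"""One-sample t test worked three ways (traditional critical value, p-value,
and confidence interval), mirroring the three approaches in the abstract.
The data and the null value mu0 are hypothetical."""
import numpy as np
from scipy import stats

x = np.array([50.1, 48.7, 51.2, 49.5, 52.3, 50.8, 47.9, 51.6])  # hypothetical sample
mu0, alpha = 49.0, 0.05
n, xbar, s = len(x), x.mean(), x.std(ddof=1)
t_stat = (xbar - mu0) / (s / np.sqrt(n))
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)

# Traditional approach: reject H0 if |t| exceeds the critical value.
reject_traditional = abs(t_stat) > t_crit
# P-value approach: reject H0 if the two-sided p-value falls below alpha.
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 1)
reject_pvalue = p_value < alpha
# Confidence-interval approach: reject H0 if mu0 lies outside the 95% CI.
half_width = t_crit * s / np.sqrt(n)
ci = (xbar - half_width, xbar + half_width)
reject_ci = not (ci[0] <= mu0 <= ci[1])

print(t_stat, p_value, ci, reject_traditional, reject_pvalue, reject_ci)  # decisions agree
```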
Transform-Based Wideband Array Processing
1992-01-31
[Fragmentary report excerpt; only partially recoverable.] Using the test of Breusch and Pagan [2], it is possible to test which model, AR or random coefficient, will better fit typical array data. The test indicates that correlations do not obey an AR relationship across the array. Through the use of a binary hypothesis test, it is ... [Cited references include: "... bearing estimation problems," Proc. IEEE, vol. 70, no. 9, pp. 1018-1028, 1982; T. S. Breusch and A. R. Pagan, "A simple test for het..."]
The Biochemistry of Memory: The Twenty-Six Year Journey of a ‘New and Specific Hypothesis’
Baudry, Michel; Bi, Xiaoning; Gall, Christine; Lynch, Gary
2010-01-01
This Special Issue of Neurobiology of Learning and Memory dedicated to Dr. Richard Thompson to celebrate his 80th birthday and his numerous contributions to the field of learning and memory gave us the opportunity to revisit the hypothesis we proposed more than 25 years ago regarding the biochemistry of learning and memory. This review summarizes our early 1980s hypothesis and then describes how it was tested and modified over the years following its introduction. We then discuss the current status of the hypothesis and provide some examples of how it has led to unexpected insights into the memory problems that accompany a broad range of neuropsychiatric disorders. PMID:21134478
Hypothesis driven assessment of an NMR curriculum
NASA Astrophysics Data System (ADS)
Cossey, Kimberly
The goal of this project was to develop a battery of assessments to evaluate an undergraduate NMR curriculum at Penn State University. As a chemical education project, we sought to approach the problem of curriculum assessment from a scientific perspective, while remaining grounded in the education research literature and practices. We chose the phrase hypothesis driven assessment to convey this process of relating the scientific method to the study of educational methods, modules, and curricula. We began from a hypothesis, that deeper understanding of one particular analytical technique (NMR) will increase undergraduate students' abilities to solve chemical problems. We designed an experiment to investigate this hypothesis, and data collected were analyzed and interpreted in light of the hypothesis and several related research questions. The expansion of the NMR curriculum at Penn State was funded through the NSF's Course, Curriculum, and Laboratory Improvement (CCLI) program, and assessment was required. The goal of this project, as stated in the grant proposal, was to provide NMR content in greater depth by integrating NMR modules throughout the curriculum in physical chemistry, instrumental, and organic chemistry laboratory courses. Hands-on contact with the NMR spectrometer and NMR data and repeated exposure of the analytical technique within different contexts (courses) were unique factors of this curriculum. Therefore, we maintained a focus on these aspects throughout the evaluation process. The most challenging and time-consuming aspect of any assessment is the development of testing instruments and methods to provide useful data. After key variables were defined, testing instruments were designed to measure these variables based on educational literature (Chapter 2). The primary variables measured in this assessment were: depth of understanding of NMR, basic NMR knowledge, problem solving skills (HETCOR problem), confidence for skills used in class (within the hands-on NMR modules), confidence for NMR tasks (not practiced), and confidence for general science tasks. Detailed discussion of the instruments, testing methods and experimental design used in this assessment are provided (Chapter 3). All data were analyzed quantitatively using methods adapted from the educational literature (Chapter 4). Data were analyzed and the descriptive statistics, independent t-tests between the experimental and control groups, and correlation statistics were calculated for each variable. In addition, for those variables included on the pretest, dependent t-tests between pretest and posttest scores were also calculated. The results of study 1 and study 2 were used to draw conclusions based on the hypothesis and research questions proposed in this work (Chapter 4). Data collected in this assessment were used to answer the following research questions: (1) Primary research question: Is depth of understanding of NMR linked to problem solving skills? (2) Are the NMR modules working as intended? Do they promote depth of understanding of NMR? (a) Will students who complete NMR modules have a greater depth of understanding of NMR than students who do not complete the modules? (b) Is depth of understanding increasing over the course of the experiment? (3) Is confidence an intermediary between depth of understanding and problem solving skills? Is it linked to both variables? (4) What levels of confidence are affected by the NMR modules? 
(a) Will confidence for the NMR class skills used in the modules themselves be greater for those who have completed the modules? (b) Will confidence for NMR tasks not practiced in the course be affected? (c) Will confidence for general science tasks be affected? (d) Are different levels of confidence (class skills, NMR tasks, general science tasks) linked to each other? Results from this NMR curriculum assessment could also have implications outside of the courses studied, and so there is potential to impact the chemical education community (section 5.2.1). In addition to providing reliable testing instruments/measures that could be used outside the university, the results of this research contribute to the study of problem solving in chemistry, learner characteristics within the context of chemical education studies, and NMR specific educational evaluations. Valuable information was gathered through the current method of evaluation for the NMR curriculum. However, improvements could be made to the existing assessment, and an alternate assessment that could supplement the information found in this study has been proposed (Chapter 5).
Assessing Threat Detection Scenarios through Hypothesis Generation and Testing
2015-12-01
[Fragmentary report excerpt; recoverable front matter lists figures such as "Figure 1. Rankings of priority threats identified in the Dog Day scenario" and "Figure 2. Rankings of priority..."] ...making in uncertain environments relies heavily on pattern matching. Cohen, Freeman, and Wolf (1996) reported that features of the decision problem
Cumulative-Genetic Plasticity, Parenting and Adolescent Self-Regulation
ERIC Educational Resources Information Center
Belsky, Jay; Beaver, Kevin M.
2011-01-01
Background: The capacity to control or regulate one's emotions, cognitions and behavior is central to competent functioning, with limitations in these abilities associated with developmental problems. Parenting appears to influence such self-regulation. Here the differential-susceptibility hypothesis is tested that the more putative "plasticity…
Invited Commentary: Can Issues With Reproducibility in Science Be Blamed on Hypothesis Testing?
Weinberg, Clarice R.
2017-01-01
In the accompanying article (Am J Epidemiol. 2017;186(6):646–647), Dr. Timothy Lash makes a forceful case that the problems with reproducibility in science stem from our “culture” of null hypothesis significance testing. He notes that when attention is selectively given to statistically significant findings, the estimated effects will be systematically biased away from the null. Here I revisit the recent history of genetic epidemiology and argue for retaining statistical testing as an important part of the tool kit. Particularly when many factors are considered in an agnostic way, in what Lash calls “innovative” research, investigators need a selection strategy to identify which findings are most likely to be genuine, and hence worthy of further study. PMID:28938713
Control of Finite-State, Finite Memory Stochastic Systems
NASA Technical Reports Server (NTRS)
Sandell, Nils R.
1974-01-01
A generalized problem of stochastic control is discussed in which multiple controllers with different data bases are present. The vehicle for the investigation is the finite state, finite memory (FSFM) stochastic control problem. Optimality conditions are obtained by deriving an equivalent deterministic optimal control problem. A FSFM minimum principle is obtained via the equivalent deterministic problem. The minimum principle suggests the development of a numerical optimization algorithm, the min-H algorithm. The relationship between the sufficiency of the minimum principle and the informational properties of the problem is investigated. A problem of hypothesis testing with 1-bit memory is investigated to illustrate the application of control theoretic techniques to information processing problems.
Carlesimo, Giovanni A; Bonanni, Rita; Caltagirone, Carlo
2003-05-01
This study investigated the hypothesis that brain-damaged patients with memory disorders are poorer at remembering the semantic than the perceptual attributes of information. Eight patients with memory impairment of different etiologies and 24 patients with chronic consequences of severe closed-head injury were compared with similarly sized age- and literacy-matched normal control groups on recognition tests for the physical aspect and the semantic identity of word and picture lists. To avoid interpretative problems arising from different absolute levels of performance, study conditions were manipulated across subjects to obtain comparable accuracy on the perceptual recognition tests in the memory-disordered and control groups. The results of the Picture Recognition test were consistent with the hypothesis: when given more time for stimulus encoding, the two memory-disordered groups performed at the same level as the normal subjects on the perceptual test but significantly lower on the semantic test. On the Word Recognition test, by contrast, following the study-condition manipulation, patients and controls performed similarly on both the perceptual and the semantic tests. These data only partially support the hypothesis of the study; rather, they suggest that in memory-disordered patients there is a reduction of the advantage, exhibited by normal controls, of retrieving pictures over words (the picture superiority effect).
Do physical leisure time activities prevent fatigue? A 15 month prospective study of nurses' aides.
Eriksen, W; Bruusgaard, D
2004-06-01
To test the hypothesis that physical leisure time activities reduce the risk of developing persistent fatigue. The hypothesis was tested in a sample that was homogeneous with respect to sex and occupation, with a prospective cohort design. Of 6234 vocationally active, female, Norwegian nurses' aides, not on leave because of illness or pregnancy when they completed a mailed questionnaire in 1999, 5341 (85.7%) completed a second questionnaire 15 months later. The main outcome measure was the prevalence of persistent fatigue-that is, always or usually feeling fatigued in the daytime during the preceding 14 days. In participants without persistent fatigue at baseline, reported engagement in physical leisure time activities for 20 minutes or more at least once a week during the three months before baseline was associated with a reduced risk of persistent fatigue at the follow up (odds ratio = 0.70; 95% confidence interval 0.55 to 0.89), after adjustments for age, affective symptoms, sleeping problems, musculoskeletal pain, long term health problems of any kind, smoking, marital status, tasks of a caring nature during leisure time, and work factors at baseline. The study supports the hypothesis that physical leisure time activities reduce the risk of developing persistent fatigue.
NASA Technical Reports Server (NTRS)
Porter, D. W.; Lefler, R. M.
1979-01-01
A generalized hypothesis testing approach is applied to the problem of tracking several objects where several different associations of data with objects are possible. Such problems occur, for instance, when attempting to distinctly track several aircraft maneuvering near each other or when tracking ships at sea. Conceptually, the problem is solved by first, associating data with objects in a statistically reasonable fashion and then, tracking with a bank of Kalman filters. The objects are assumed to have motion characterized by a fixed but unknown deterministic portion plus a random process portion modeled by a shaping filter. For example, the object might be assumed to have a mean straight line path about which it maneuvers in a random manner. Several hypothesized associations of data with objects are possible because of ambiguity as to which object the data comes from, false alarm/detection errors, and possible uncertainty in the number of objects being tracked. The statistical likelihood function is computed for each possible hypothesized association of data with objects. Then the generalized likelihood is computed by maximizing the likelihood over parameters that define the deterministic motion of the object.
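A toy numerical sketch of the idea above: each hypothesized association of measurements with objects is scored by the innovation log-likelihood of a bank of Kalman filters, and the highest-scoring hypothesis is selected. The 1-D random-walk motion model, initial guesses, and noise levels are invented simplifications, not the paper's deterministic-plus-shaping-filter formulation or its generalized-likelihood maximization over deterministic motion parameters.

```python
"""Score data-association hypotheses with a bank of Kalman filters: each
hypothesis assigns measurements to objects, and its log-likelihood accumulates
from the filters' innovations. Toy 1-D illustration with invented numbers."""
import numpy as np
from itertools import permutations, product

def kalman_loglik(z_seq, x0, P0, q, r):
    """1-D random-walk Kalman filter; returns the summed innovation log-likelihood."""
    x, P, loglik = x0, P0, 0.0
    for z in z_seq:
        P = P + q                      # predict: random-walk state, variance grows by q
        S = P + r                      # innovation variance
        nu = z - x                     # innovation
        loglik += -0.5 * (np.log(2 * np.pi * S) + nu**2 / S)
        K = P / S                      # Kalman gain and measurement update
        x, P = x + K * nu, (1 - K) * P
    return loglik

# Two objects near 0 and 5, three scans, two unlabeled measurements per scan.
scans = [np.array([0.1, 5.2]), np.array([5.3, 0.0]), np.array([0.2, 5.1])]

def hypothesis_loglik(assignment):
    # assignment[t][k] = index of the measurement attributed to object k at scan t
    total = 0.0
    for obj, x0 in enumerate((0.0, 5.0)):          # crude initial state guesses
        z_seq = [scans[t][assignment[t][obj]] for t in range(len(scans))]
        total += kalman_loglik(z_seq, x0=x0, P0=1.0, q=0.01, r=0.1)
    return total

hypotheses = list(product(permutations(range(2)), repeat=len(scans)))
best = max(hypotheses, key=hypothesis_loglik)
print("most likely measurement-to-object assignment per scan:", best)
```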
Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis
NASA Astrophysics Data System (ADS)
Střelec, Luboš
2011-09-01
The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the efficient markets hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, meaning that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, i.e. some tests of normality and/or some graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome the shortcomings of classical normality tests on financial data, which typically contain remote data points and other types of deviations from normality. This study also discusses some results of simulation power studies of these tests for normality against selected alternatives. Based on the outcome of the power simulation study, selected normality tests were then used to verify the weak form of efficiency in Central European stock markets.
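For orientation on the classical, non-robust side of this comparison, the sketch below applies the Jarque-Bera and Shapiro-Wilk tests to simulated daily returns and shows how a single injected outlier can dominate the moment-based Jarque-Bera statistic. The series is simulated rather than real market data, and the paper's robust test variants are not implemented here.

```python
"""Weak-form EMH check in the spirit of the abstract: test whether simulated
daily returns look like IID normal draws, then inject one outlier to show the
non-robustness of the moment-based Jarque-Bera test."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
returns = rng.normal(0.0, 0.01, size=500)        # pretend daily log-returns

jb_stat, jb_p = stats.jarque_bera(returns)
sw_stat, sw_p = stats.shapiro(returns)
print(f"clean series:   JB p={jb_p:.3f}, SW p={sw_p:.3f}")

returns_outlier = returns.copy()
returns_outlier[0] = 0.15                        # one remote data point
jb_stat2, jb_p2 = stats.jarque_bera(returns_outlier)
print(f"with 1 outlier: JB p={jb_p2:.3g}")       # typically collapses toward 0
```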
More Ammunition for the Note-Taking Feud: The "Spaced Lecture."
ERIC Educational Resources Information Center
Bentley, Donna Anderson
1981-01-01
An experiment is discussed that tested a 1975 hypothesis of Aiken, Thomas, and Shennum that a "spaced lecture" might help solve the problem of the negative effects of concurrently listening and writing. Results indicated that the "spaced lecture" was no better than the traditional lecture. (MLW)
Lemonade's the Name, Simulation's the Game.
ERIC Educational Resources Information Center
Friel, Susan
1983-01-01
Provides a detailed description of Lemonade, a business game designed to introduce elementary and secondary students to the basics of business; i.e., problem solving strategies, hypothesis formulation and testing, trend analysis, prediction, comparative analysis, and effects of such factors as advertising and climatic conditions on sales and…
Proficiency Testing for Evaluating Aerospace Materials Test Anomalies
NASA Technical Reports Server (NTRS)
Hirsch, D.; Motto, S.; Peyton, S.; Beeson, H.
2006-01-01
ASTM G 86 and ASTM G 74 are commonly used to evaluate materials' susceptibility to ignition in liquid and gaseous oxygen systems. However, the methods have been known for their lack of repeatability. The inherent problems identified with the test logic do not allow precise identification of either the existence or the magnitude of problems related to running the tests, such as inconsistent system performance, lack of adherence to procedures, etc. Excessive variability leads to more instances of erroneously accepting the null hypothesis, and so to the false deduction that problems are nonexistent when they really do exist. This paper attempts to develop and recommend an approach that could lead to increased accuracy in problem diagnostics by using the 50% reactivity point, which has been shown to be more repeatable. The initial tests conducted indicate that PTFE and Viton A (for pneumatic impact) and Buna S (for mechanical impact) would be good choices for additional testing and consideration for inter-laboratory evaluations. The approach presented could also be used to evaluate variable effects with increased confidence and to optimize tolerances.
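The 50% reactivity point referred to above can be estimated from binary ignition outcomes by fitting a dose-response curve. The sketch below does this with an ordinary logistic fit on invented energy levels and outcomes; it illustrates only the estimation idea, not the ASTM or NASA test procedure.

```python
"""Estimate a 50% reactivity point from binary ignition outcomes at several
test energies by fitting a logistic dose-response curve. Energies, outcomes,
and sample sizes are hypothetical, not an ASTM G 74/G 86 procedure."""
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
energy = np.repeat([20, 40, 60, 80, 100], 10)                     # hypothetical impact energies
ignite = rng.binomial(1, 1 / (1 + np.exp(-(energy - 60) / 10)))   # simulated reactions

X = sm.add_constant(energy)
fit = sm.Logit(ignite, X).fit(disp=0)
b0, b1 = fit.params
e50 = -b0 / b1                                   # energy with 50% ignition probability
print(f"estimated 50% reactivity point: {e50:.1f}")
```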
ERIC Educational Resources Information Center
Kim, Hye Jeong; Pedersen, Susan
2011-01-01
Hypothesis development is a complex cognitive activity, but one that is critical as a means of reducing uncertainty during ill-structured problem solving. In this study, we examined the effect of metacognitive scaffolds in strengthening hypothesis development. We also examined the influence of hypothesis development on young adolescents'…
Nature vs. nurture: Evidence for social learning of conflict behaviour in grizzly bears
Morehouse, Andrea T.; Graves, Tabitha A.; Mikle, Nathaniel; Boyce, Mark S.
2016-01-01
The propensity for a grizzly bear to develop conflict behaviours might be a result of social learning between mothers and cubs, genetic inheritance, or both learning and inheritance. Using non-invasive genetic sampling, we collected grizzly bear hair samples during 2011–2014 across southwestern Alberta, Canada. We targeted private agricultural lands for hair samples at grizzly bear incident sites, defining an incident as an occurrence in which the grizzly bear caused property damage, obtained anthropogenic food, or killed or attempted to kill livestock or pets. We genotyped 213 unique grizzly bears (118 M, 95 F) at 24 microsatellite loci, plus the amelogenin marker for sex. We used the program COLONY to assign parentage. We evaluated 76 mother-offspring relationships and 119 father-offspring relationships. We compared the frequency of problem and non-problem offspring from problem and non-problem parents, excluding dependent offspring from our analysis. Our results support the social learning hypothesis, but not the genetic inheritance hypothesis. Offspring of problem mothers are more likely to be involved in conflict behaviours, while offspring from non-problem mothers are not likely to be involved in incidents or human-bear conflicts themselves (Barnard’s test, p = 0.05, 62.5% of offspring from problem mothers were problem bears). There was no evidence that offspring are more likely to be involved in conflict behaviour if their fathers had been problem bears (Barnard’s test, p = 0.92, 29.6% of offspring from problem fathers were problem bears). For the mother-offspring relationships evaluated, 30.3% of offspring were identified as problem bears independent of their mother’s conflict status. Similarly, 28.6% of offspring were identified as problem bears independent of their father’s conflict status. Proactive mitigation to prevent female bears from becoming problem individuals likely will help prevent the perpetuation of conflicts through social learning.
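The mother-offspring comparison above rests on Barnard's exact test for a 2x2 table, which the sketch below reproduces with scipy.stats.barnard_exact (added in SciPy 1.7). The cell counts are illustrative placeholders chosen to match the reported 62.5% rate for problem mothers; the study's actual counts are not given in the abstract.

```python
"""Barnard's exact test on a 2x2 table of problem vs non-problem offspring by
mother's conflict status. Counts are illustrative placeholders; only the
percentages are reported in the abstract. Requires SciPy >= 1.7."""
import numpy as np
from scipy.stats import barnard_exact

#                 offspring: problem, non-problem
table = np.array([[10, 6],     # problem mothers (10/16 = 62.5%, matching the reported rate)
                  [15, 45]])   # non-problem mothers (illustrative split)
res = barnard_exact(table, alternative="greater")
print(res.statistic, res.pvalue)
```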
Clairvoyant fusion: a new methodology for designing robust detection algorithms
NASA Astrophysics Data System (ADS)
Schaum, Alan
2016-10-01
Many realistic detection problems cannot be solved with simple statistical tests for known alternative probability models. Uncontrollable environmental conditions, imperfect sensors, and other uncertainties transform simple detection problems with likelihood ratio solutions into composite hypothesis (CH) testing problems. Recently many multi- and hyperspectral sensing CH problems have been addressed with a new approach. Clairvoyant fusion (CF) integrates the optimal detectors ("clairvoyants") associated with every unspecified value of the parameters appearing in a detection model. For problems with discrete parameter values, logical rules emerge for combining the decisions of the associated clairvoyants. For many problems with continuous parameters, analytic methods of CF have been found that produce closed-form solutions, or approximations for intractable problems. Here the principles of CF are reviewed and mathematical insights are described that have proven useful in the derivation of solutions. It is also shown how a second-stage fusion procedure can be used to create theoretically superior detection algorithms for ALL discrete parameter problems.
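To make the notion of a "clairvoyant" detector concrete, the toy sketch below considers a scalar composite problem (an unknown positive mean shift in unit-variance Gaussian noise) and compares two generic ways of fusing the per-parameter likelihood-ratio detectors: averaging over a parameter grid versus taking the maximum (a GLRT-style rule). This is only a generic illustration with invented numbers, not Schaum's closed-form clairvoyant-fusion solutions or the paper's second-stage fusion procedure.

```python
"""Toy composite hypothesis test: x ~ N(0,1) under H0, x ~ N(theta,1) under H1
with theta unknown. Each fixed theta has an optimal likelihood-ratio detector;
one fusion rule averages these LRs over a theta grid, another maximizes them."""
import numpy as np
from scipy.stats import norm

def fused_statistics(x, thetas):
    # likelihood ratio of each clairvoyant detector, evaluated at observation x
    lr = norm.pdf(x - thetas) / norm.pdf(x)
    return lr.mean(), lr.max()          # (integration fusion, GLRT-style max fusion)

thetas = np.linspace(0.5, 3.0, 26)      # grid of unspecified signal amplitudes
for x in (0.2, 1.5, 3.1):               # example observations
    integ, glrt = fused_statistics(x, thetas)
    print(f"x={x:4.1f}  averaged-LR={integ:7.3f}  max-LR={glrt:7.3f}")
```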
Nuclear accident at Three Mile Island: its effect on a local community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behler, G.T. Jr.
1987-01-01
This dissertation consists of a longitudinal case study of the extent to which the structure of community power in Riverside (a pseudonym), Pennsylvania, the largest community located within five miles of the Three Mile Island nuclear facility, changed as a result of the March 1979 accident. The investigation centers on testing a basic working hypothesis. Simply stated, this working hypothesis argues that Riverside's power structure has become more pluralistic in response to the Three Mile Island nuclear accident. An additional corollary to this working hypothesis is also tested. This corollary asserts that many of Riverside's community power actors have become much more cosmopolitan in their political-action tactics and problem-solving orientations as a result of the TMI crisis. The working hypothesis and associated corollary are tested via the combined use of three different techniques for measuring the distribution of social power. The findings of the study clearly demonstrate the existence of increased pluralism, politicization, and cosmopolitanism within Riverside since March of 1979. Furthermore, these research results, and the dissertation itself, contribute to a number of subfields within the discipline of sociology. In particular, contributions are noted for the subfields of community power, social movements, and disaster research.
Developmental Differences in the Use of Retrieval Cues to Describe Episodic Information in Memory.
ERIC Educational Resources Information Center
Ackerman, Brian P.; Rathburn, Jill
1984-01-01
Examines reasons why second and fourth grade students use cues relatively ineffectively to retrieve episodic information. Four experiments tested the hypothesis that retrieval cue effectiveness varies with the extent to which cue information describes event information in memory. Results showed that problems of discriminability and…
Math Activities Using LogoWriter--Investigations.
ERIC Educational Resources Information Center
Flewelling, Gary
This book is one in a series of teacher resource books developed to: (1) rescue students from the clutches of computers that drill and control; and (2) supply teachers with computer activities compatible with a mathematics program that emphasizes investigation, problem solving, creativity, and hypothesis making and testing. This is not a book…
"Goals" Are Not an Integral Component of Imitation
ERIC Educational Resources Information Center
Leighton, Jane; Bird, Geoffrey; Heyes, Cecilia
2010-01-01
Several theories suggest that actions are coded for imitation in terms of mentalistic goals, or inferences about the actor's intentions, and that these goals solve the "correspondence problem" by allowing sensory input to be translated into matching motor output. We tested this intention reading hypothesis against general process accounts of…
A Comparison of Numerical Problem Solving under Three Types of Calculation Conditions.
ERIC Educational Resources Information Center
Roberts, Dennis M.; Glynn, Shawn M.
1978-01-01
The study reported is the first in a series of investigations designed to empirically test the hypothesis that calculators reduce quantitative working time and increase computational accuracy, and to examine the relative magnitude of benefit that accompanies utilizing calculators compared to manual work. (MN)
The Developmental Costs and Benefits of Children's Involvement in Interparental Conflict
ERIC Educational Resources Information Center
Davies, Patrick T.; Coe, Jesse L.; Martin, Meredith J.; Sturge-Apple, Melissa L.; Cummings, E. Mark
2015-01-01
Building on empirical documentation of children's involvement in interparental conflicts as a weak predictor of psychopathology, we tested the hypothesis that involvement in conflict more consistently serves as a moderator of associations between children's emotional reactivity to interparental conflict and their psychological problems. In Study…
The Effect of Family Communication Patterns on Adopted Adolescent Adjustment
ERIC Educational Resources Information Center
Rueter, Martha A.; Koerner, Ascan F.
2008-01-01
Adoption and family communication both affect adolescent adjustment. We proposed that adoption status and family communication interact such that adopted adolescents in families with certain communication patterns are at greater risk for adjustment problems. We tested this hypothesis using a community-based sample of 384 adoptive and 208…
Assessment of theory of mind in children with communication disorders: role of presentation mode.
van Buijsen, Marit; Hendriks, Angelique; Ketelaars, Mieke; Verhoeven, Ludo
2011-01-01
Children with communication disorders have problems with both language and social interaction. The theory-of-mind hypothesis provides an explanation for these problems, and different tests have been developed to test this hypothesis. However, different modes of presentation are used in these tasks, which makes the results difficult to compare. In the present study, the performances of typically developing children, children with specific language impairments, and children with autism spectrum disorders were therefore compared using three theory-of-mind tests (the Charlie test, the Smarties test, and the Sally-and-Anne test), each presented in three different modes (spoken, video, and line drawing). The results showed differential outcomes for the three types of tests and a significant interaction between group of children and mode of presentation. For the typically developing children, no differential effects of presentation mode were detected. For the children with SLI, the highest test scores were consistently evidenced in the line-drawing mode. For the children with ASD, test performance depended on the mode of presentation. How the children's non-verbal age, verbal age, and short-term memory related to their test scores was also explored for each group of children. The test scores of the SLI group correlated significantly with their short-term memory, and those of the ASD group with their verbal age. These findings demonstrate that performance on theory-of-mind tests clearly depends upon the mode of test presentation as well as the children's cognitive and linguistic abilities.
Techniques for recognizing identity of several response functions from the data of visual inspection
NASA Astrophysics Data System (ADS)
Nechval, Nicholas A.
1996-08-01
The purpose of this paper is to present some efficient techniques for recognizing from the observed data whether several response functions are identical to each other. For example, in an industrial setting the problem may be to determine whether the production coefficients established in a small-scale pilot study apply to each of several large- scale production facilities. The techniques proposed here combine sensor information from automated visual inspection of manufactured products which is carried out by means of pixel-by-pixel comparison of the sensed image of the product to be inspected with some reference pattern (or image). Let (a1, . . . , am) be p-dimensional parameters associated with m response models of the same type. This study is concerned with the simultaneous comparison of a1, . . . , am. A generalized maximum likelihood ratio (GMLR) test is derived for testing equality of these parameters, where each of the parameters represents a corresponding vector of regression coefficients. The GMLR test reduces to an equivalent test based on a statistic that has an F distribution. The main advantage of the test lies in its relative simplicity and the ease with which it can be applied. Another interesting test for the same problem is an application of Fisher's method of combining independent test statistics which can be considered as a parallel procedure to the GMLR test. The combination of independent test statistics does not appear to have been used very much in applied statistics. There does, however, seem to be potential data analytic value in techniques for combining distributional assessments in relation to statistically independent samples which are of joint experimental relevance. In addition, a new iterated test for the problem defined above is presented. A rejection of the null hypothesis by this test provides some reason why all the parameters are not equal. A numerical example is discussed in the context of the proposed procedures for hypothesis testing.
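As a point of reference for the equivalent F form mentioned above, the sketch below computes the classical F test for equality of regression-coefficient vectors across m groups (pooled versus separate least-squares fits) on simulated data. It is a textbook illustration under standard linear-model assumptions, not the paper's GMLR derivation, its Fisher combination procedure, or its iterated test.

```python
"""Classical F test for H0: the regression-coefficient vectors of m groups are
identical, computed from pooled vs separate sums of squared residuals.
Data are simulated with identical coefficients, so H0 is true here."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def ols_ssr(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

# m groups, each with design matrix of p = 2 columns (intercept + slope)
m, n_per, p = 3, 40, 2
groups = []
for g in range(m):
    x = rng.uniform(0, 10, n_per)
    X = np.column_stack([np.ones(n_per), x])
    y = 1.0 + 0.5 * x + rng.normal(0, 1, n_per)
    groups.append((X, y))

ssr_sep = sum(ols_ssr(X, y) for X, y in groups)           # unrestricted: separate fits
X_all = np.vstack([X for X, _ in groups])
y_all = np.concatenate([y for _, y in groups])
ssr_pooled = ols_ssr(X_all, y_all)                        # restricted: one common fit

n_total = m * n_per
df1, df2 = (m - 1) * p, n_total - m * p
F = ((ssr_pooled - ssr_sep) / df1) / (ssr_sep / df2)
p_value = stats.f.sf(F, df1, df2)
print(f"F={F:.2f}, p={p_value:.3f}")
```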
Hum, Sandra; Carr, Sherilene M
2018-02-12
Loneliness and adapting to an unfamiliar environment can increase emotional vulnerability in culturally and linguistically diverse (CALD) university students. According to Blaszczynski and Nower's pathways model of problem and pathological gambling, this emotional vulnerability could increase the risk of problem gambling. The current study examined whether loneliness was associated with problem gambling risk in CALD students relative to their Australian peers. Additionally, differences in coping strategies were examined to determine their buffering effect on the relationship. A total of 463 female and 165 male university students (aged 18-38) from Australian (38%), mixed Australian and CALD (23%) and CALD (28%) backgrounds responded to an online survey of problem gambling behaviour, loneliness, and coping strategies. The results supported the hypothesis that loneliness would be related to problem gambling in CALD students. There was no evidence of a moderating effect of coping strategies. Future research could test whether the introduction of programs designed to alleviate loneliness in culturally diverse university students reduces their risk of developing problem gambling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chima, C.M.
This study evaluates the commercial energy sector of the Economic Community of West African States (ECOWAS). Presently, an economic union exists among the 16 countries of West Africa that are members of ECOWAS. Although the ECOWAS region has plentiful resources of commercial energy, it faces problems in this sector for two reasons. First is the problem resulting from diminishing traditional energy resources such as wood fuel and charcoal. Second, most ECOWAS members, except Nigeria, are net importers of commercial energy and hence face a high import burden for oil. Liquid petroleum is the dominant form of commercial energy used in the ECOWAS region despite the availability of other resources. The author argues that the best policy and strategy for dealing with energy problems is a combination of regional cooperative effort and more intensive country-level effort. The intensity-of-use hypothesis is tested with case studies of Ghana, the Ivory Coast, and Nigeria. The results indicate that newly developing countries can deviate from the expectations of the hypothesis.
Khanam, Rasheda; Nghiem, Son
2018-05-01
This study investigates the effects of behavioural and emotional problems in children on their educational outcomes using data from the Longitudinal Survey of Australian Children (LSAC). We contribute to the extant literature by using a dynamic specification to test the hypothesis of knowledge accumulation. Further, we apply the system generalised method of moments (GMM) estimator to minimise biases due to unobserved factors. We find that mental disorders in children have a negative effect on National Assessment Program-Literacy and Numeracy (NAPLAN) test scores. Among all mental disorders, emotional problems are found to be the most influential, with a one standard deviation (SD) increase in emotional problems being associated with a 0.05 SD reduction in NAPLAN reading, writing and spelling; a 0.04 SD reduction in matrix reasoning and grammar; and a 0.03 SD reduction in NAPLAN numeracy.
HYPOTHESIS TESTING FOR HIGH-DIMENSIONAL SPARSE BINARY REGRESSION
Mukherjee, Rajarshi; Pillai, Natesh S.; Lin, Xihong
2015-01-01
In this paper, we study the detection boundary for minimax hypothesis testing in the context of high-dimensional, sparse binary regression models. Motivated by genetic sequencing association studies for rare variant effects, we investigate the complexity of the hypothesis testing problem when the design matrix is sparse. We observe a new phenomenon in the behavior of detection boundary which does not occur in the case of Gaussian linear regression. We derive the detection boundary as a function of two components: a design matrix sparsity index and signal strength, each of which is a function of the sparsity of the alternative. For any alternative, if the design matrix sparsity index is too high, any test is asymptotically powerless irrespective of the magnitude of signal strength. For binary design matrices with the sparsity index that is not too high, our results are parallel to those in the Gaussian case. In this context, we derive detection boundaries for both dense and sparse regimes. For the dense regime, we show that the generalized likelihood ratio is rate optimal; for the sparse regime, we propose an extended Higher Criticism Test and show it is rate optimal and sharp. We illustrate the finite sample properties of the theoretical results using simulation studies. PMID:26246645
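For orientation, the sketch below computes the classical Donoho-Jin Higher Criticism statistic from a vector of p-values. The paper proposes an extended Higher Criticism test tailored to sparse binary designs, which this standard form does not reproduce; the cutoff alpha0 and the simulated signal strengths are arbitrary choices.

```python
"""Standard (Donoho-Jin) Higher Criticism statistic computed from p-values,
shown under the global null and with a few injected sparse signals."""
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    n = len(pvals)
    p_sorted = np.sort(pvals)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p_sorted) / np.sqrt(p_sorted * (1.0 - p_sorted))
    k = int(np.floor(alpha0 * n))            # conventionally restrict to the smallest half
    return hc[:k].max()

rng = np.random.default_rng(4)
null_p = rng.uniform(size=1000)                           # global null: uniform p-values
sparse_p = null_p.copy()
sparse_p[:20] = rng.uniform(1e-8, 1e-4, size=20)          # a few strong signals
print("HC under null:  ", round(higher_criticism(null_p), 2))
print("HC with signals:", round(higher_criticism(sparse_p), 2))
```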
Unconditional or Conditional Logistic Regression Model for Age-Matched Case–Control Data?
Kuo, Chia-Ling; Duan, Yinghui; Grady, James
2018-01-01
Matching on demographic variables is commonly used in case–control studies to adjust for confounding at the design stage. There is a presumption that matched data need to be analyzed by matched methods. Conditional logistic regression has become a standard for matched case–control data to tackle the sparse data problem. The sparse data problem, however, may not be a concern for loose-matching data when the matching between cases and controls is not unique, and one case can be matched to other controls without substantially changing the association. Data matched on a few demographic variables are clearly loose-matching data, and we hypothesize that unconditional logistic regression is a proper method to perform. To address the hypothesis, we compare unconditional and conditional logistic regression models by precision in estimates and hypothesis testing using simulated matched case–control data. Our results support our hypothesis; however, the unconditional model is not as robust as the conditional model to the matching distortion that the matching process not only makes cases and controls similar for matching variables but also for the exposure status. When the study design involves other complex features or the computational burden is high, matching in loose-matching data can be ignored for negligible loss in testing and estimation if the distributions of matching variables are not extremely different between cases and controls. PMID:29552553
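A small simulation along the lines of the comparison above, assuming loosely age-matched pairs: the unconditional model adjusts for the matching variable directly, while the conditional model conditions on the matched pairs. It uses statsmodels; the ConditionalLogit class and its groups argument are assumed to be available as in recent statsmodels releases, and all simulation settings are arbitrary.

```python
"""Unconditional vs conditional logistic regression on simulated, loosely
age-matched case-control pairs. ConditionalLogit and its 'groups' argument are
assumed from recent statsmodels releases; all parameters are illustrative."""
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(5)
n_pairs = 300
age_band = rng.integers(0, 5, size=n_pairs)              # loose matching: only 5 age bands

# Within each matched pair the case is somewhat more likely to be exposed.
exp_case = rng.binomial(1, 0.45 + 0.05 * age_band)
exp_ctrl = rng.binomial(1, 0.30 + 0.05 * age_band)

case = np.r_[np.ones(n_pairs, dtype=int), np.zeros(n_pairs, dtype=int)]
exposure = np.r_[exp_case, exp_ctrl]
age = np.r_[age_band, age_band]
pair_id = np.r_[np.arange(n_pairs), np.arange(n_pairs)]

# Unconditional logistic regression, adjusting for the matching variable directly.
X = sm.add_constant(np.column_stack([exposure, age]))
uncond = sm.Logit(case, X).fit(disp=0)

# Conditional logistic regression, conditioning on the matched pairs.
cond = ConditionalLogit(case, exposure.reshape(-1, 1), groups=pair_id).fit()

print("unconditional exposure log-odds ratio:", round(float(uncond.params[1]), 3))
print("conditional   exposure log-odds ratio:", round(float(cond.params[0]), 3))
```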
Communication Profile of Primary School-Aged Children with Foetal Growth Restriction
ERIC Educational Resources Information Center
Partanen, Lea Aulikki; Olsén, Päivi; Mäkikallio, Kaarin; Korkalainen, Noora; Heikkinen, Hanna; Heikkinen, Minna; Yliherva, Anneli
2017-01-01
Foetal growth restriction is associated with problems in neurocognitive development. In the present study, prospectively collected cohorts of foetal growth restricted (FGR) and appropriate for gestational age grown (AGA) children were examined at early school-age by using the Children's Communication Checklist-2 (CCC-2) to test the hypothesis that…
ERIC Educational Resources Information Center
Jayawickreme, Nuwan; Jayawickreme, Eranda; Atanasov, Pavel; Goonasekera, Michelle A.; Foa, Edna B.
2012-01-01
The hypothesis that psychometric instruments incorporating local idioms of distress predict functional impairment in a non-Western, war-affected population above and beyond translations of already established instruments was tested. Exploratory factor analysis was conducted on the War-Related Psychological and Behavioral Problems section of the…
Maternal Drinking Problems and Children's Auditory, Intellectual, and Linguistic Functioning.
ERIC Educational Resources Information Center
Czarnecki, Donna M.; And Others
This study tested the hypothesis that maternal drinking early in pregnancy affects the development of the child's central auditory processing. A follow-up study of 167 children took place 6 years after their mothers participated in a survey concerning health and drinking practices during the early stages of pregnancy. Indications of problem…
Setting the Mood for Critical Thinking in the Classroom
ERIC Educational Resources Information Center
Lewine, Rich; Sommers, Alison; Waford, Rachel; Robertson, Catherine
2015-01-01
Most current efforts to enhance critical thinking focus on skills practice and training. The empirical research from the fields of cognition and affect sciences suggest that positive mood, even when transiently induced, can have beneficial effects on cognitive flexibility and problem solving. We undertook this study to test this hypothesis in a…
Evidence That Thinking about Death Relates to Time-Estimation Behavior
ERIC Educational Resources Information Center
Martens, Andy; Schmeichel, Brandon J.
2011-01-01
Time and death are linked--the passing of time brings us closer to death. Terror management theory proposes that awareness of death represents a potent problem that motivates a variety of psychological defenses (Greenberg, Pyszczynski, & Solomon, 1997). We tested the hypothesis that thinking about death motivates elongated perceptions of brief…
Anchoring the Deficit of the Anchor Deficit: Dyslexia or Attention?
ERIC Educational Resources Information Center
Willburger, Edith; Landerl, Karin
2010-01-01
In the anchoring deficit hypothesis of dyslexia ("Trends Cogn. Sci.", 2007; 11: 458-465), it is proposed that perceptual problems arise from a failure to form a perceptual anchor for repeatedly presented stimuli. A study designed to explicitly test the specificity of the anchoring deficit for dyslexia is presented. Four groups, representing all…
Math Activities Using LogoWriter--Patterns and Designs.
ERIC Educational Resources Information Center
Flewelling, Gary
This book is one in a series of teacher resource books developed to: (1) rescue students from the clutches of computers that drill and control; and (2) supply teachers with computer activities compatible with a mathematics program that emphasizes investigation, problem solving, creativity, and hypothesis making and testing. This is not a book…
Math Activities Using LogoWriter--Numbers & Operations.
ERIC Educational Resources Information Center
Flewelling, Gary
This book is one in a series of teacher resource books developed to: (1) rescue students from the clutches of computers that drill and control; and (2) supply teachers with computer activities compatible with a mathematics program that emphasizes investigation, problem solving, creativity, and hypothesis making and testing. This is not a book…
Cognitive Consequences of Participation in a "Fifth Dimension" After-School Computer Club.
ERIC Educational Resources Information Center
Mayer, Richard E.; Quilici, Jill; Moreno, Roxana; Duran, Richard; Woodbridge, Scott; Simon, Rebecca; Sanchez, David; Lavezzo, Amy
1997-01-01
Children who attended the Fifth Dimension after-school computer club at least 10 times during the 1994-95 school year performed better on word problem comprehension tests than did non-participating children. Results support the hypothesis that experience in using computer software in the Fifth Dimension club produces measurable, resilient, and…
ERIC Educational Resources Information Center
Andreas, Jasmina Burdzovic; O'Farrell, Timothy J.
2007-01-01
Psychosocial adjustment in children of alcoholics (N = 114) was examined in the year before and at three follow-ups in the 15 months after their alcoholic fathers entered alcoholism treatment, testing the hypothesis that children's adjustment problems will vary over time as a function of their fathers' heavy drinking patterns. Three unique…
Exploring Mathematics Problems Prepares Children to Learn from Instruction
ERIC Educational Resources Information Center
DeCaro, Marci S.; Rittle-Johnson, Bethany
2012-01-01
Both exploration and explicit instruction are thought to benefit learning in many ways, but much less is known about how the two can be combined. We tested the hypothesis that engaging in exploratory activities prior to receiving explicit instruction better prepares children to learn from the instruction. Children (159 second- to fourth-grade…
ERIC Educational Resources Information Center
Stringaris, Argyris; Maughan, Barbara; Goodman, Robert
2010-01-01
Objective: Oppositional defiant disorder (ODD) is classified as a disruptive disorder, but shows a wide range of associations with other psychopathology, including internalizing problems. The reasons for these associations are unclear. Here we test the hypothesis that two distinct early temperamental precursors--emotionality and activity--underlie…
Statistical modeling, detection, and segmentation of stains in digitized fabric images
NASA Astrophysics Data System (ADS)
Gururajan, Arunkumar; Sari-Sarraf, Hamed; Hequet, Eric F.
2007-02-01
This paper will describe a novel and automated system based on a computer vision approach, for objective evaluation of stain release on cotton fabrics. Digitized color images of the stained fabrics are obtained, and the pixel values in the color and intensity planes of these images are probabilistically modeled as a Gaussian Mixture Model (GMM). Stain detection is posed as a decision theoretic problem, where the null hypothesis corresponds to absence of a stain. The null hypothesis and the alternate hypothesis mathematically translate into a first order GMM and a second order GMM respectively. The parameters of the GMM are estimated using a modified Expectation-Maximization (EM) algorithm. Minimum Description Length (MDL) is then used as the test statistic to decide the verity of the null hypothesis. The stain is then segmented by a decision rule based on the probability map generated by the EM algorithm. The proposed approach was tested on a dataset of 48 fabric images soiled with stains of ketchup, corn oil, mustard, ragu sauce, revlon makeup and grape juice. The decision theoretic part of the algorithm produced a correct detection rate (true positive) of 93% and a false alarm rate of 5% on these set of images.
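A rough stand-in for the decision step described above, assuming scikit-learn: fit one- and two-component Gaussian mixtures to simulated pixel values, choose the model order by BIC (used here as a proxy for the paper's MDL criterion), and, if two components win, segment by the posterior probability map. This mimics the idea only; it is not the paper's modified EM algorithm, and the pixel data are invented.

```python
"""Order selection and segmentation with Gaussian mixtures on simulated fabric
pixel colours: 1 component = no stain, 2 components = stain present.
BIC stands in for MDL; standard EM stands in for the paper's modified EM."""
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)
fabric = rng.normal([180, 170, 150], 8, size=(5000, 3))   # background pixel colours
stain = rng.normal([120, 100, 80], 10, size=(500, 3))     # stained pixel colours
pixels = np.vstack([fabric, stain])

gmm1 = GaussianMixture(n_components=1, random_state=0).fit(pixels)
gmm2 = GaussianMixture(n_components=2, random_state=0).fit(pixels)

if gmm2.bic(pixels) < gmm1.bic(pixels):                   # lower BIC -> stain present
    posterior = gmm2.predict_proba(pixels)
    stain_comp = np.argmin(gmm2.means_[:, 0])             # darker component = stain (toy rule)
    mask = posterior[:, stain_comp] > 0.5                 # segmentation by posterior map
    print("stain detected; segmented pixels:", int(mask.sum()))
else:
    print("no stain detected")
```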
A test of the substitution-habitat hypothesis in amphibians.
Martínez-Abraín, Alejandro; Galán, Pedro
2018-06-01
Most examples that support the substitution-habitat hypothesis (human-made habitats act as substitutes of original habitat) deal with birds and mammals. We tested this hypothesis in 14 amphibians by using percentage occupancy as a proxy of habitat quality (i.e., higher occupancy percentages indicate higher quality). We classified water body types as original habitat (no or little human influence) depending on anatomical, behavioral, or physiological adaptations of each amphibian species. Ten species had relatively high probabilities (0.16-0.28) of occurrence in original habitat, moderate probability of occurrence in substitution habitats (0.11-0.14), and low probability of occurrence in refuge habitats (0.05-0.08). Thus, the substitution-habitat hypothesis only partially applies to amphibians because the low occupancy of refuges could be due to the negligible human persecution of this group (indicating good conservation status). However, low occupancy of refuges could also be due to low tolerance of refuge conditions, which could have led to selective extinction or colonization problems due to poor dispersal capabilities. That original habitats had the highest probabilities of occupancy suggests amphibians have a good conservation status in the region. They also appeared highly adaptable to anthropogenic substitution habitats. © 2017 Society for Conservation Biology.
[Dilemma of the null hypothesis in experimental tests of ecological hypotheses].
Li, Ji
2016-06-01
Experimental testing is one of the major methods for testing ecological hypotheses, though there are many arguments over the role of the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis-deduction model of Platt (1964) and argued that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent the statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and the alternative hypothesis H1' (α'=1, β'=0) in ecological processes differ from those in classical physics, the ecological null hypothesis likewise cannot be strictly tested experimentally. These dilemmas of the null hypothesis could be alleviated by reducing the P value, carefully selecting the null hypothesis, non-centralizing the non-null hypothesis, and using two-tailed tests. However, statistical null hypothesis significance testing (NHST) should not be treated as equivalent to a logical test of causality for an ecological hypothesis. Hence, findings and conclusions of methodological studies and experimental tests based on NHST are not always logically reliable.
Statistical Analysis Techniques for Small Sample Sizes
NASA Technical Reports Server (NTRS)
Navard, S. E.
1984-01-01
The problem of small sample sizes encountered in the analysis of space-flight data is examined. Because only a small amount of data is available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on the considerations needed to choose the most appropriate test for a given type of analysis.
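In the spirit of the report's emphasis on matching the test to its assumptions, here is a minimal small-sample example that checks normality before choosing between a one-sample t test and the Wilcoxon signed-rank test; the measurements and hypothesised value are invented placeholders.

```python
"""Small-sample test choice: check the normality assumption, then use either a
one-sample t test or the distribution-free Wilcoxon signed-rank test.
The n = 7 measurements and mu0 are hypothetical."""
import numpy as np
from scipy import stats

x = np.array([9.8, 10.4, 10.1, 9.6, 10.9, 10.2, 9.9])   # hypothetical measurements
mu0 = 10.0

_, p_norm = stats.shapiro(x)                 # normality check (low power at small n; treat with care)
if p_norm > 0.05:
    stat, p = stats.ttest_1samp(x, mu0)      # parametric test if normality is plausible
    print(f"t test: t={stat:.2f}, p={p:.3f}")
else:
    stat, p = stats.wilcoxon(x - mu0)        # nonparametric alternative
    print(f"Wilcoxon: W={stat:.1f}, p={p:.3f}")
```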
ERIC Educational Resources Information Center
Blackburn, J. Joey; Robinson, J. Shane
2016-01-01
The purpose of this experimental study was to assess the effects of cognitive style, problem complexity, and hypothesis generation on the problem solving ability of school-based agricultural education students. Problem solving ability was defined as time to solution. Kirton's Adaption-Innovation Inventory was employed to assess students' cognitive…
Invited Commentary: Can Issues With Reproducibility in Science Be Blamed on Hypothesis Testing?
Weinberg, Clarice R
2017-09-15
In the accompanying article (Am J Epidemiol. 2017;186(6):646-647), Dr. Timothy Lash makes a forceful case that the problems with reproducibility in science stem from our "culture" of null hypothesis significance testing. He notes that when attention is selectively given to statistically significant findings, the estimated effects will be systematically biased away from the null. Here I revisit the recent history of genetic epidemiology and argue for retaining statistical testing as an important part of the tool kit. Particularly when many factors are considered in an agnostic way, in what Lash calls "innovative" research, investigators need a selection strategy to identify which findings are most likely to be genuine, and hence worthy of further study. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Vélez, Alejandro; Bee, Mark A
2013-05-01
This study tested three hypotheses about the ability of female frogs to exploit temporal fluctuations in the level of background noise to overcome the problem of recognizing male advertisement calls in noisy breeding choruses. Phonotaxis tests with green treefrogs (Hyla cinerea) and Cope's gray treefrogs (Hyla chrysoscelis) were used to measure thresholds for recognizing calls in the presence of noise maskers with (a) no level fluctuations, (b) random fluctuations, or level fluctuations characteristic of (c) conspecific choruses and (d) heterospecific choruses. The dip-listening hypothesis predicted lower signal recognition thresholds in the presence of fluctuating maskers compared with nonfluctuating maskers. Support for the dip-listening hypothesis was weak; only Cope's gray treefrogs experienced dip listening and only in the presence of randomly fluctuating maskers. The natural soundscapes advantage hypothesis predicted lower recognition thresholds when level fluctuations resembled those of natural soundscapes compared with artificial fluctuations. This hypothesis was rejected. In noise backgrounds with natural fluctuations, the species-specific advantage hypothesis predicted lower recognition thresholds when fluctuations resembled species-specific patterns of conspecific soundscapes. No evidence was found to support this hypothesis. These results corroborate previous findings showing that Cope's gray treefrogs, but not green treefrogs, experience dip listening under some noise conditions. Together, the results suggest level fluctuations in the soundscape of natural breeding choruses may present few dip-listening opportunities. The findings of this study provide little support for the hypothesis that receivers are adapted to exploit level fluctuations of natural soundscapes in recognizing communication signals.
Framing Effects are Robust to Linguistic Disambiguation: A Critical Test of Contemporary Theory
Chick, Christina F.; Reyna, Valerie F.; Corbin, Jonathan C.
2015-01-01
Theoretical accounts of risky choice framing effects assume that decision makers interpret framing options as extensionally equivalent, such that if 600 lives are at stake, saving 200 implies that 400 die. However, many scholars have argued that framing effects are caused, instead, by filling in pragmatically implied information. This linguistic ambiguity hypothesis is grounded in neo-Gricean pragmatics, information leakage, and schema theory. In two experiments, we conducted a critical test of the linguistic ambiguity hypothesis and its relation to framing. We controlled for this crucial implied information by disambiguating it using instructions and detailed examples, followed by multiple quizzes. After disambiguating missing information, we presented standard framing problems plus truncated versions, varying types of missing information. Truncations were also critical tests of prospect theory and fuzzy trace theory. Participants were not only college students, but also middle-aged adults (who showed similar results). Contrary to the ambiguity hypothesis, participants who interpreted missing information as complementary to stated information none the less showed robust framing effects. Although adding words like “at least” can change interpretations of framing information, this form of linguistic ambiguity is not necessary to observe risky choice framing effects. PMID:26348200
ERIC Educational Resources Information Center
Brown, Randall Anthony
How participation in extracurricular activity participation (EAP) encourages prosocial behavior is investigated. A sense of connection to prosocial entities is understood to influence youth behavior. This study tests the hypothesis that the impact of EAP is mediated by a youth's sense of connection to the school. Using a diverse sample of…
Problems in Bibliographic Access to Non-Print Materials. Project Media Base: Final Report.
ERIC Educational Resources Information Center
Brong, Gerald; And Others
Project Media Base reports its conclusions and recommendations for the establishment of bibliographic control of audiovisual resources as a part of an overall objective to plan, develop, and implement a nationwide network of library and information services. The purpose of this project was to test the hypothesis that the essential elements of a…
ERIC Educational Resources Information Center
Asada, Kosuke; Tomiwa, Kiyotaka; Okada, Masako; Itakura, Shoji
2010-01-01
Children with Williams syndrome (WS) have been reported to often face problems in daily communication and to have deficits in their pragmatic language abilities. To test this hypothesis, we examined whether children with WS could modify their verbal communication according to others' attention in order to share what they did. The children with WS…
Heights of selected ponderosa pine seedlings during 20 years
R. Z. Callaham; J. W. Duffield
1963-01-01
Many silviculturists and geneticists, concerned with the problem of increasing the rate of production of forest plantations, advocate or practice the selection of the larger seedlings in the nursery bed. Such selection implies a hypothesis that size of seedlings is positively correlated with size of the same plants at some more advanced age. Two tests were established...
Behavioral Health and Adjustment to College Life for Student Service Members/Veterans
ERIC Educational Resources Information Center
Schonfeld, Lawrence; Braue, Lawrence A.; Stire, Sheryl; Gum, Amber M.; Cross, Brittany L.; Brown, Lisa M.
2015-01-01
Objective: Increasing numbers of student service members/veterans (SSM/Vs) are enrolling in college. However, little is known about how their previous military experience affects their adjustment to this new role. The present study tested the hypothesis that SSM/Vs who report adjustment problems in college have a higher incidence of posttraumatic…
ERIC Educational Resources Information Center
Pedersen, Sara; Vitaro, Frank; Barker, Edward D.; Borge, Anne I. H.
2007-01-01
This study used a sample of 551 children surveyed yearly from ages 6 to 13 to examine the longitudinal associations among early behavior, middle-childhood peer rejection and friendedness, and early-adolescent depressive symptoms, loneliness, and delinquency. The study tested a sequential mediation hypothesis in which (a) behavior problems in the…
ERIC Educational Resources Information Center
Franko, Debra L.; Omori, Mika
1999-01-01
Investigates the severity of disturbed eating and its psychological correlates in college freshman women. Reports that 9% fell into the problem bulimic or dieter at-risk categories, 23% were classified as intensive dieters, 17% as casual dieters, and over half were non-dieters. Depression, dysfunctional thinking, and disturbed eating attitudes…
The Relentless Search for Effects of Divorce: Forging New Trails or Tumbling down the Beaten Path?
ERIC Educational Resources Information Center
Demo, David H.
1993-01-01
Responds to previous article by Amato on children's adjustment to divorce. Cites number of serious limitations including problems in language and logic of hypothesis-testing, in derivation of hypotheses, and in interpretation and assessment of accumulated evidence. Finds Amato's basic premise--that children of divorce suffer lifelong adjustment…
ERIC Educational Resources Information Center
Katzenmeyer, W. G.; Stenner, A. Jackson
1977-01-01
The problem of demonstrating invariance of factor structures across criterion groups is addressed. Procedures are outlined which combine the replication of factor structures across sex-race groups with use of the coefficient of invariance to demonstrate the level of invariance associated with factors identified in a self concept measure.…
Thinking about Diagnostic Thinking: A 30-Year Perspective
ERIC Educational Resources Information Center
Elstein, Arthur S.
2009-01-01
This paper has five objectives: (a) to review the scientific background of, and major findings reported in, Medical Problem Solving, now widely recognized as a classic in the field; (b) to compare these results with some of the findings in a recent best-selling collection of case studies; (c) to summarize criticisms of the hypothesis-testing model…
Gifted and Maladjusted? Implicit Attitudes and Automatic Associations Related to Gifted Children
ERIC Educational Resources Information Center
Preckel, Franzis; Baudson, Tanja Gabriele; Krolak-Schwerdt, Sabine; Glock, Sabine
2015-01-01
The disharmony hypothesis (DH) states that high intelligence comes at a cost to the gifted, resulting in adjustment problems. We investigated whether there is a gifted stereotype that falls in line with the DH and affects attitudes toward gifted students. Preservice teachers (N = 182) worked on single-target association tests and affective priming…
ERIC Educational Resources Information Center
Fujiwara, Takeo; Okuyama, Makiko; Izumi, Mayuko
2012-01-01
The authors test the hypothesis that separation from a violent husband or partner improves maternal parenting in Japan and examine how childhood abuse history (CAH), experience of domestic violence (DV), mental health problems, husband or partner's child maltreatment, and other demographic factors affect maternal parenting after such separation. A…
Scientific rigor through videogames.
Treuille, Adrien; Das, Rhiju
2014-11-01
Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.
Does Problem Behavior Elicit Poor Parenting?: A Prospective Study of Adolescent Girls
Huh, David; Tristan, Jennifer; Wade, Emily; Stice, Eric
2006-01-01
This study tested the hypothesis that perceived parenting would show reciprocal relations with adolescents' problem behavior using longitudinal data from 496 adolescent girls. Results provided support for the assertion that female problem behavior has an adverse effect on parenting; elevated externalizing symptoms and substance abuse symptoms predicted future decreases in perceived parental support and control. There was less support for the assertion that parenting deficits foster adolescent problem behaviors; initially low parental control predicted future increases in substance abuse, but not externalizing symptoms, and low parental support did not predict future increases in externalizing or substance abuse symptoms. Results suggest that problem behavior is a more consistent predictor of parenting than parenting is of problem behavior, at least for girls during middle adolescence. PMID:16528407
Hambrick, David Z; Libarkin, Julie C; Petcovic, Heather L; Baker, Kathleen M; Elkins, Joe; Callahan, Caitlin N; Turner, Sheldon P; Rench, Tara A; Ladue, Nicole D
2012-08-01
Sources of individual differences in scientific problem solving were investigated. Participants representing a wide range of experience in geology completed tests of visuospatial ability and geological knowledge, and performed a geological bedrock mapping task, in which they attempted to infer the geological structure of an area in the Tobacco Root Mountains of Montana. A Visuospatial Ability × Geological Knowledge interaction was found, such that visuospatial ability positively predicted mapping performance at low, but not high, levels of geological knowledge. This finding suggests that high levels of domain knowledge may sometimes enable circumvention of performance limitations associated with cognitive abilities. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
Nonparametric estimation and testing of fixed effects panel data models
Henderson, Daniel J.; Carroll, Raymond J.; Li, Qi
2009-01-01
In this paper we consider the problem of estimating nonparametric panel data models with fixed effects. We introduce an iterative nonparametric kernel estimator. We also extend the estimation method to the case of a semiparametric partially linear fixed effects model. To determine whether a parametric, semiparametric or nonparametric model is appropriate, we propose test statistics to test between the three alternatives in practice. We further propose a test statistic for testing the null hypothesis of random effects against fixed effects in a nonparametric panel data regression model. Simulations are used to examine the finite sample performance of the proposed estimators and the test statistics. PMID:19444335
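As a hedged aside, the basic building block behind such nonparametric estimators is a local-constant (Nadaraya-Watson) kernel regression. The sketch below shows only that building block on toy data; the paper's iterative fixed-effects estimator and its test statistics are not reproduced, and the function and variable names are illustrative.

```python
# Minimal local-constant (Nadaraya-Watson) kernel regression sketch.
import numpy as np

def nw_estimate(x_grid, x, y, bandwidth):
    """Estimate E[y | x] on x_grid with a Gaussian kernel."""
    u = (x_grid[:, None] - x[None, :]) / bandwidth  # scaled grid-to-observation distances
    weights = np.exp(-0.5 * u ** 2)                 # Gaussian kernel weights
    return (weights @ y) / weights.sum(axis=1)      # weighted local average

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 400)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)  # toy data

grid = np.linspace(0, 1, 9)
print(np.round(nw_estimate(grid, x, y, bandwidth=0.08), 2))
```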
Some controversial multiple testing problems in regulatory applications.
Hung, H M James; Wang, Sue-Jane
2009-01-01
Multiple testing problems in regulatory applications are often more challenging than the problems of handling a set of mathematical symbols representing multiple null hypotheses under testing. In the union-intersection setting, it is important to define a family of null hypotheses relevant to the clinical questions at issue. The distinction between primary endpoint and secondary endpoint needs to be considered properly in different clinical applications. Without proper consideration, the widely used sequential gatekeeping strategies often impose too many logical restrictions to make sense, particularly to deal with the problem of testing multiple doses and multiple endpoints, the problem of testing a composite endpoint and its component endpoints, and the problem of testing superiority and noninferiority in the presence of multiple endpoints. Partitioning the null hypotheses involved in closed testing into clinically relevant orderings or sets can be a viable alternative for resolving these illogical problems, though it requires more attention from clinical trialists in defining the clinical hypotheses or clinical question(s) at the design stage. In the intersection-union setting there is little room for alleviating the stringency of the requirement that each endpoint must meet the same intended alpha level, unless the parameter space under the null hypothesis can be substantially restricted. Such restriction often requires insurmountable justification and usually cannot be supported by the internal data. Thus, a possible remedial approach to alleviate the possible conservatism resulting from this requirement is a group-sequential design strategy that starts with conservative sample size planning and then utilizes an alpha spending function to possibly reach the conclusion early.
Perceptions of control in adults with epilepsy.
Gehlert, S
1994-01-01
That psychosocial problems are extant in epilepsy is evidenced by a suicide rate among epileptic persons five times that of the general population and an unemployment rate estimated to be more than twice that of the population as a whole. External perceptions of control, secondary to repeated episodes of seizure activity that generalize to the social sphere, have been implicated as causes of these problems. The hypothesis that individuals who continue to have seizures become more and more external in their perceptions of control was tested by a survey mailed to a sample of individuals with epilepsy in a metropolitan area of the Midwest. Dependent variables were scores on instruments measuring locus of control and attributional style. The independent variable was a measure of seizure control based on present age, age at onset, and length of time since last seizure. Gender, socioeconomic status, and certain parenting characteristics were included as control variables, as they are also known to affect perceptions of control. Analysis by multiple regression techniques supported the study's hypothesis when perception of control was conceptualized as learned helplessness for bad, but not for good, events. The hypothesis was not confirmed when perception of control was conceptualized as either general or health locus of control.
The impact of problem solving strategy with online feedback on students’ conceptual understanding
NASA Astrophysics Data System (ADS)
Pratiwi, H. Y.; Winarko, W.; Ayu, H. D.
2018-04-01
The study aimed to determine the impact of implementing a problem-solving strategy with online feedback on students' concept understanding. This quasi-experimental study used a posttest-only control group design. The participants were all Physics Education students of Kanjuruhan University, year 2015, divided into two groups: 30 students in the experimental class and the remaining 30 students in the control class. Concept understanding was measured by a concept understanding test on the multiple integral lesson. The test results were checked with prerequisite tests and found to be normally distributed and homogeneous, and the hypothesis was then examined with a t-test. The results show a difference in concept understanding between the experimental class and the control class: the concept understanding of students taught using the problem-solving strategy with online feedback was higher than that of students taught with the conventional model, with an average score of 72.10 for the experimental class and 52.27 for the control class.
Bellido-Zanin, Gloria; Vázquez-Morejón, Antonio J; Pérez-San-Gregorio, Maria Ángeles; Martín-Rodríguez, Agustín
2017-10-01
Mental health models proposed for predicting greater use of mental health resources by patients with severe mental illness include an increasingly wide variety of predictor variables, but many more remain to be explored for a complete model. The purpose of this study was to examine the relationship between two variables, behaviour problems and burden of care, and the use of mental health resources in patients with severe mental illness. Our hypothesis was that perceived burden of care mediates between the behaviour problems of patients with serious mental illness and the use of mental health resources. The Behaviour Problem Inventory, which was filled out by the main caregiver, was used to evaluate 179 patients cared for in a community mental health unit. They also answered a questionnaire on perceived family burden. A structural equation analysis was done to test our hypothesis. The results showed that both behaviour problems and perceived burden of care are good predictors of the use of mental health resources, with perceived burden of care mediating between behaviour problems and use of resources. These variables seem to be relevant for inclusion in complete models for predicting the use of mental health resources. Copyright © 2017 Elsevier B.V. All rights reserved.
Life shocks and crime: a test of the "turning point" hypothesis.
Corman, Hope; Noonan, Kelly; Reichman, Nancy E; Schwartz-Soicher, Ofira
2011-08-01
Other researchers have posited that important events in men's lives, such as employment, marriage, and parenthood, strengthen their social ties and lead them to refrain from crime. A challenge in empirically testing this hypothesis has been the issue of self-selection into life transitions. This study contributes to this literature by estimating the effects of an exogenous life shock on crime. We use data from the Fragile Families and Child Wellbeing Study, augmented with information from hospital medical records, to estimate the effects of the birth of a child with a severe health problem on the likelihood that the infant's father engages in illegal activities. We conduct a number of auxiliary analyses to examine exogeneity assumptions. We find that having an infant born with a severe health condition increases the likelihood that the father is convicted of a crime in the three-year period following the birth of the child, and at least part of the effect appears to operate through work and changes in parental relationships. These results provide evidence that life events can cause crime and, as such, support the "turning point" hypothesis.
Innovation in Health Care Delivery.
Sharan, Alok D; Schroeder, Gregory D; West, Michael E; Vaccaro, Alexander R
2016-02-01
As reimbursement transitions from a volume-based to a value-based system, innovation in health care delivery will be needed. The process of innovation begins with framing the problem that needs to be solved along with the strategic vision that has to be achieved. Similar to scientific testing, a hypothesis is generated for a new solution to a problem. Innovation requires conducting a disciplined form of experimentation and then learning from the process. This manuscript will discuss the different types of innovation, and the key steps necessary for successful innovation in the health care field.
[Screening for psychiatric risk factors in a facial trauma patients. Validating a questionnaire].
Foletti, J M; Bruneau, S; Farisse, J; Thiery, G; Chossegros, C; Guyot, L
2014-12-01
We recorded similarities between patients managed in the psychiatry department and in the maxillo-facial surgical unit. Our hypothesis was that some psychiatric conditions act as risk factors for facial trauma. We had for aim to test our hypothesis and to validate a simple and efficient questionnaire to identify these psychiatric disorders. Fifty-eight consenting patients with facial trauma, recruited prospectively in the 3 maxillo-facial surgery departments of the Marseille area during 3 months (December 2012-March 2013) completed a self-questionnaire based on the French version of 3 validated screening tests (Self Reported Psychopathy test, Rapid Alcohol Problem Screening test quantity-frequency, and Personal Health Questionnaire). This preliminary study confirmed that psychiatric conditions detected by our questionnaire, namely alcohol abuse and dependence, substance abuse, and depression, were risk factors for facial trauma. Maxillo-facial surgeons are often unaware of psychiatric disorders that may be the cause of facial trauma. The self-screening test we propose allows documenting the psychiatric history of patients and implementing earlier psychiatric care. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code
NASA Astrophysics Data System (ADS)
Marinkovic, Slavica; Guillemot, Christine
2006-12-01
Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as a multiple-hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-squares sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.
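A heavily hedged toy sketch of the general idea: with an oversampled (tall) frame, the projection of the received vector onto the left null space of the frame (the "syndrome") depends only on the impulse errors, so candidate error positions can be compared by their least-squares residuals. The random frame, the single-error assumption, and all names below are illustrative; this is not the paper's OFB or SISO decoder.

```python
# Toy syndrome-based error localization for an oversampled frame expansion.
import numpy as np

rng = np.random.default_rng(2)
k, n = 4, 8                                  # signal dimension, frame size (oversampled)
F = rng.normal(size=(n, k))                  # frame operator (n > k)
G = np.linalg.svd(F)[0][:, k:].T             # rows span the left null space: G @ F = 0

x = rng.normal(size=k)                       # message
e = np.zeros(n); e[5] = 3.0                  # one impulse error at position 5
r = F @ x + e                                # received coefficients

s = G @ r                                    # syndrome depends only on the error: s = G @ e

def residual(pos):
    # Least-squares fit of a single error at `pos`; small residual = plausible hypothesis.
    amp, res, *_ = np.linalg.lstsq(G[:, [pos]], s, rcond=None)
    return float(res[0]) if res.size else 0.0

best = min(range(n), key=residual)
amp_hat = np.linalg.lstsq(G[:, [best]], s, rcond=None)[0][0]
print("estimated error position:", best, "amplitude:", round(float(amp_hat), 2))
```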
Taking a systems approach to ecological systems
Grace, James B.
2015-01-01
Increasingly, there is interest in a systems-level understanding of ecological problems, which requires the evaluation of more complex, causal hypotheses. In this issue of the Journal of Vegetation Science, Soliveres et al. use structural equation modeling to test a causal network hypothesis about how tree canopies affect understorey communities. Historical analysis suggests structural equation modeling has been under-utilized in ecology.
The Theory of a Free Jet of a Compressible Gas
NASA Technical Reports Server (NTRS)
Abramovich, G. N.
1944-01-01
In the present report the theory of free turbulence propagation and the boundary layer theory are developed for a plane-parallel free stream of a compressible fluid. In constructing the theory use was made of the turbulence hypothesis by Taylor (transport of vorticity) which gives best agreement with test results for problems involving heat transfer in free jets.
USDA-ARS?s Scientific Manuscript database
Cardiovascular disease (CVD) and osteoporosis are 2 major public health problems that share common pathophysiological mechanisms. It is possible that strategies to reduce CVD risk may also benefit bone health. We tested the hypothesis that adherence to the 2006 American Heart Association Diet and Li...
ERIC Educational Resources Information Center
Elliott, Timothy R.; And Others
1996-01-01
Tested hypothesis that higher levels of positive affect and lower levels of negative affect would predict depression during pregnancy and in the postpartum period. Analysis of 100 women indicated that women at risk for depression during pregnancy and in the postpartum period may exhibit heightened negative moods and a dearth of positive affective…
ERIC Educational Resources Information Center
Rossell, Christine H.; Hawley, Willis D.
By examining the attitudes and perceptions of 1625 fifth grade students in North Carolina, this study tested the hypothesis that the way teachers treat their students can have an effect on their political attitudes. It was found that when teachers treat students fairly and show interest in their ideas and problems, students are less politically…
ERIC Educational Resources Information Center
Love, Edwin; Stelling, Pete
2012-01-01
The reaction that occurs when Mentos are added to bottled soft drinks has become a staple demonstration in earth science courses to explain how volcanoes erupt. This paper presents how this engaging exercise can be used in a marketing research course to provide hands-on experience with problem formation, hypothesis testing, and causal research. A…
ERIC Educational Resources Information Center
McNeil, Nicole M.; Chesney, Dana L.; Matthews, Percival G.; Fyfe, Emily R.; Petersen, Lori A.; Dunwiddie, April E.; Wheeler, Mary C.
2012-01-01
This experiment tested the hypothesis that organizing arithmetic fact practice by equivalent values facilitates children's understanding of math equivalence. Children (M age = 8 years 6 months, N = 104) were randomly assigned to 1 of 3 practice conditions: (a) equivalent values, in which problems were grouped by equivalent sums (e.g., 3 + 4 = 7, 2…
Children's understanding of death as the cessation of agency: a test using sleep versus death.
Barrett, H Clark; Behne, Tanya
2005-06-01
An important problem faced by children is discriminating between entities capable of goal-directed action, i.e. intentional agents, and non-agents. In the case of discriminating between living and dead animals, including humans, this problem is particularly difficult, because of the large number of perceptual cues that living and dead animals share. However, there are potential costs of failing to discriminate between living and dead animals, including unnecessary vigilance and lost opportunities from failing to realize that an animal, such as an animal killed for food, is dead. This might have led to the evolution of mechanisms specifically for distinguishing between living and dead animals in terms of their ability to act. Here we test this hypothesis by examining patterns of inferences about sleeping and dead organisms by Shuar and German children between 3 and 5 years old. The results show that by age 4, causal cues to death block agency attributions to animals and people, whereas cues to sleep do not. The developmental trajectory of this pattern of inferences is identical across cultures, consistent with the hypothesis of a living/dead discrimination mechanism as a reliably developing part of core cognitive architecture.
Elaboration of the Environmental Stress Hypothesis–Results from a Population-Based 6-Year Follow-Up
Wagner, Matthias; Jekauc, Darko; Worth, Annette; Woll, Alexander
2016-01-01
The aim of this paper was to contribute to the elaboration of the Environmental Stress Hypothesis framework by testing eight hypotheses addressing the direct impact of gross motor coordination problems in elementary school on selected physical, behavioral and psychosocial outcomes in adolescence. Results are based on a longitudinal sample of 940 participants who were (i) recruited as part of a population-based representative survey on health, physical fitness and physical activity in childhood and adolescence, (ii) assessed twice within 6 years, between the ages of 6 and 10 years and between the ages of 12 and 16 years (Response Rate: 55.9%) and (iii) classified as having gross motor coordination problems (N = 115) or having no gross motor coordination problems (N = 825) at baseline. Motor tests from the Körperkoordinationstest, measures of weight and height, a validated physical activity questionnaire as well as the Strength and Difficulties Questionnaire were administered. Data were analyzed by use of binary logistic regressions. Results indicated that elementary-school children with gross motor coordination problems show a higher risk of persistent gross motor coordination problems (OR = 7.99, p < 0.001), avoiding organized physical activities (OR = 1.53, p < 0.05), an elevated body mass (OR = 1.78, p < 0.05), bonding with sedentary peers (OR = 1.84, p < 0.01) as well as emotional (OR = 1.73, p < 0.05) and conduct (OR = 1.79, p < 0.05) problems in adolescence in comparison to elementary-school children without gross motor coordination problems. However, elementary-school children with gross motor coordination problems did not show a significantly higher risk of peer problems (OR = 1.35, p = 0.164) or diminished prosocial behavior (OR = 1.90, p = 0.168) in adolescence compared with elementary-school children without gross motor coordination problems. This study is the first to provide population-based longitudinal data ranging from childhood to adolescence in the context of the Environmental Stress Hypothesis, which represents substantial methodological progress. In summary, gross motor coordination problems represent a serious issue for a healthy transition from childhood to adolescence, which supports the case for early movement interventions. PMID:28018254
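A hedged illustration of how odds ratios such as those reported above (e.g., OR = 1.78 for elevated body mass) are obtained from a binary logistic regression: exponentiate the fitted coefficients. The simulated data, coefficients, and variable names are assumptions for demonstration only, not the study's data.

```python
# Sketch: odds ratios from a binary logistic regression (toy data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 940
df = pd.DataFrame({
    "motor_problems": rng.integers(0, 2, n),   # 1 = gross motor coordination problems (toy)
    "sex": rng.integers(0, 2, n),
})
# Simulate an elevated-body-mass outcome with higher odds for the exposed group.
linpred = -1.0 + 0.6 * df["motor_problems"] + 0.1 * df["sex"]
df["elevated_bmi"] = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

model = smf.logit("elevated_bmi ~ motor_problems + sex", data=df).fit(disp=0)
odds_ratios = np.exp(model.params)             # OR > 1 indicates higher risk for the exposed group
print(odds_ratios.round(2))
```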
Seeing the conflict: an attentional account of reasoning errors.
Mata, André; Ferreira, Mário B; Voss, Andreas; Kollei, Tanja
2017-12-01
In judgment and reasoning, intuition and deliberation can agree on the same responses, or they can be in conflict and suggest different responses. Incorrect responses to conflict problems have traditionally been interpreted as a sign of faulty problem solving, that is, an inability to solve the conflict. However, such errors might emerge earlier, from insufficient attention to the conflict. To test this attentional hypothesis, we manipulated the conflict in reasoning problems and used eye-tracking to measure attention. Across several measures, correct responders paid more attention than incorrect responders to conflict problems, and they discriminated between conflict and no-conflict problems better than incorrect responders. These results are consistent with a two-stage account of reasoning, whereby sound problem solving in the second stage can only lead to accurate responses when sufficient attention is paid in the first stage.
Carroll, Raymond J; Delaigle, Aurore; Hall, Peter
2011-03-01
In many applications we can expect that, or are interested to know if, a density function or a regression curve satisfies some specific shape constraints. For example, when the explanatory variable, X, represents the value taken by a treatment or dosage, the conditional mean of the response, Y , is often anticipated to be a monotone function of X. Indeed, if this regression mean is not monotone (in the appropriate direction) then the medical or commercial value of the treatment is likely to be significantly curtailed, at least for values of X that lie beyond the point at which monotonicity fails. In the case of a density, common shape constraints include log-concavity and unimodality. If we can correctly guess the shape of a curve, then nonparametric estimators can be improved by taking this information into account. Addressing such problems requires a method for testing the hypothesis that the curve of interest satisfies a shape constraint, and, if the conclusion of the test is positive, a technique for estimating the curve subject to the constraint. Nonparametric methodology for solving these problems already exists, but only in cases where the covariates are observed precisely. However in many problems, data can only be observed with measurement errors, and the methods employed in the error-free case typically do not carry over to this error context. In this paper we develop a novel approach to hypothesis testing and function estimation under shape constraints, which is valid in the context of measurement errors. Our method is based on tilting an estimator of the density or the regression mean until it satisfies the shape constraint, and we take as our test statistic the distance through which it is tilted. Bootstrap methods are used to calibrate the test. The constrained curve estimators that we develop are also based on tilting, and in that context our work has points of contact with methodology in the error-free case.
Adaptive seamless designs: selection and prospective testing of hypotheses.
Jennison, Christopher; Turnbull, Bruce W
2007-01-01
There is a current trend towards clinical protocols which involve an initial "selection" phase followed by a hypothesis testing phase. The selection phase may involve a choice between competing treatments or different dose levels of a drug, between different target populations, between different endpoints, or between a superiority and a non-inferiority hypothesis. Clearly there can be benefits in elapsed time and economy in organizational effort if both phases can be designed up front as one experiment, with little downtime between phases. Adaptive designs have been proposed as a way to handle these selection/testing problems. They offer flexibility and allow final inferences to depend on data from both phases, while maintaining control of overall false positive rates. We review and critique the methods, give worked examples and discuss the efficiency of adaptive designs relative to more conventional procedures. Where gains are possible using the adaptive approach, a variety of logistical, operational, data handling and other practical difficulties remain to be overcome if adaptive, seamless designs are to be effectively implemented.
Insecure attachment is associated with math anxiety in middle childhood.
Bosmans, Guy; De Smedt, Bert
2015-01-01
Children's anxiety for situations requiring mathematical problem solving, a concept referred to as math anxiety, has a unique and detrimental impact on concurrent and long-term mathematics achievement and life success. Little is known about the factors that contribute to the emergence of math anxiety. The current study builds on the hypothesis that math anxiety might reflect a maladaptive affect regulation mechanism that is characteristic of insecure attachment relationships. To test this hypothesis, 87 primary school children (M age = 10.34 years; SD age = 0.63) filled out questionnaires measuring insecure attachment and math anxiety. They all completed a timed and untimed standardized test of mathematics achievement. Our data revealed that individual differences in math anxiety were significantly related to insecure attachment, independent of age, sex, and IQ. Both tests of mathematics achievement were associated with insecure attachment and this effect was mediated by math anxiety. This study is the first to indicate that math anxiety might develop in the context of insecure parent-child attachment relationships.
Multispectral processing based on groups of resolution elements
NASA Technical Reports Server (NTRS)
Richardson, W.; Gleason, J. M.
1975-01-01
Several nine-point rules are defined and compared with previously studied rules. One of the rules performed well in boundary areas, but with reduced efficiency in field interiors; another combined best performance on field interiors with good sensitivity to boundary detail. The basic threshold gradient and some modifications were investigated as a means of boundary point detection. The hypothesis testing methods of closed-boundary formation were also tested and evaluated. An analysis of the boundary detection problem was initiated, employing statistical signal detection and parameter estimation techniques to analyze various formulations of the problem. These formulations permit the atmospheric and sensor system effects on the data to be thoroughly analyzed. Various boundary features and necessary assumptions can also be investigated in this manner.
Sedek, G; Kofta, M
1990-04-01
This study tested a new information-processing explanation of learned helplessness that proposes that an uncontrollable situation produces helplessness symptoms because it is a source of inconsistent, self-contradictory task information during problem-solving attempts. The flow of such information makes hypothesis-testing activity futile. Prolonged and inefficient activity of this kind leads in turn to the emergence of a state of cognitive exhaustion, with accompanying performance deficits. In 3 experiments, Ss underwent informational helplessness training (IHT): They were sequentially exposed to inconsistent task information during discrimination problems. As predicted, IHT was associated with subjective symptoms of irreducible uncertainty and resulted in (a) performance deterioration on subsequent avoidance learning, (b) heightened negative mood, and (c) subjective symptoms of cognitive exhaustion.
Explorations in statistics: hypothesis tests and P values.
Curran-Everett, Douglas
2009-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of Explorations in Statistics delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what we observe in the experiment to what we expect to see if the null hypothesis is true. The P value associated with the magnitude of that test statistic answers this question: if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got? Although statisticians continue to stress the limitations of hypothesis tests, there are two realities we must acknowledge: hypothesis tests are ingrained within science, and the simple test of a null hypothesis can be useful. As a result, it behooves us to explore the notions of hypothesis tests, test statistics, and P values.
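A numerical illustration of the definition given above: simulate the test statistic under the null hypothesis and ask what proportion of its values are at least as extreme as the one observed. The sample values are invented and the simulation is only a teaching sketch, not material from the article.

```python
# Sketch: the P value as "how extreme is my statistic under the null?"
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
sample = np.array([10.8, 11.2, 10.9, 11.5, 11.1, 10.7, 11.3, 11.0])
mu0 = 10.5                                              # null hypothesis: population mean = 10.5

t_obs = stats.ttest_1samp(sample, mu0).statistic        # observed test statistic

# Simulate the null: draw samples with mean mu0 and the observed spread.
t_null = np.array([
    stats.ttest_1samp(rng.normal(mu0, sample.std(ddof=1), sample.size), mu0).statistic
    for _ in range(20000)
])
p_sim = np.mean(np.abs(t_null) >= abs(t_obs))           # two-sided, straight from the definition
print(round(float(t_obs), 2), round(float(p_sim), 4))
print(round(stats.ttest_1samp(sample, mu0).pvalue, 4))  # closed-form P value for comparison
```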
Teaching the Pressure-Flow Hypothesis of Phloem Transport in a Problem-Solving Session
ERIC Educational Resources Information Center
Clifford, Paul
2004-01-01
Problem solving is an ideal learning strategy, especially for topics that are perceived as difficult to teach. As an example, a format is described for a problem-solving session designed to help students understand the pressure-flow hypothesis of phloem transport in plants. Five key facts and their discussion can lead to the conclusion that a…
NASA Astrophysics Data System (ADS)
Saleh, H.; Suryadi, D.; Dahlan, J. A.
2018-01-01
The aim of this research was to find out whether a 7E learning cycle under the hypnoteaching model can enhance students' mathematical problem-solving skill. This quasi-experimental study used a pretest-posttest control group design with two groups of participants: the experimental group was taught with the 7E learning cycle under the hypnoteaching model, while the control group was taught with a conventional model. The population was the students of the mathematics education program at a university in Tangerang. The statistical analyses used to test the hypotheses were the t-test and the Mann-Whitney U test. The results show that: (1) the achievement in mathematical problem-solving skill of students who received the 7E learning cycle under the hypnoteaching model is higher than that of students who received the conventional model; and (2) there are differences in students' enhancement of mathematical problem-solving skill based on their prior mathematical knowledge (PMK) category (high, middle, and low).
Do men believe that physically attractive women are more healthy and capable of having children?
Mathes, Eugene W; Arms, Clarissa; Bryant, Alicia; Fields, Jeni; Witowski, Aggie
2005-06-01
The purpose of this research was to test the hypothesis that men view physical attractiveness as an index of a woman's health and her capacity to have children. 21 men and 26 women from an introductory psychology course were shown photographs from 1972 of men and women college students, judged in 2002 to be attractive or unattractive. Subjects were asked to rate the photographed individuals' current health, the probability that they were married, the probability that they had children, and whether they had reproductive problems. The hypothesis was generally supported; the men rated the photographs of attractive women as healthier, more likely to be married, and more likely to have children.
The evolution of intelligence in mammalian carnivores.
Holekamp, Kay E; Benson-Amram, Sarah
2017-06-06
Although intelligence should theoretically evolve to help animals solve specific types of problems posed by the environment, it is unclear which environmental challenges favour enhanced cognition, or how general intelligence evolves along with domain-specific cognitive abilities. The social intelligence hypothesis posits that big brains and great intelligence have evolved to cope with the labile behaviour of group mates. We have exploited the remarkable convergence in social complexity between cercopithecine primates and spotted hyaenas to test predictions of the social intelligence hypothesis in regard to both cognition and brain size. Behavioural data indicate that there has been considerable convergence between primates and hyaenas with respect to their social cognitive abilities. Moreover, compared with other hyaena species, spotted hyaenas have larger brains and expanded frontal cortex, as predicted by the social intelligence hypothesis. However, broader comparative study suggests that domain-general intelligence in carnivores probably did not evolve in response to selection pressures imposed specifically in the social domain. The cognitive buffer hypothesis, which suggests that general intelligence evolves to help animals cope with novel or changing environments, appears to offer a more robust explanation for general intelligence in carnivores than any hypothesis invoking selection pressures imposed strictly by sociality or foraging demands.
Kristofferzon, Marja-Leena; Engström, Maria; Nilsson, Annika
2018-07-01
The aim of the present study was to investigate relationships between sense of coherence, emotion-focused coping, problem-focused coping, coping efficiency, and mental quality of life (QoL) in patients with chronic illness. A model based on Lazarus' and Folkman's stress and coping theory tested the specific hypothesis: Sense of coherence has a direct and indirect effect on mental QoL mediated by emotion-focused coping, problem-focused coping, and coping efficiency in serial, adjusted for age, gender, educational level, comorbidity, and economic status. The study used a cross-sectional and correlational design. Patients (n = 292) with chronic diseases (chronic heart failure, end-stage renal disease, multiple sclerosis, stroke, and Parkinson's disease) completed three questionnaires and provided background data. Data were collected in 2012, and a serial multiple mediator model was tested using the PROCESS macro for SPSS. The test of the conceptual model confirmed the hypothesis. There was a significant direct and indirect effect of sense of coherence on mental QoL through the three mediators. The model explained 39% of the variance in mental QoL. Self-perceived effective coping strategies are the most important mediating factors between sense of coherence and QoL in patients with chronic illness, which supports Lazarus' and Folkman's stress and coping theory.
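A hedged, simplified sketch of the indirect-effect logic behind a PROCESS-style mediation analysis. The study above used three mediators in serial; the example below uses a single mediator with bootstrapped confidence intervals, on simulated data, purely to show the product-of-coefficients idea. All names and coefficients are assumptions.

```python
# Single-mediator indirect effect (a*b) with a bootstrap CI, toy data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 292
soc = rng.normal(size=n)                              # sense of coherence (X), simulated
coping = 0.5 * soc + rng.normal(size=n)               # coping efficiency (mediator M), simulated
qol = 0.3 * soc + 0.4 * coping + rng.normal(size=n)   # mental QoL (Y), simulated

def indirect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                        # X -> M path
    b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[2]  # M -> Y path, given X
    return a * b

boot = [indirect(*(arr[idx] for arr in (soc, coping, qol)))
        for idx in (rng.integers(0, n, n) for _ in range(2000))]
lo, hi = np.percentile(boot, [2.5, 97.5])
print("indirect effect:", round(float(indirect(soc, coping, qol)), 3),
      "95% bootstrap CI:", (round(float(lo), 3), round(float(hi), 3)))
```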
Effendy, Christantie; Vissers, Kris; Osse, Bart H P; Tejawinata, Sunaryadi; Vernooij-Dassen, Myrra; Engels, Yvonne
2015-06-01
Patients with advanced cancer experience problems and unmet needs. However, we assume that patients with advanced cancer will have more problems and unmet needs in a country with a lower economic status than in an economically stronger country. We studied whether patients with advanced cancer in Indonesia have more problems and unmet needs than a similar group of patients in the Netherlands. We performed a cross-sectional survey. We compared the data for 180 Indonesian and 94 Dutch patients relating to 24 items of the Problems and Needs in Palliative Care-short version questionnaire. We performed descriptive and χ² analyses with Bonferroni correction. The prevalence of most physical problems, including pain, was similar in the 2 groups. In Indonesia, financial problems were the most common: 70 to 80% vs. 30 to 42% in the Netherlands. In Indonesia, 25 to 50% of the patients reported psychological and autonomy problems versus 55 to 86% in the Netherlands. The Indonesian group had many more unmet needs for each problem (> 54%) than the Dutch group (< 35%). Apparently, economic and cultural differences hardly influence physical problems. Nonetheless, fewer Indonesian patients reported psychological and autonomy problems than Dutch patients. This difference contradicts our hypothesis. However, we found more unmet needs for professional attention in Indonesia than in the Netherlands, which is compatible with our hypothesis. These simple comparative data provide interesting insights into problems and unmet needs and give rise to our new hypothesis about cultural influences. This hypothesis should be studied in more depth. © 2014 World Institute of Pain.
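A hedged sketch of the item-wise comparison described above: a 2x2 chi-square test per questionnaire item, judged against a Bonferroni-adjusted alpha for 24 items. The counts below are invented for demonstration and are not the study's data.

```python
# Sketch: per-item 2x2 chi-square tests with a Bonferroni-adjusted alpha.
from scipy.stats import chi2_contingency

alpha = 0.05 / 24                      # Bonferroni correction for 24 items

# rows: Indonesia (n=180), Netherlands (n=94); columns: problem present / absent (toy counts)
items = {
    "financial problems": [[135, 45], [34, 60]],
    "pain":               [[120, 60], [62, 32]],
}
for name, table in items.items():
    chi2, p, dof, _ = chi2_contingency(table)
    flag = "significant" if p < alpha else "not significant"
    print(f"{name}: chi2={chi2:.2f}, p={p:.4f} ({flag} at Bonferroni alpha={alpha:.4f})")
```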
The Scientific Method and the Creative Process: Implications for the K-6 Classroom
ERIC Educational Resources Information Center
Nichols, Amanda J.; Stephens, April H.
2013-01-01
Science and the arts might seem very different, but the processes that both fields use are very similar. The scientific method is a way to explore a problem, form and test a hypothesis, and answer questions. The creative process creates, interprets, and expresses art. Inquiry is at the heart of both of these methods. The purpose of this article is…
Carbon dioxide efflux from a 550 m³ soil across a range of soil temperatures
Ramesh Murthy; Kevin L. Griffin; Stanley J. Zarnoch; Philip M. Dougherty; Barbara Watson; Joost Van Haren; Randy L. Patterson; Tilka Mahato
2003-01-01
Because of scaling problems, point measurements of soil CO2 efflux on a small volume of soil may not necessarily reflect an overall community response. The aim of this study was to test this hypothesis in the Biosphere 2 facility and achieve the following broad goals: (1) investigate soil net CO2 exchange–temperature...
Case Studies of Predictive Analysis Applications in Law Enforcement
2015-12-01
analysis to crime problems to be considered effective. To test this hypothesis, case studies were conducted on municipal police departments. The case studies are described, and recommendations are offered for future research as well as for police executives considering such an investment.
The computationalist reformulation of the mind-body problem.
Marchal, Bruno
2013-09-01
Computationalism, or digital mechanism, or simply mechanism, is a hypothesis in the cognitive science according to which we can be emulated by a computer without changing our private subjective feeling. We provide a weaker form of that hypothesis, weaker than the one commonly referred to in the (vast) literature and show how to recast the mind-body problem in that setting. We show that such a mechanist hypothesis does not solve the mind-body problem per se, but does help to reduce partially the mind-body problem into another problem which admits a formulation in pure arithmetic. We will explain that once we adopt the computationalist hypothesis, which is a form of mechanist assumption, we have to derive from it how our belief in the physical laws can emerge from *only* arithmetic and classical computer science. In that sense we reduce the mind-body problem to a body problem appearance in computer science, or in arithmetic. The general shape of the possible solution of that subproblem, if it exists, is shown to be closer to "Platonist or neoplatonist theology" than to the "Aristotelian theology". In Plato's theology, the physical or observable reality is only the shadow of a vaster hidden nonphysical and nonobservable, perhaps mathematical, reality. The main point is that the derivation is constructive, and it provides the technical means to derive physics from arithmetic, and this will make the computationalist hypothesis empirically testable, and thus scientific in the Popperian analysis of science. In case computationalism is wrong, the derivation leads to a procedure for measuring "our local degree of noncomputationalism". Copyright © 2013 Elsevier Ltd. All rights reserved.
Raybould, Alan
2010-01-01
The bucket and the searchlight are metaphors for opposing theories of the growth of scientific knowledge. The bucket theory proposes that knowledge is gained by observing the world without preconceptions, and that knowledge emerges from the accumulation of observations that support a hypothesis. There are many problems with this theory, the most serious of which is that it does not appear to offer a means to distinguish between the many hypotheses that could explain a particular set of observations. The searchlight theory proposes that preconceptions are unavoidable and that knowledge advances through the improvement of our preconceptions - our hypotheses - by continuous criticism and revision. A hypothesis is a searchlight that illuminates observations that test the hypothesis and reveal its flaws, and knowledge thereby increases through the elimination of false hypotheses. Research into the risks posed by the cultivation of transgenic crops often appears to apply the bucket theory; many data are produced, but knowledge of risk is not advanced. Application of the searchlight theory, whereby risk assessments test hypotheses that transgenic crops will not be harmful, seems to offer a better way to characterise risk. The effectiveness of an environmental risk assessment should not be measured by the size of the bucket of observations on a transgenic crop, but by the power of the risk hypothesis searchlights to clarify the risks that may arise from cultivation of that crop. These points are illustrated by examples of hypotheses that could be tested to assess the risks from transgenic crops and their hybrids becoming weeds or invading non-agricultural habitats. © ISBR, EDP Sciences, 2011.
Skagerlund, Kenny; Träff, Ulf
2016-01-01
This study investigated if developmental dyscalculia (DD) in children with different profiles of mathematical deficits has the same or different cognitive origins. The defective approximate number system hypothesis and the access deficit hypothesis were tested using two different groups of children with DD (11-13 years old): a group with arithmetic fact dyscalculia (AFD) and a group with general dyscalculia (GD). Several different aspects of number magnitude processing were assessed in these two groups and compared with age-matched typically achieving children. The GD group displayed weaknesses with both symbolic and nonsymbolic number processing, whereas the AFD group displayed problems only with symbolic number processing. These findings provide evidence that the origins of DD in children with different profiles of mathematical problems diverge. Children with GD have impairment in the innate approximate number system, whereas children with AFD suffer from an access deficit. These findings have implications for researchers' selection procedures when studying dyscalculia, and also for practitioners in the educational setting. © Hammill Institute on Disabilities 2014.
Watson, Paul J; Andrews, Paul W
2002-10-01
Evolutionary biologists use Darwinian theory and functional design ("reverse engineering") analyses to develop and test hypotheses about the adaptive functions of traits. Based upon a consideration of human social life and a functional design analysis of depression's core symptomatology, we offer a comprehensive theory of its adaptive significance called the Social Navigation Hypothesis (SNH). The SNH attempts to account for all intensities of depression based on standard evolutionary theories of sociality, communication and psychological pain. The SNH suggests that depression evolved to perform two complementary social problem-solving functions. First, depression induces cognitive changes that focus and enhance capacities for the accurate analysis and solution of key social problems, suggesting a social rumination function. Second, the costs associated with the anhedonia and psychomotor perturbation of depression can persuade reluctant social partners to provide help or make concessions via two possible mechanisms, namely, honest signaling and passive, unintentional fitness extortion. Thus it may also have a social motivation function.
Testing the self-medication hypothesis of depression and aggression in cannabis-dependent subjects.
Arendt, Mikkel; Rosenberg, Raben; Fjordback, Lone; Brandholdt, Jack; Foldager, Leslie; Sher, Leo; Munk-Jørgensen, Povl
2007-07-01
A self-medication hypothesis has been proposed to explain the association between cannabis use and psychiatric and behavioral problems. However, little is known about the reasons for use and reactions while intoxicated in cannabis users who suffer from depression or problems controlling violent behavior. We assessed 119 cannabis-dependent subjects using the Schedules of Clinical Assessment in Neuropsychiatry (SCAN), parts of the Addiction Severity Index (ASI), and questionnaires on reasons for cannabis use and reactions to cannabis use while intoxicated. Participants with lifetime depression and problems controlling violent behavior were compared to subjects without such problems. Validity of the groupings was corroborated by use of a psychiatric treatment register, previous use of psychotropic medication and convictions for violence. Subjects with lifetime depression used cannabis for the same reasons as others. While under the influence of cannabis, they more often experienced depression, sadness, anxiety and paranoia, and they were less likely to report happiness or euphoria. Participants reporting problems controlling violent behavior more often used cannabis to decrease aggression, decrease suspiciousness, and for relaxation; while intoxicated they more often reacted with aggression. Subjects with prior depression do not use cannabis as a means of self-medication. They are more likely to experience specific increases of adverse symptoms while under the influence of cannabis, and are less likely to experience specific symptom relief. There is some evidence that cannabis is used as a means of self-medication for problems controlling aggression.
Memory inhibition as a critical factor preventing creative problem solving.
Gómez-Ariza, Carlos J; Del Prete, Francesco; Prieto Del Val, Laura; Valle, Tania; Bajo, M Teresa; Fernandez, Angel
2017-06-01
The hypothesis that reduced accessibility to relevant information can negatively affect problem solving in a remote associate test (RAT) was tested by using, immediately before the RAT, a retrieval practice procedure to hinder access to target solutions. The results of 2 experiments clearly showed that, relative to baseline, target words that had been competitors during selective retrieval were much less likely to be provided as solutions in the RAT, demonstrating that performance in the problem-solving task was strongly influenced by the predetermined accessibility status of the solutions in memory. Importantly, this was so even when participants were unaware of the relationship between the memory and the problem-solving procedures in the experiments. This finding is consistent with an inhibitory account of retrieval-induced forgetting effects and, more generally, constitutes support for the idea that the activation status of mental representations originating in a given task (e.g., episodic memory) can unwittingly have significant consequences for a different, unrelated task (e.g., problem solving). (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Predictors of Physical Altercation among Adolescents in Residential Substance Abuse Treatment
Crawley, Rachel D.; Becan, Jennifer Edwards; Knight, Danica Kalling; Joe, George W.; Flynn, Patrick M.
2014-01-01
This study tested the hypothesis that basic social information-processing components represented by family conflict, peer aggression, and pro-aggression cognitive scripts are related to aggression and social problems among adolescents in substance abuse treatment. The sample consisted of 547 adolescents in two community-based residential facilities. Correlation results indicated that more peer aggression is related to more pro-aggression scripts; scripts, peer aggression, and family conflict are associated with social problems; and in-treatment physical altercation involvement is predicted by higher peer aggression. Findings suggest that social information-processing components are valuable for treatment research. PMID:26622072
Jiang, Hong; Chess, Leonard
2008-11-01
By discriminating self from nonself and controlling the magnitude and class of immune responses, the immune system mounts effective immunity against virtually any foreign antigens but avoids harmful immune responses to self. These are two equally important and related but distinct processes, which function in concert to ensure an optimal function of the immune system. Immunologically relevant clinical problems often occur because of failure of either process, especially the former. Currently, there is no unified conceptual framework to characterize the precise relationship between thymic negative selection and peripheral immune regulation, which is the basis for understanding self-non-self discrimination versus control of magnitude and class of immune responses. In this article, we explore a novel hypothesis of how the immune system discriminates self from nonself in the periphery during adaptive immunity. This hypothesis permits rational analysis of various seemingly unrelated biomedical problems inherent in immunologic disorders that cannot be uniformly interpreted by any currently existing paradigms. The proposed hypothesis is based on a unified conceptual framework of the "avidity model of peripheral T-cell regulation" that we originally proposed and tested, in both basic and clinical immunology, to understand how the immune system achieves self-nonself discrimination in the periphery.
Invited Commentary: The Need for Cognitive Science in Methodology.
Greenland, Sander
2017-09-15
There is no complete solution for the problem of abuse of statistics, but methodological training needs to cover cognitive biases and other psychosocial factors affecting inferences. The present paper discusses 3 common cognitive distortions: 1) dichotomania, the compulsion to perceive quantities as dichotomous even when dichotomization is unnecessary and misleading, as in inferences based on whether a P value is "statistically significant"; 2) nullism, the tendency to privilege the hypothesis of no difference or no effect when there is no scientific basis for doing so, as when testing only the null hypothesis; and 3) statistical reification, treating hypothetical data distributions and statistical models as if they reflect known physical laws rather than speculative assumptions for thought experiments. As commonly misused, null-hypothesis significance testing combines these cognitive problems to produce highly distorted interpretation and reporting of study results. Interval estimation has so far proven to be an inadequate solution because it involves dichotomization, an avenue for nullism. Sensitivity and bias analyses have been proposed to address reproducibility problems (Am J Epidemiol. 2017;186(6):646-647); these methods can indeed address reification, but they can also introduce new distortions via misleading specifications for bias parameters. P values can be reframed to lessen distortions by presenting them without reference to a cutoff, providing them for relevant alternatives to the null, and recognizing their dependence on all assumptions used in their computation; they nonetheless require rescaling for measuring evidence. I conclude that methodological development and training should go beyond coverage of mechanistic biases (e.g., confounding, selection bias, measurement error) to cover distortions of conclusions produced by statistical methods and psychosocial forces. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
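A minimal sketch of two of the reframings mentioned above, under my own assumptions: reporting P values for relevant alternatives rather than only the null, and rescaling a P value into bits of information (an S-value, -log2 p) instead of dichotomizing at a cutoff. The data and the specific choice of rescaling are illustrative, not the author's prescription.

```python
import math
from scipy import stats

x = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.4]   # hypothetical measurements

for mu0 in (0.0, 1.0, 1.5):                    # null value and two relevant alternatives
    t, p = stats.ttest_1samp(x, popmean=mu0)   # test each hypothesized mean
    s = -math.log2(p)                          # bits of information against that mean
    print(f"H: mu = {mu0:>4}, p = {p:.3f}, S = {s:.1f} bits")
```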
Coccaro, Emil F; Hirsch, Sharon L; Stein, Mark A
2007-01-15
Central dopaminergic activity is critical to the functioning of both motor and cognitive systems. Based on the therapeutic action of dopaminergic agents in treating attention deficit hyperactivity disorder (ADHD), ADHD symptoms may be related to a reduction in central dopaminergic activity. We tested the hypothesis that dopaminergic activity, as reflected by plasma homovanillic acid (pHVA), may be related to dimensional aspects of ADHD in adults. Subjects were 30 healthy volunteers and 39 personality-disordered subjects, in whom morning basal pHVA concentration and a dimensional measure of childhood ADHD symptoms (Wender Utah Rating Scale: WURS) were obtained. A significant inverse correlation was found between WURS Total score and pHVA concentration in the total sample. Among WURS factor scores, a significant inverse relationship was noted between pHVA and history of "childhood learning problems". Consistent with the dopaminergic dysfunction hypothesis of ADHD and of cognitive function, pHVA concentrations were correlated with childhood history of ADHD symptoms in general and with history of "learning problems" in non-ADHD psychiatric patients and controls. Replication is needed in treated and untreated ADHD samples to confirm these initial results.
Mancini, Vincent O; Rigoli, Daniela; Roberts, Lynne D; Heritage, Brody; Piek, Jan P
2017-09-08
The elaborated environmental stress hypothesis (EESH) provides a framework that describes how motor skills may indirectly cause internalizing problems through various mediating psychosocial factors. While there is evidence to support this framework, little is known about how the proposed relationships may vary across different stages of development. This study aimed to investigate whether peer problems and perceived self-competence mediated the relationship between motor skills and internalizing problems in pre-primary children, and at 18-month follow up. A community sample of 197 pre-primary school children (M = 5.40 years, SD = 0.30 years; 102 males, 95 females) participated at Time 1, with 107 completing the Time 2 follow-up. Standardized instruments were used to measure motor skills and verbal IQ. Perceived self-competence was measured using a self-report measure. Participant peer problems and internalizing problems were measured using teacher report. Age, gender, and verbal IQ were included as covariates. Mediation analysis using PROCESS showed that the relationship between motor skills and internalizing problems was mediated by peer problems at Time 1. At Time 2, the relationship was mediated by peer problems and perceived physical competence. The current results indicate the EESH may function differently across different periods of development. The transition from pre-primary to Grade 1 represents a time of important cognitive and psychosocial development, which has implications for how the relationship between motor skills and internalizing problems can be understood. These findings highlight potential age-appropriate targets for psychomotor interventions aiming to improve the emotional well-being of young children. © 2017 The British Psychological Society.
Developing the research hypothesis.
Toledo, Alexander H; Flikkema, Robert; Toledo-Pereyra, Luis H
2011-01-01
The research hypothesis is needed for a sound and well-developed research study. The research hypothesis contributes to the solution of the research problem. Types of research hypotheses include inductive and deductive, directional and non-directional, and null and alternative hypotheses. Rejecting the null hypothesis and accepting the alternative hypothesis is the basis for building a good research study. This work reviews the most important aspects of organizing and establishing an efficient and complete hypothesis.
Life Shocks and Crime: A Test of the “Turning Point” Hypothesis
Noonan, Kelly; Reichman, Nancy E.; Schwartz-Soicher, Ofira
2012-01-01
Other researchers have posited that important events in men’s lives—such as employment, marriage, and parenthood—strengthen their social ties and lead them to refrain from crime. A challenge in empirically testing this hypothesis has been the issue of self-selection into life transitions. This study contributes to this literature by estimating the effects of an exogenous life shock on crime. We use data from the Fragile Families and Child Wellbeing Study, augmented with information from hospital medical records, to estimate the effects of the birth of a child with a severe health problem on the likelihood that the infant’s father engages in illegal activities. We conduct a number of auxiliary analyses to examine exogeneity assumptions. We find that having an infant born with a severe health condition increases the likelihood that the father is convicted of a crime in the three-year period following the birth of the child, and at least part of the effect appears to operate through work and changes in parental relationships. These results provide evidence that life events can cause crime and, as such, support the “turning point” hypothesis. PMID:21660628
The PMHT: solutions for some of its problems
NASA Astrophysics Data System (ADS)
Wieneke, Monika; Koch, Wolfgang
2007-09-01
Tracking multiple targets in a cluttered environment is a challenging task. Probabilistic Multiple Hypothesis Tracking (PMHT) is an efficient approach for dealing with it. Essentially, PMHT is based on the method of Expectation-Maximization for handling association conflicts. Linearity in the number of targets and measurements is the main motivation for a further development and extension of this methodology. Unfortunately, compared with the Probabilistic Data Association Filter (PDAF), PMHT has not yet shown its superiority in terms of track-lost statistics. Furthermore, the problem of track extraction and deletion is apparently not yet satisfactorily solved within this framework. Four properties of PMHT are responsible for its problems in track maintenance: Non-Adaptivity, Hospitality, Narcissism and Local Maxima [1, 2]. In this work we present a solution for each of them and derive an improved PMHT by integrating the solutions into the PMHT formalism. The new PMHT is evaluated by Monte-Carlo simulations. A sequential Likelihood-Ratio (LR) test for track extraction has been developed and already integrated into the framework of traditional Bayesian Multiple Hypothesis Tracking [3]. As a multi-scan approach, the PMHT methodology also has the potential for track extraction. In this paper an analogous integration of a sequential LR test into the PMHT framework is proposed. We present an LR formula for track extraction and deletion using the PMHT update formulae. As PMHT provides all required ingredients for a sequential LR calculation, the LR is thus a by-product of the PMHT iteration process. Therefore the resulting update formula for the sequential LR test affords the development of Track-Before-Detect algorithms for PMHT. The approach is illustrated by a simple example.
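The paper's sequential LR test is specific to the PMHT update formulae; as a generic point of reference only, the sketch below runs Wald's classical sequential likelihood-ratio test on simulated Gaussian scores, with thresholds set from assumed error rates. It is not the PMHT formula.

```python
# Generic sequential likelihood-ratio test (Wald's SPRT) on simulated data:
# accumulate the log-LR per scan and stop when it crosses a confirmation or deletion threshold.
import math
import random

alpha, beta = 0.01, 0.05                 # assumed false-extraction and missed-track rates
upper = math.log((1 - beta) / alpha)     # cross upward: confirm "target present"
lower = math.log(beta / (1 - alpha))     # cross downward: delete the tentative track

def loglr(z, mu1=1.0, mu0=0.0, sigma=1.0):
    """Log-likelihood ratio of one measurement under target vs. clutter hypotheses."""
    return ((z - mu0) ** 2 - (z - mu1) ** 2) / (2 * sigma ** 2)

random.seed(0)
llr = 0.0
for k in range(1, 200):
    z = random.gauss(1.0, 1.0)           # simulate data from the "target" hypothesis
    llr += loglr(z)
    if llr >= upper:
        print(f"track confirmed after {k} scans"); break
    if llr <= lower:
        print(f"track deleted after {k} scans"); break
```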
Examination of the Eyberg Child Behavior Inventory Discrepancy Hypothesis
ERIC Educational Resources Information Center
Butler, Ashley M.; Brestan, Elizabeth V.; Eyberg, Sheila M.
2008-01-01
This study examined the Eyberg Child Behavior Inventory (ECBI) "discrepancy hypothesis", which asserts that a discrepancy in score elevations on the ECBI Intensity and Problem Scales is related to problematic parenting styles. The Intensity Scale measures the frequency of child disruptive behavior, and the Problem Scale measures parent…
Arnett, Anne B; Pennington, Bruce F; Young, Jami F; Hankin, Benjamin L
2016-04-01
The onset of hyperactivity/impulsivity and attention problems (HAP) is typically younger than that of conduct problems (CP), and some research supports a directional relation wherein HAP precedes CP. Studies have tested this theory using between-person and between-group comparisons, with conflicting results. In contrast, prior research has not examined the effects of within-person fluctuations in HAP on CP. This study tested the hypothesis that within-person variation in HAP would positively predict subsequent within-person variation in CP, in two population samples of youth (N = 620) who participated in identical methods of assessment over the course of 30 months. Three-level, hierarchical models were used to test for within-person, longitudinal associations between HAP and CP, as well as moderating effects of between-person and between-family demographics. We found a small but significant association in the expected direction for older youth, but the opposite effect in younger and non-Caucasian youth. These results were replicated across both samples. The process by which early HAP relates to later CP may vary by age and racial identity. © 2015 Association for Child and Adolescent Mental Health.
Fagerland, Morten W; Sandvik, Leiv; Mowinckel, Petter
2011-04-13
The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for the analysis and presentation of results for discrete numerical variables. Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
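A small sketch of the kind of comparison discussed above, on made-up counts with outcomes 0-3: the T test with the Welch adjustment for unequal variances next to the Wilcoxon-Mann-Whitney test. This only illustrates the tests themselves, not the paper's simulation study.

```python
# Made-up discrete numerical data (number of events per individual, outcomes 0..3).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.integers(0, 4, size=40)
group_b = rng.integers(0, 4, size=40)

t, p_welch = stats.ttest_ind(group_a, group_b, equal_var=False)         # Welch adjustment
u, p_mw = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"difference in means = {group_a.mean() - group_b.mean():.2f}")
print(f"Welch t = {t:.2f}, p = {p_welch:.3f}; Mann-Whitney U = {u:.0f}, p = {p_mw:.3f}")
```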
Seeking health information on the web: positive hypothesis testing.
Kayhan, Varol Onur
2013-04-01
The goal of this study is to investigate positive hypothesis testing among consumers of health information when they search the Web. After demonstrating the extent of positive hypothesis testing using Experiment 1, we conduct Experiment 2 to test the effectiveness of two debiasing techniques. A total of 60 undergraduate students searched a tightly controlled online database developed by the authors to test the validity of a hypothesis. The database had four abstracts that confirmed the hypothesis and three abstracts that disconfirmed it. Findings of Experiment 1 showed that the majority of participants (85%) exhibited positive hypothesis testing. In Experiment 2, we found that the recommendation technique was not effective in reducing positive hypothesis testing, since none of the participants assigned to this server could retrieve disconfirming evidence. Experiment 2 also showed that the incorporation technique successfully reduced positive hypothesis testing, since 75% of the participants could retrieve disconfirming evidence. Positive hypothesis testing on the Web is an understudied topic. More studies are needed to validate the effectiveness of the debiasing techniques discussed in this study and develop new techniques. Search engine developers should consider developing new options for users so that both confirming and disconfirming evidence can be presented in search results as users test hypotheses using search engines. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Optimal Sensor Scheduling for Multiple Hypothesis Testing
1981-09-01
...treat the more general problem [9,10]. However, two common threads connect these approaches: they obtain feedback laws mapping posterior distributions... The objective of a detection or identification algorithm is to produce correct estimates of the true state of a system. It is also beneficial if these...
Latash, M; Gottleib, G
1990-01-01
Problems of single-joint movement variability are analysed in the framework of the equilibrium-point hypothesis (the lambda-model). Control of the movements is described with three parameters related to movement amplitude, speed, and time. Three strategies emerge from this description. Only one of them is likely to lead to a Fitts' type speed-accuracy trade-off. Experiments were performed to test one of the predictions of the model. Subjects performed identical sets of single-joint fast movements with open or closed eyes and somewhat different instructions. Movements performed with closed eyes were characterized by higher peak speeds and unchanged variability, in seeming violation of Fitts' law and in good correspondence with the model.
Media and human capital development: Can video game playing make you smarter?
Suziedelyte, Agne
2015-04-01
According to the literature, video game playing can improve such cognitive skills as problem solving, abstract reasoning, and spatial logic. I test this hypothesis using The Child Development Supplement to the Panel Study of Income Dynamics. The endogeneity of video game playing is addressed by using panel data methods and controlling for an extensive list of child and family characteristics. To address the measurement error in video game playing, I instrument children's weekday time use with their weekend time use. After taking into account the endogeneity and measurement error, video game playing is found to positively affect children's problem solving ability. The effect of video game playing on problem solving ability is comparable to the effect of educational activities.
A randomized controlled trial of Moderation-Oriented Cue Exposure.
Heather, N; Brodie, J; Wale, S; Wilkinson, G; Luce, A; Webb, E; McCarthy, S
2000-07-01
A randomized controlled trial was conducted to examine the effectiveness of Moderation-Oriented Cue Exposure (MOCE) in comparison to Behavioral Self-Control Training (BSCT). The main hypothesis was that MOCE would be more effective than BSCT among a sample of problem drinkers aiming at moderate drinking. A subsidiary hypothesis was that MOCE would be relatively more effective than BSCT among problem drinkers with higher levels of alcohol dependence. Clients (N = 91; 75% men) were randomly allocated to either MOCE or BSCT. Treatment was delivered in weekly sessions by two trained therapists, in a nested design in which therapists switched to the alternative treatment modality approximately halfway through the trial. Follow-up was carried out 6 months following posttreatment assessment, with 85% successful contact. There was no evidence for the general superiority of MOCE over BSCT. The subsidiary hypothesis was not confirmed. A subsample of clients (n = 14) showing levels of dependence at baseline above the commonly accepted cut-point for a moderation goal (Severity of Alcohol Dependence Questionnaire [SADQ] > 29) showed outcomes at least as favorable as those below the cut-point. The validity of self-reports of alcohol consumption and problems was supported by significant relationships with liver function tests (gamma-glutamyl transferase and alanine transferase). These results provide no grounds for the replacement of BSCT by MOCE in routine, moderation-oriented treatment practice. Assuming they prefer it to abstinence and that it is not contra-indicated on other grounds, there seems no reason why clients showing a higher level of dependence (SADQ = 30-45) should not be offered a moderation goal.
Gender Differences in Eye Movements in Solving Text-and-Diagram Science Problems
ERIC Educational Resources Information Center
Huang, Po-Sheng; Chen, Hsueh-Chih
2016-01-01
The main purpose of this study was to examine possible gender differences in how junior high school students integrate printed texts and diagrams while solving science problems. We proposed the response style hypothesis and the spatial working memory hypothesis to explain possible gender differences in the integration process. Eye-tracking…
Working Memory Capacity and Fluid Intelligence: Maintenance and Disengagement.
Shipstead, Zach; Harrison, Tyler L; Engle, Randall W
2016-11-01
Working memory capacity and fluid intelligence have been demonstrated to be strongly correlated traits. Typically, high working memory capacity is believed to facilitate reasoning through accurate maintenance of relevant information. In this article, we present a proposal reframing this issue, such that tests of working memory capacity and fluid intelligence are seen as measuring complementary processes that facilitate complex cognition. Respectively, these are the ability to maintain access to critical information and the ability to disengage from or block outdated information. In the realm of problem solving, high working memory capacity allows a person to represent and maintain a problem accurately and stably, so that hypothesis testing can be conducted. However, as hypotheses are disproven or become untenable, disengaging from outdated problem solving attempts becomes important so that new hypotheses can be generated and tested. From this perspective, the strong correlation between working memory capacity and fluid intelligence is due not to one ability having a causal influence on the other but to separate attention-demanding mental functions that can be contrary to one another but are organized around top-down processing goals. © The Author(s) 2016.
Intra-fraction motion of the prostate is a random walk
NASA Astrophysics Data System (ADS)
Ballhausen, H.; Li, M.; Hegemann, N.-S.; Ganswindt, U.; Belka, C.
2015-01-01
A random walk model for intra-fraction motion has been proposed, where at each step the prostate moves a small amount from its current position in a random direction. Online tracking data from perineal ultrasound is used to validate or reject this model against alternatives. Intra-fraction motion of a prostate was recorded by 4D ultrasound (Elekta Clarity system) during 84 fractions of external beam radiotherapy of six patients. In total, the center of the prostate was tracked for 8 h in intervals of 4 s. Maximum likelihood model parameters were fitted to the data. The null hypothesis of a random walk was tested with the Dickey-Fuller test. The null hypothesis of stationarity was tested by the Kwiatkowski-Phillips-Schmidt-Shin test. The increase of variance in prostate position over time and the variability in motility between fractions were analyzed. Intra-fraction motion of the prostate was best described as a stochastic process with an auto-correlation coefficient of ρ = 0.92 ± 0.13. The random walk hypothesis (ρ = 1) could not be rejected (p = 0.27). The static noise hypothesis (ρ = 0) was rejected (p < 0.001). The Dickey-Fuller test rejected the null hypothesis ρ = 1 in 25% to 32% of cases. On average, the Kwiatkowski-Phillips-Schmidt-Shin test rejected the null hypothesis ρ = 0 with a probability of 93% to 96%. The variance in prostate position increased linearly over time (r2 = 0.9 ± 0.1). Variance kept increasing and did not settle at a maximum as would be expected from a stationary process. There was substantial variability in motility between fractions and patients with maximum aberrations from isocenter ranging from 0.5 mm to over 10 mm in one patient alone. In conclusion, evidence strongly suggests that intra-fraction motion of the prostate is a random walk and neither static (like inter-fraction setup errors) nor stationary (like a cyclic motion such as breathing, for example). The prostate tends to drift away from the isocenter during a fraction, and this variance increases with time, such that shorter fractions are beneficial to the problem of intra-fraction motion. As a consequence, fixed safety margins (which would over-compensate at the beginning and under-compensate at the end of a fraction) cannot optimally account for intra-fraction motion. Instead, online tracking and position correction on-the-fly should be considered as the preferred approach to counter intra-fraction motion.
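For readers unfamiliar with the two tests named above, the following sketch applies the (augmented) Dickey-Fuller and KPSS tests from statsmodels to a simulated random-walk trajectory standing in for a tracked prostate coordinate; the step size and sampling interval are assumptions, not the study's data.

```python
# Simulate a 1-D random walk (4 s steps) and test it for a unit root (ADF) and stationarity (KPSS).
import numpy as np
from statsmodels.tsa.stattools import adfuller, kpss

rng = np.random.default_rng(0)
steps = rng.normal(0.0, 0.1, size=900)      # ~1 h of 4 s steps, assumed 0.1 mm std per step
position = np.cumsum(steps)                 # random-walk trajectory (mm)

adf_stat, adf_p, *_ = adfuller(position)
kpss_stat, kpss_p, *_ = kpss(position, regression="c", nlags="auto")

print(f"ADF p = {adf_p:.3f}   (large p: cannot reject the random-walk / unit-root hypothesis)")
print(f"KPSS p = {kpss_p:.3f} (small p: reject stationarity)")
```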
Ruttenber, A J; Harrison, L T; Baron, A; McClure, D; Glanz, J; Quillin, R; O'Neill, J P; Sullivan, L; Campbell, J; Nicklas, J A
2001-01-01
The hypothesis that exposure to domestic radon raises the risk for leukemia and other nonpulmonary cancers has been proposed and tested in a number of epidemiologic studies over the past decade. During this period, interest in this hypothesis was heightened by evidence of increased frequencies of mutations at the hypoxanthine guanine phosphoribosyl transferase (hprt) gene in persons exposed to domestic radon (Bridges BA et al. [1991]: Lancet 337:1187-1189). An extension of this study (Cole J et al. [1996]: Radiat Res 145:61-69) and two independent studies (Albering HJ et al. [1992]: Lancet 340:739; Albering HJ et al. [1994]: Lancet 344:750-751) found that hprt mutant frequency was not correlated with domestic radon exposure, and two well-designed epidemiologic studies showed no evidence of a relation between radon exposure and leukemia in children or adults. In this report, we present additional data from a study of Colorado high school students showing no correlation between domestic radon exposure and hprt mutant frequency. We use reanalyses of previous studies of radon and hprt mutant frequency to identify problems with this assay as a biomarker for domestic radon exposure and to illustrate difficulties in interpreting the statistical data. We also show with analyses of combined data sets that there is no support for the hypothesis that domestic radon exposure elevates hprt mutant frequency. Taken together, the scientific evidence provides a useful example of the problems associated with analyzing and interpreting data that link environmental exposures, biomarkers, and diseases in epidemiologic studies. Copyright 2001 Wiley-Liss, Inc.
Examining the development of scientific reasoning in ninth-grade physical science students
NASA Astrophysics Data System (ADS)
Westbrook, Susan L.; Rogers, Laura N.
This study was designed to test the hypothesis that descriptive learning cycles are neither sufficient to stimulate students to reason at a formal operational level nor to encourage facility with the processes of scientific investigation. A 6-week long, three-investigation unit on simple machines drawn from a ninth-grade physical science curriculum was selected for the study. Students in the course were assigned to one of three instructional groups: descriptive group (DE), question design group (QD), and hypothesis testing group (HT). Each group completed identical exploration and invention activities. Each group participated in qualitatively distinct activities during the expansion phase. The DE students completed the activities outlined in the curriculum (a descriptive learning cycle). The QD group designed and conducted experiments to answer a question posed by the teacher. The HT group generated hypotheses concerning a problem, then designed and conducted experiments to test those hypotheses (a hypothetico-deductive expansion). The effects of the treatments were assessed in a pretest-posttest format using Lawson's Seven Logic-Tasks, the Test of Integrated Process Skills, and Lawson's Revised Classroom Test of Scientific Reasoning. Analyses of the data indicated that the HT group exhibited a significant increase on the Test of Integrated Process Skills and on Task 1 of the Seven Logic Tasks during the 6-week period.
On the stabilizing role of species diffusion in chemical enhanced oil recovery
NASA Astrophysics Data System (ADS)
Daripa, Prabir; Gin, Craig
2015-11-01
In this talk, the speaker will discuss a stability analysis problem concerning the effect of species diffusion on the stabilization of fingering in a Hele-Shaw model of chemical enhanced oil recovery. The formulation of the problem is motivated by a specific design principle for the immiscible interfaces, in the hope that this will lead to significant stabilization of interfacial instabilities, thereby improving oil recovery in the context of porous media flow. Testing the merits of this hypothesis poses some challenges, which will be discussed along with some numerical results based on the current formulation of this problem. Several open problems in this context will be discussed. This work is currently in progress. Supported by the grant NPRP 08-777-1-141 from the Qatar National Research Fund (a member of The Qatar Foundation).
Machine learning search for variable stars
NASA Astrophysics Data System (ADS)
Pashchenko, Ilya N.; Sokolovsky, Kirill V.; Gavras, Panagiotis
2018-04-01
Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. The practical applicability of this approach is limited by uncorrected systematic errors. We propose a new variability detection technique sensitive to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties. We consider variability detection as a classification problem that can be approached with machine learning. Logistic Regression (LR), Support Vector Machines (SVM), k Nearest Neighbours (kNN), Neural Nets (NN), Random Forests (RF), and Stochastic Gradient Boosting classifier (SGB) are applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of Optical Gravitational Lensing Experiment phase two (OGLE-II) Large Magellanic Cloud (LMC) photometry (30 265 light curves) that was searched for variability using traditional methods (168 known variable objects) as the training set and then apply the NN to a new test set of 31 798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, while 13 low-amplitude variables are new discoveries. The machine learning classifiers considered are found to be more efficient (select more variables and fewer false candidates) compared to traditional techniques using individual variability indices or their linear combination. The NN, SGB, SVM, and RF show a higher efficiency compared to LR and kNN.
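As a toy version of the classification framing described above (not the authors' pipeline, features, or data), the sketch below trains the same families of scikit-learn classifiers on synthetic stand-ins for the 18 variability indices and compares them by cross-validated F1.

```python
# Train several classifiers on synthetic "variability index" features and compare by F1.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n, n_features = 2000, 18                  # 18 variability indices per light curve (synthetic)
X = rng.normal(size=(n, n_features))
y = (X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=n) > 2.5).astype(int)  # rare "variables"

models = {
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(),
    "kNN": KNeighborsClassifier(),
    "NN": MLPClassifier(max_iter=1000),
    "RF": RandomForestClassifier(n_estimators=200),
    "SGB": GradientBoostingClassifier(),
}
for name, model in models.items():
    clf = make_pipeline(StandardScaler(), model)
    score = cross_val_score(clf, X, y, cv=5, scoring="f1").mean()
    print(f"{name}: mean F1 = {score:.2f}")
```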
[How are the hypothesis and the objectives established in a Radiology research project?].
Alústiza Echeverría, J M; Salvador Pardo, E; Castiella Eguzkiza, A
2012-01-01
Research is a systematic process designed to answer a question. This question is the starting point of the whole project and specifically formulates a problem observed in the analysis of reality; answering it attempts to clarify an uncertainty in our knowledge. The conceptual hypothesis is the theoretical answer to the question set out. The operational hypothesis is the particular form through which the conceptual hypothesis is to be demonstrated. The objectives are the justification for conducting the research. They help to define what the study attempts to obtain, and what answers it will give to the formulated questions. They must show a clear and consistent relationship with the description of the problem and, specifically, with the questions and/or hypotheses that are to be resolved. Copyright © 2011 SERAM. Published by Elsevier España. All rights reserved.
Targeting the Adipocyte-Tumor Cell Interaction in Prostate Cancer Treatment
2016-12-01
Prostate cancer (PCa) is one of the leading causes of death among men in the United States. Obesity is another growing epidemic health problem in Western... last decade have pointed to an association between obesity and increased risk of PCa progression and aggressiveness. However, despite the... obesity and inflammation, and the role of the adipocyte-cancer cell interaction in this process. The goal of this project is to test the hypothesis...
ERIC Educational Resources Information Center
Gartstein, Maria A.; Bridgett, David J.; Dishion, Thomas J.; Kaufman, Noah K.
2009-01-01
Caregiver depression has been described as leading to overreport of child behavior problems. This study examines this "depression-distortion" hypothesis in terms of high-risk families of young adolescents. Questionnaire data were collected from mothers, teachers, and fathers, and self-report information was obtained from youth between ages 10 and…
ERIC Educational Resources Information Center
Vaessen, Anniek; Gerretsen, Patty; Blomert, Leo
2009-01-01
The double deficit hypothesis states that naming speed problems represent a second core deficit in dyslexia independent from a phonological deficit. The current study investigated the main assumptions of this hypothesis in a large sample of well-diagnosed dyslexics. The three main findings were that (a) naming speed was consistently related only…
Deductive reasoning, brain maturation, and science concept acquisition: Are they linked?
NASA Astrophysics Data System (ADS)
Lawson, Anton E.
The present study tested the alternative hypotheses that the poor performance of the intuitive and transitional students on the concept acquisition tasks employed in the Lawson et al. (1991) study was due either to their failure (a) to use deductive reasoning to test potentially relevant task features, as suggested by Lawson et al. (1991); (b) to identify potentially relevant features; or (c) to derive and test a successful problem-solving strategy. To test these hypotheses a training session, which consisted of a series of seven concept acquisition tasks, was designed to reveal to students key task features and the deductive reasoning pattern necessary to solve the tasks. The training was individually administered to students (ages 5-14 years). Results revealed that none of the five- and six-year-olds, approximately half of the seven-year-olds, and virtually all of the students eight years and older responded successfully to the training. These results are viewed as contradictory to the hypothesis that the intuitive and transitional students in the Lawson et al. (1991) study lacked the reasoning skills necessary to identify and test potentially relevant task features. Instead, the results support the hypothesis that their poor performance was due to their failure to use hypothetico-deductive reasoning to derive an effective strategy. Previous research is cited that indicates that the brain's frontal lobes undergo a pronounced growth spurt from about four years of age to about seven years of age. In fact, the performance of normal six-year-olds and adults with frontal lobe damage on tasks such as the Wisconsin Card Sorting Task (WCST), a task similar in many ways to the present concept acquisition tasks, has been found to be identical. Consequently, the hypothesis is advanced that maturation of the frontal lobes can explain the striking improvement in performance at age seven. A neural network of the role of the frontal lobes in task performance based upon the work of Levine and Prueitt (1989) is presented. The advance in reasoning that presumably results from effective operation of the frontal lobes is seen as a fundamental advance in intellectual development because it enables children to employ an inductive-deductive reasoning pattern to change their minds when confronted with contradictory evidence regarding features of perceptible objects, a skill necessary for descriptive concept acquisition. It is suggested that a further qualitative advance in intellectual development occurs when an analogous pattern of abductive-deductive reasoning is applied to hypothetical objects and/or processes to allow for alternative hypothesis testing and theoretical concept acquisition. Apparently this is the reasoning pattern needed to derive an effective problem-solving strategy to solve the concept acquisition tasks of Lawson et al. (1991) when direct instruction is not provided. Implications for the science classroom are suggested.
A risk-based approach to flood management decisions in a nonstationary world
NASA Astrophysics Data System (ADS)
Rosner, Ana; Vogel, Richard M.; Kirshen, Paul H.
2014-03-01
Traditional approaches to flood management in a nonstationary world begin with a null hypothesis test of "no trend" and its likelihood, with little or no attention given to the likelihood that we might ignore a trend if it really existed. Concluding a trend exists when it does not, or rejecting a trend when it exists are known as type I and type II errors, respectively. Decision-makers are poorly served by statistical and/or decision methods that do not carefully consider both over- and under-preparation errors, respectively. Similarly, little attention is given to how to integrate uncertainty in our ability to detect trends into a flood management decision context. We show how trend hypothesis test results can be combined with an adaptation's infrastructure costs and damages avoided to provide a rational decision approach in a nonstationary world. The criterion of expected regret is shown to be a useful metric that integrates the statistical, economic, and hydrological aspects of the flood management problem in a nonstationary world.
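A worked toy calculation of the expected-regret idea sketched above, with entirely hypothetical numbers for the trend probability, test error rates, adaptation cost, and damages; it only illustrates how type I/II errors and costs can be combined into a decision criterion.

```python
# Combine trend-test error rates with adaptation cost and damages avoided into expected regret.
p_trend = 0.4             # assumed prior probability that a damaging trend is real
alpha, beta = 0.05, 0.30  # assumed type I and type II error rates of the trend test
cost_adapt = 10.0         # infrastructure cost (arbitrary units)
damage = 50.0             # damages if a real trend goes unaddressed

# Suppose the test does NOT detect a trend; update the trend probability accordingly.
p_miss = p_trend * beta                        # trend exists but the test missed it (type II)
p_correct_null = (1 - p_trend) * (1 - alpha)   # no trend and the test correctly found none
p_trend_given_no_detect = p_miss / (p_miss + p_correct_null)

# Regret = loss relative to the best action given the true state of the world.
regret_adapt = (1 - p_trend_given_no_detect) * cost_adapt          # over-preparation
regret_no_adapt = p_trend_given_no_detect * (damage - cost_adapt)  # under-preparation

print(f"P(trend | no detection) = {p_trend_given_no_detect:.2f}")
print(f"expected regret, adapt: {regret_adapt:.1f}; do not adapt: {regret_no_adapt:.1f}")
```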
NASA Astrophysics Data System (ADS)
Sasmita, E.; Edriati, S.; Yunita, A.
2018-04-01
This study is motivated by the low first-semester mathematics scores, many of them below the minimum mastery criterion (KKM), in the seventh grade of MTsN Model Padang. A likely cause is that students feel insufficiently involved in the learning process because the teacher does not assess their discussions. The proposed solution is discussion assessment within the Cooperative Learning Model type Numbered Head Together (NHT). This study aims to determine whether discussion assessment in NHT affects the learning outcomes of seventh-grade students at MTsN Model Padang. The instruments used in this study were a discussion assessment and final tests. The data were analyzed using simple linear regression. The hypothesis test gave an Fcount greater than Ftable, so the hypothesis in this study was accepted. It is therefore concluded that discussion assessment in NHT affects the learning outcomes of seventh-grade students at MTsN Model Padang.
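As a generic illustration of the analysis named above (simple linear regression with an overall F test, i.e., the Fcount versus Ftable comparison), the sketch below uses invented scores; the variable names and data are assumptions, not the study's records.

```python
# Regress invented final test scores on discussion-assessment scores and report the F test.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
discussion = rng.uniform(50, 100, size=35)                    # hypothetical discussion scores
final = 20 + 0.7 * discussion + rng.normal(0, 8, size=35)     # hypothetical final test scores

model = sm.OLS(final, sm.add_constant(discussion)).fit()
print(f"F = {model.fvalue:.2f}, p = {model.f_pvalue:.4f}")    # reject H0 (no effect) if p < 0.05
print(f"slope = {model.params[1]:.2f}")
```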
Ensembles vs. information theory: supporting science under uncertainty
NASA Astrophysics Data System (ADS)
Nearing, Grey S.; Gupta, Hoshin V.
2018-05-01
Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot - even partially - quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method - in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty. We conclude by offering some suggestions about how this proposed philosophy of science suggests new ways to conceptualize and construct simulation models of complex, dynamical systems.
Du, Bo; Zhang, Yuxiang; Zhang, Liangpei; Tao, Dacheng
2016-08-18
Hyperspectral images provide great potential for target detection; however, they also introduce new challenges, so hyperspectral target detection should be treated as a new problem and modeled differently. Many classical detectors have been proposed based on the linear mixing model and the sparsity model. However, the former type of model cannot deal well with spectral variability in limited endmembers, and the latter type of model usually treats target detection as a simple classification problem and pays less attention to the low target probability. In this case, can we find an efficient way to utilize both the high-dimension features behind hyperspectral images and the limited target information to extract small targets? This paper proposes a novel sparsity-based detector named the hybrid sparsity and statistics detector (HSSD) for target detection in hyperspectral imagery, which can effectively deal with the above two problems. The proposed algorithm designs a hypothesis-specific dictionary based on the prior hypotheses for the test pixel, which can avoid an imbalanced number of training samples for a class-specific dictionary. Then, a purification process is employed for the background training samples in order to construct an effective competition between the two hypotheses. Next, a sparse representation based binary hypothesis model merged with additive Gaussian noise is proposed to represent the image. Finally, a generalized likelihood ratio test is performed to obtain a more robust detection decision than the reconstruction residual based detection methods. Extensive experimental results with three hyperspectral datasets confirm that the proposed HSSD algorithm clearly outperforms the state-of-the-art target detectors.
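The HSSD algorithm itself is not reproduced here; as a much simpler point of comparison in the same binary-hypothesis spirit, the sketch below computes a GLRT-style adaptive-matched-filter score for a pixel spectrum against background statistics, using synthetic spectra and an assumed target signature.

```python
# Score pixel spectra against a known target signature using background mean and covariance.
import numpy as np

rng = np.random.default_rng(4)
bands, n_pixels = 50, 10000
background = rng.normal(size=(n_pixels, bands))          # synthetic background training spectra
target = np.ones(bands)                                  # assumed target signature

mu = background.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(background, rowvar=False) + 1e-3 * np.eye(bands))

def amf_score(x):
    """Adaptive-matched-filter style statistic for one pixel spectrum x."""
    d = x - mu
    s = target - mu
    return (s @ cov_inv @ d) ** 2 / (s @ cov_inv @ s)

test_pixel = mu + 0.3 * (target - mu) + rng.normal(size=bands)   # weak, noisy target pixel
print(f"AMF score (target-like pixel): {amf_score(test_pixel):.2f}")
print(f"AMF score (background pixel):  {amf_score(rng.normal(size=bands)):.2f}")
```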
Whiplash and the compensation hypothesis.
Spearing, Natalie M; Connelly, Luke B
2011-12-01
Review article. To explain why the evidence that compensation-related factors lead to worse health outcomes is not compelling, either in general, or in the specific case of whiplash. There is a common view that compensation-related factors lead to worse health outcomes ("the compensation hypothesis"), despite the presence of important, and unresolved sources of bias. The empirical evidence on this question has ramifications for the design of compensation schemes. Using studies on whiplash, this article outlines the methodological problems that impede attempts to confirm or refute the compensation hypothesis. Compensation studies are prone to measurement bias, reverse causation bias, and selection bias. Errors in measurement are largely due to the latent nature of whiplash injuries and health itself, a lack of clarity over the unit of measurement (specific factors, or "compensation"), and a lack of appreciation for the heterogeneous qualities of compensation-related factors and schemes. There has been a failure to acknowledge and empirically address reverse causation bias, or the likelihood that poor health influences the decision to pursue compensation: it is unclear if compensation is a cause or a consequence of poor health, or both. Finally, unresolved selection bias (and hence, confounding) is evident in longitudinal studies and natural experiments. In both cases, between-group differences have not been addressed convincingly. The nature of the relationship between compensation-related factors and health is unclear. Current approaches to testing the compensation hypothesis are prone to several important sources of bias, which compromise the validity of their results. Methods that explicitly test the hypothesis and establish whether or not a causal relationship exists between compensation factors and prolonged whiplash symptoms are needed in future studies.
Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F
2010-07-19
A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing logodds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic.
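A simplified stand-in for the permutation test described above: the paper uses a GAM with a bivariate LOESS smoother, whereas this sketch uses a plain logistic regression on the coordinates as the spatial term. The statistic is the log-likelihood gain from adding location, and its null distribution comes from permuting case/control labels; all data are simulated.

```python
# Permutation test: does residential location improve the fit of a case/control model?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(5)
n = 500
xy = rng.uniform(-1, 1, size=(n, 2))                       # residential coordinates
risk = 1 / (1 + np.exp(-(1.5 * xy[:, 0])))                 # simulated risk increasing eastward
y = rng.binomial(1, risk)

def loglik_gain(xy, y):
    base = -log_loss(y, np.full(n, y.mean()), normalize=False)        # intercept-only model
    fit = LogisticRegression().fit(xy, y)                             # model with location
    full = -log_loss(y, fit.predict_proba(xy)[:, 1], normalize=False)
    return full - base

observed = loglik_gain(xy, y)
perm = [loglik_gain(xy, rng.permutation(y)) for _ in range(199)]      # permute labels for the null
p_value = (1 + sum(g >= observed for g in perm)) / (1 + len(perm))
print(f"observed log-likelihood gain = {observed:.1f}, permutation p = {p_value:.3f}")
```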
NASA Technical Reports Server (NTRS)
Josephson, John R.
1989-01-01
A layered-abduction model of perception is presented which unifies bottom-up and top-down processing in a single logical and information-processing framework. The process of interpreting the input from each sense is broken down into discrete layers of interpretation, where at each layer a best explanation hypothesis is formed of the data presented by the layer or layers below, with the help of information available laterally and from above. The formation of this hypothesis is treated as a problem of abductive inference, similar to diagnosis and theory formation. Thus this model brings a knowledge-based problem-solving approach to the analysis of perception, treating perception as a kind of compiled cognition. The bottom-up passing of information from layer to layer defines channels of information flow, which separate and converge in a specific way for any specific sense modality. Multi-modal perception occurs where channels converge from more than one sense. This model has not yet been implemented, though it is based on systems which have been successful in medical and mechanical diagnosis and medical test interpretation.
Enhancing memory and imagination improves problem solving among individuals with depression.
McFarland, Craig P; Primosch, Mark; Maxson, Chelsey M; Stewart, Brandon T
2017-08-01
Recent work has revealed links between memory, imagination, and problem solving, and suggests that increasing access to detailed memories can lead to improved imagination and problem-solving performance. Depression is often associated with overgeneral memory and imagination, along with problem-solving deficits. In this study, we tested the hypothesis that an interview designed to elicit detailed recollections would enhance imagination and problem solving among both depressed and nondepressed participants. In a within-subjects design, participants completed a control interview or an episodic specificity induction prior to completing memory, imagination, and problem-solving tasks. Results revealed that compared to the control interview, the episodic specificity induction fostered increased detail generation in memory and imagination and more relevant steps on the problem-solving task among depressed and nondepressed participants. This study builds on previous work by demonstrating that a brief interview can enhance problem solving among individuals with depression and supports the notion that episodic memory plays a key role in problem solving. It should be noted, however, that the results of the interview are relatively short-lived.
Access to health care and community social capital.
Hendryx, Michael S; Ahern, Melissa M; Lovrich, Nicholas P; McCurdy, Arthur H
2002-02-01
To test the hypothesis that variation in reported access to health care is positively related to the level of social capital present in a community. The 1996 Household Survey of the Community Tracking Study, drawn from 22 metropolitan statistical areas across the United States (n = 19,672). Additional data for the 22 communities are from a 1996 multicity broadcast media marketing database, including key social capital indicators, the 1997 National Profile of Local Health Departments survey, and Interstudy, American Hospital Association, and American Medical Association sources. The design is cross-sectional. Self-reported access to care problems is the dependent variable. Independent variables include individual sociodemographic variables, community-level health sector variables, and social capital variables. Data are merged from the various sources and weighted to be population representative and are analyzed using hierarchical categorical modeling. Persons who live in metropolitan statistical areas featuring higher levels of social capital report fewer problems accessing health care. A higher HMO penetration rate in a metropolitan statistical area was also associated with fewer access problems. Other health sector variables were not related to health care access. The results observed for 22 major U.S. cities are consistent with the hypothesis that community social capital enables better access to care, perhaps through improving community accountability mechanisms.
Is council tax valuation band a predictor of mortality?
Beale, Norman R; Taylor, Gordon J; Straker-Cook, Dawn MK
2002-01-01
Background All current UK indices of socio-economic status have inherent problems, especially those used to govern resource allocation to the health sphere. The search for improved markers continues: this study proposes and tests the possibility that Council Tax Valuation Band (CTVB) might match requirements. Presentation of the hypothesis To determine if there is an association between CTVB of final residence and mortality risk using the death registers of a UK general practice. Testing the hypothesis Standardised death rates and odds ratios (ORs) for groups defined by CTVB of dwelling (A to H) were calculated using one in four denominator samples from the practice lists. Analyses were repeated three times, relating the number of deaths to the CTVB of residence of the deceased for 1992-1994, 1995-1997, and 1998-2000 inclusive. In 856 deaths there were consistent and significant differences in death rates between CTVBs: above average for bands A and B residents; below average for other band residents. There were significantly higher ORs for band A and B residents who were female and who died prematurely (before average group life expectancy). Implications of the hypothesis CTVB of final residence appears to be a proxy marker of mortality risk and could be a valuable health needs resource indicator at household level. It is worthy of further exploration. PMID:12207828
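As a minimal illustration of the odds-ratio calculations described above, the sketch below computes an OR and a Wald confidence interval from a 2x2 table; the cell counts are hypothetical and do not come from the practice's death registers.

```python
import numpy as np
from scipy.stats import norm

def odds_ratio_ci(a, b, c, d, alpha=0.05):
    """OR and Wald CI from a 2x2 table (all cells assumed non-zero):
    a, b = deaths / survivors in the index group (e.g., bands A-B),
    c, d = deaths / survivors in the reference group (other bands)."""
    or_ = (a * d) / (b * c)
    se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    z = norm.ppf(1 - alpha / 2)
    lo = np.exp(np.log(or_) - z * se_log)
    hi = np.exp(np.log(or_) + z * se_log)
    return or_, (lo, hi)

print(odds_ratio_ci(120, 880, 90, 910))   # illustrative counts only
```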
Dávid-Barrett, T.; Dunbar, R. I. M.
2013-01-01
Sociality is primarily a coordination problem. However, the social (or communication) complexity hypothesis suggests that the kinds of information that can be acquired and processed may limit the size and/or complexity of social groups that a species can maintain. We use an agent-based model to test the hypothesis that the complexity of information processed influences the computational demands involved. We show that successive increases in the kinds of information processed allow organisms to break through the glass ceilings that otherwise limit the size of social groups: larger groups can only be achieved at the cost of more sophisticated kinds of information processing that are disadvantageous when optimal group size is small. These results simultaneously support both the social brain and the social complexity hypotheses. PMID:23804623
Receiving instrumental support at work: when help is not welcome.
Deelstra, Janna T; Peeters, Maria C W; Schaufeli, Wilmar B; Stroebe, Wolfgang; Zijlstra, Fred R H; van Doornen, Lorenz P
2003-04-01
Although the role of social support in promoting employees' health and well-being has been studied extensively, the evidence is inconsistent, sometimes even suggesting that social support might have negative effects. The authors examined some psychological processes that might explain such effects. On the basis of the threat-to-self-esteem model, the authors tested the hypothesis that receiving imposed support elicits negative reactions, which are moderated by someone's need for support. The authors distinguished 3 different reactions: (a) self-related, (b) interaction-related, and (c) physiological. The results of an experiment with 48 temporary administrative workers generally confirmed the hypothesis. Imposed support elicited negative reactions, except when there was an unsolvable problem, but even then the effect of imposed support was not positive but neutral.
Proportional reasoning as a heuristic-based process: time constraint and dual task considerations.
Gillard, Ellen; Van Dooren, Wim; Schaeken, Walter; Verschaffel, Lieven
2009-01-01
The present study interprets the overuse of proportional solution methods from a dual process framework. Dual process theories claim that analytic operations involve time-consuming executive processing, whereas heuristic operations are fast and automatic. In two experiments to test whether proportional reasoning is heuristic-based, the participants solved "proportional" problems, for which proportional solution methods provide correct answers, and "nonproportional" problems known to elicit incorrect answers based on the assumption of proportionality. In Experiment 1, the available solution time was restricted. In Experiment 2, the executive resources were burdened with a secondary task. Both manipulations induced an increase in proportional answers and a decrease in correct answers to nonproportional problems. These results support the hypothesis that the choice for proportional methods is heuristic-based.
Lee, Matthew R.; Chassin, Laurie; MacKinnon, David P.
2015-01-01
Background Research has shown a developmental process of “maturing out” of problem drinking beginning in young adulthood. Perhaps surprisingly, past studies suggest that young adult drinking reductions may be particularly pronounced among those exhibiting relatively severe forms of problem drinking earlier in emerging adulthood. This may occur because more severe problem drinkers experience stronger ameliorative effects of normative young adult role transitions like marriage. Methods The hypothesis of stronger marriage effects among more severe problem drinkers was tested using three waves of data from a large ongoing study of familial alcohol disorder (Chassin et al., 1992; N=844; 51% children of alcoholics). Results Longitudinal growth models characterized (1) the curvilinear trajectory of drinking quantity from ages 17-40, (2) effects of marriage on altering this age-related trajectory, and (3) moderation of this effect by pre-marriage problem drinking levels (alcohol consequences and dependence symptoms). Results confirmed the hypothesis that protective marriage effects on drinking quantity trajectories would be stronger among more severe pre-marriage problem drinkers. Supplemental analyses showed that results were robust to alternative construct operationalizations and modeling approaches. Conclusions Consistent with role incompatibility theory, findings support the view of role conflict as a key mechanism of role-driven behavior change, as greater problem drinking likely conflicts more with demands of roles like marriage. This is also consistent with the developmental psychopathology view of transitions and turning points. Role transitions among already low-severity drinkers may merely represent developmental continuity of a low-risk trajectory, whereas role transitions among higher-severity problem drinkers may represent developmentally discontinuous “turning points” that divert individuals from a higher- to a lower-risk trajectory. Practically, findings support the clinical relevance of role-related “maturing out processes” by suggesting that they often reflect natural recovery from clinically significant problem drinking. Thus, understanding these processes could help clarify the nature of pathological drinking and inform interventions. PMID:26009967
Lee, Matthew R; Chassin, Laurie; MacKinnon, David P
2015-06-01
Research has shown a developmental process of "maturing out" of problem drinking beginning in young adulthood. Perhaps surprisingly, past studies suggest that young adult drinking reductions may be particularly pronounced among those exhibiting relatively severe forms of problem drinking earlier in emerging adulthood. This may occur because more severe problem drinkers experience stronger ameliorative effects of normative young adult role transitions like marriage. The hypothesis of stronger marriage effects among more severe problem drinkers was tested using 3 waves of data from a large ongoing study of familial alcohol disorder (N = 844; 51% children of alcoholics). Longitudinal growth models characterized (i) the curvilinear trajectory of drinking quantity from ages 17 to 40, (ii) effects of marriage on altering this age-related trajectory, and (iii) moderation of this effect by premarriage problem drinking levels (alcohol consequences and dependence symptoms). Results confirmed the hypothesis that protective marriage effects on drinking quantity trajectories would be stronger among more severe premarriage problem drinkers. Supplemental analyses showed that results were robust to alternative construct operationalizations and modeling approaches. Consistent with role incompatibility theory, findings support the view of role conflict as a key mechanism of role-driven behavior change, as greater problem drinking likely conflicts more with demands of roles like marriage. This is also consistent with the developmental psychopathology view of transitions and turning points. Role transitions among already low-severity drinkers may merely represent developmental continuity of a low-risk trajectory, whereas role transitions among higher-severity problem drinkers may represent developmentally discontinuous "turning points" that divert individuals from a higher- to a lower-risk trajectory. Practically, findings support the clinical relevance of role-related "maturing out processes" by suggesting that they often reflect natural recovery from clinically significant problem drinking. Thus, understanding these processes could help clarify the nature of pathological drinking and inform interventions. Copyright © 2015 by the Research Society on Alcoholism.
On the Hypothesis of Control of the Universe
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2007-04-01
The problem of SETI is not solved till now because the idea of SETI represents a methodological error in cosmology and astrophysics. This fact means that one should prove the existence of a Supreme Intelligence in a correct way. In this connection, the hypothesis of control of the Universe is proposed. The hypothesis is based on the new point of view [1] according to which information is the essence of the Universe, and material objects are a manifestation of the essence. The hypothesis is formulated as follows: (1) the Universe represents a cybernetic system; (2) the cybernetic system is a set of mutually connected elements which receive, memorize, process, and transmit information; (3) each material element (for example, atom, molecule, man, the Earth, the Sun) is a unity of opposites: the controlling aspect and the controllable aspect; (4) the Universe as a system is a unity of opposites: the controlling aspect and the controllable aspect. Consequently, the Universe is controlled by a certain object. Thus, the problem of defining the controlling object arises. A correct solution of this problem is the key to exploration of the Universe. Ref.: [1] T.Z. Kalanov, "On the hypothesis of Universe's 'system block'", Bulletin of the APS, Vol. 51, No. 2 (2006), p. 61.
ERIC Educational Resources Information Center
Kim, Hye Jeong; Pedersen, Susan
2010-01-01
Recently, the importance of ill-structured problem-solving in real-world contexts has become a focus of educational research. Particularly, the hypothesis-development process has been examined as one of the keys to developing a high-quality solution in a problem context. The authors of this study examined predictive relations between young…
Application of Second-Moment Source Analysis to Three Problems in Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Donovan, J.; Jordan, T. H.
2011-12-01
Though earthquake forecasting models have often represented seismic sources as space-time points (usually hypocenters), a more complete hazard analysis requires the consideration of finite-source effects, such as rupture extent, orientation, directivity, and stress drop. The most compact source representation that includes these effects is the finite moment tensor (FMT), which approximates the degree-two polynomial moments of the stress glut by its projection onto the seismic (degree-zero) moment tensor. This projection yields a scalar space-time source function whose degree-one moments define the centroid moment tensor (CMT) and whose degree-two moments define the FMT. We apply this finite-source parameterization to three forecasting problems. The first is the question of hypocenter bias: can we reject the null hypothesis that the conditional probability of hypocenter location is uniformly distributed over the rupture area? This hypothesis is currently used to specify rupture sets in the "extended" earthquake forecasts that drive simulation-based hazard models, such as CyberShake. Following McGuire et al. (2002), we test the hypothesis using the distribution of FMT directivity ratios calculated from a global data set of source slip inversions. The second is the question of source identification: given an observed FMT (and its errors), can we identify it with an FMT in the complete rupture set that represents an extended fault-based rupture forecast? Solving this problem will facilitate operational earthquake forecasting, which requires the rapid updating of earthquake triggering and clustering models. Our proposed method uses the second-order uncertainties as a norm on the FMT parameter space to identify the closest member of the hypothetical rupture set and to test whether this closest member is an adequate representation of the observed event. Finally, we address the aftershock excitation problem: given a mainshock, what is the spatial distribution of aftershock probabilities? The FMT representation allows us to generalize the models typically used for this purpose (e.g., marked point process models, such as ETAS), which will again be necessary in operational earthquake forecasting. To quantify aftershock probabilities, we compare mainshock FMTs with the first and second spatial moments of weighted aftershock hypocenters. We will describe applications of these results to the Uniform California Earthquake Rupture Forecast, version 3, which is now under development by the Working Group on California Earthquake Probabilities.
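A hedged sketch of the "source identification" step described above: using the second-order observation uncertainties as a norm on the FMT parameter space to select the closest member of a candidate rupture set. The array shapes and names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def closest_rupture(fmt_obs, cov_obs, rupture_set):
    """Index of the candidate rupture whose FMT parameter vector is closest
    to the observed FMT under the Mahalanobis norm defined by cov_obs.

    fmt_obs     : (k,) observed FMT parameters
    cov_obs     : (k, k) covariance of the observed parameters
    rupture_set : (m, k) candidate parameter vectors
    """
    prec = np.linalg.inv(cov_obs)
    diff = rupture_set - fmt_obs
    d2 = np.einsum('ij,jk,ik->i', diff, prec, diff)   # squared Mahalanobis distances
    best = int(np.argmin(d2))
    return best, float(d2[best])
```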
Analyzing thematic maps and mapping for accuracy
Rosenfield, G.H.
1982-01-01
Two problems exist when attempting to test the accuracy of thematic maps and mapping: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both of these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table, sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors of commission, and the remaining elements of the columns represent the errors of omission. For tests of hypotheses that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification. The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by either the row totals or the column totals from the original classification error matrices. In hypothesis testing, when the results of tests of multiple sample cases prove to be significant, some form of statistical test must be used to separate any results that differ significantly from the others. In the past, many analyses of the data in this error matrix were made by comparing the relative magnitudes of the percentage of correct classifications, for either individual categories, the entire map, or both. More rigorous analyses have used data transformations and (or) two-way classification analysis of variance. A more sophisticated approach would be to analyze the entire classification error matrices using the methods of discrete multivariate analysis or of multivariate analysis of variance.
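The accuracy measures described above follow directly from the classification error matrix. Below is a small sketch (assumed layout: rows = interpretation, columns = verification, as in the text) that computes overall accuracy together with commission and omission errors; the example matrix is invented.

```python
import numpy as np

def error_matrix_summary(cm):
    """cm[i, j]: counts with rows = interpreted class, columns = verified class."""
    cm = np.asarray(cm, dtype=float)
    overall_accuracy = np.trace(cm) / cm.sum()
    commission_error = 1 - np.diag(cm) / cm.sum(axis=1)   # off-diagonal share of each row
    omission_error = 1 - np.diag(cm) / cm.sum(axis=0)     # off-diagonal share of each column
    return overall_accuracy, commission_error, omission_error

# Illustrative 3-category matrix, not real map data
print(error_matrix_summary([[50, 3, 2], [5, 40, 5], [2, 4, 60]]))
```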
Behavioral Health and Adjustment to College Life for Student Service Members/Veterans.
Schonfeld, Lawrence; Braue, Lawrence A; Stire, Sheryl; Gum, Amber M; Cross, Brittany L; Brown, Lisa M
2015-01-01
Increasing numbers of student service members/veterans (SSM/Vs) are enrolling in college. However, little is known about how their previous military experience affects their adjustment to this new role. The present study tested the hypothesis that SSM/Vs who report adjustment problems in college have a higher incidence of posttraumatic stress disorder (PTSD), depression, and other behavioral health problems compared with those who do not report adjustment problems. SSM/Vs (N = 173) at a large, southeastern, public university completed online surveys that included well-validated screens measuring substance use, depression, PTSD, and other mental disorders. Those reporting difficulties adjusting to university life (28%) reported significantly higher frequencies of behavioral and health problems while in the military, and significantly higher levels of PTSD, depression, and mental health disorders, but no difference in substance use. Implications for improved behavioral health screening and coordination of university behavioral health services with veterans' health systems are discussed.
Vasilyeva, Marina; Laski, Elida V; Shen, Chen
2015-10-01
The present study tested the hypothesis that children's fluency with basic number facts and knowledge of computational strategies, derived from early arithmetic experience, predicts their performance on complex arithmetic problems. First-grade students from the United States and Taiwan (N = 152, mean age: 7.3 years) were presented with problems that differed in difficulty: single-, mixed-, and double-digit addition. Children's strategy use varied as a function of problem difficulty, consistent with Siegler's theory of strategy choice. The use of the decomposition strategy interacted with computational fluency in predicting the accuracy of double-digit addition. Further, the frequency of decomposition and computational fluency fully mediated cross-national differences in accuracy on these complex arithmetic problems. The results indicate the importance of both fluency with basic number facts and the decomposition strategy for later arithmetic performance. (c) 2015 APA, all rights reserved.
Multivariate Analysis and Its Applications
1989-02-14
defined in situations where measurements are taken on natural clusters of individuals like brothers in a family. A number of problems arise in the study of...intraclass correlations. How do we estimate it when observations are available on clusters of different sizes? How do we test the hypothesis that the...the random variable y(X) = G1 X + G2 X^2 + ... + Gm X^m follows an exponential distribution with mean unity. Such a class of life distributions has a
High Performance Visualization using Query-Driven Visualizationand Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E. Wes; Campbell, Scott; Dart, Eli
2006-06-15
Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. The new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
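A toy sketch of the query-driven pattern: run the data-management query first, then visualize only the matching records. The file, column names, and thresholds are hypothetical and not from the network-traffic study.

```python
import pandas as pd
import matplotlib.pyplot as plt

flows = pd.read_csv("flows.csv")                                # hypothetical traffic log
needles = flows.query("nbytes > 10000000 and duration < 1.0")   # keep only the 'needles'
needles.plot.scatter(x="duration", y="nbytes")                  # visualize the small result set
plt.title("Flows matching the query")
plt.show()
```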
Visualization-based analysis of multiple response survey data
NASA Astrophysics Data System (ADS)
Timofeeva, Anastasiia
2017-11-01
During a survey, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because of the need to process multiple response variables. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, the use of similarity and association matrices is suggested. Some aggregate indicators of dissimilarity (similarity) are proposed based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources. This information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are of interest. To identify such differences, the hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested. It is based on partitioning the consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between population proportions. It turned out to be more suitable for the real problem being solved.
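The aggregate indicators mentioned above can be sketched from a 0/1 respondent-by-option matrix as follows; the exact similarity and association matrices used in the paper may differ, so treat this as an assumption-laden illustration.

```python
import numpy as np

def dissimilarity_indicators(X):
    """X: (n_respondents, n_options) 0/1 matrix of ticked options.

    Returns the determinant of an option-by-option correlation (similarity)
    matrix and the largest eigenvalue of a co-selection (association) matrix.
    A determinant near zero suggests strongly overlapping answer options.
    Assumes no option is ticked by everyone or by no one (no constant columns)."""
    X = np.asarray(X, dtype=float)
    R = np.corrcoef(X, rowvar=False)      # similarity between options
    A = X.T @ X / X.shape[0]              # pairwise co-selection rates
    return np.linalg.det(R), float(np.linalg.eigvalsh(A).max())
```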
Joh, Ju Youn; Kim, Sun; Park, Jun Li; Kim, Yeon Pyo
2013-05-01
The Family Adaptability and Cohesion Evaluation Scale (FACES) III using the circumplex model has been widely used in investigating family function. However, the criticism of the curvilinear hypothesis of the circumplex model has always been from an empirical point of view. This study examined the relationship between adolescent adaptability, cohesion, and adolescent problem behaviors, and especially testing the consistency of the curvilinear hypotheses with FACES III. We used the data from 398 adolescent participants who were in middle school. A self-reported questionnaire was used to evaluate the FACES III and Youth Self Report. According to the level of family adaptability, significant differences were evident in internalizing problems (P = 0.014). But, in externalizing problems, the results were not significant (P = 0.305). Also, according to the level of family cohesion, significant differences were in internalizing problems (P = 0.002) and externalizing problems (P = 0.004). The relationship between the dimensions of adaptability, cohesion and adolescent problem behaviors was not curvilinear. In other words, adolescents with high adaptability and high cohesion showed low problem behaviors.
Joh, Ju Youn; Kim, Sun; Park, Jun Li
2013-01-01
Background The Family Adaptability and Cohesion Evaluation Scale (FACES) III using the circumplex model has been widely used in investigating family function. However, the criticism of the curvilinear hypothesis of the circumplex model has always been from an empirical point of view. This study examined the relationship between adolescent adaptability, cohesion, and adolescent problem behaviors, and especially testing the consistency of the curvilinear hypotheses with FACES III. Methods We used the data from 398 adolescent participants who were in middle school. A self-reported questionnaire was used to evaluate the FACES III and Youth Self Report. Results According to the level of family adaptability, significant differences were evident in internalizing problems (P = 0.014). But, in externalizing problems, the results were not significant (P = 0.305). Also, according to the level of family cohesion, significant differences were in internalizing problems (P = 0.002) and externalizing problems (P = 0.004). Conclusion The relationship between the dimensions of adaptability, cohesion and adolescent problem behaviors was not curvilinear. In other words, adolescents with high adaptability and high cohesion showed low problem behaviors. PMID:23730484
A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.
ERIC Educational Resources Information Center
Liu, Tung; Stone, Courtenay C.
1999-01-01
Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…
Hayes, Brett K; Hawkins, Guy E; Newell, Ben R
2016-05-01
Four experiments examined the locus of impact of causal knowledge on consideration of alternative hypotheses in judgments under uncertainty. Two possible loci were examined: overcoming neglect of the alternative when developing a representation of a judgment problem, and improving utilization of statistics associated with the alternative hypothesis. In Experiment 1, participants could search for information about the various components of Bayes's rule in a diagnostic problem. A majority failed to spontaneously search for information about an alternative hypothesis, but this bias was reduced when a specific alternative hypothesis was mentioned before search. No change in search patterns was found when a generic alternative cause was mentioned. Experiments 2a and 2b broadly replicated these patterns when participants rated or made binary judgments about the relevance of each of the Bayesian components. In contrast, Experiment 3 showed that when participants were given the likelihood of the data given a focal hypothesis p(D|H) and an alternative hypothesis p(D|¬H), they gave estimates of p(H|D) that were consistent with Bayesian principles. Additional causal knowledge had relatively little impact on such judgments. These results show that causal knowledge primarily affects neglect of the alternative hypothesis at the initial stage of problem representation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
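For reference, the Bayesian quantity in Experiment 3 combines as follows; the numbers are illustrative and are not stimuli from the experiments.

```python
def posterior_focal(prior_h, p_d_given_h, p_d_given_not_h):
    """Bayes's rule for a binary diagnostic problem: returns p(H|D)."""
    p_d = prior_h * p_d_given_h + (1 - prior_h) * p_d_given_not_h
    return prior_h * p_d_given_h / p_d

# A 10% base rate with p(D|H) = 0.8 and p(D|not-H) = 0.2 gives p(H|D) ~= 0.31,
# well below p(D|H): neglecting the alternative hypothesis inflates the judgment.
print(posterior_focal(0.10, 0.8, 0.2))
```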
NASA Astrophysics Data System (ADS)
Kartikasari, A.; Widjajanti, D. B.
2017-02-01
The aim of this study is to explore the effectiveness of a learning approach using problem-based learning based on multiple intelligences in developing students' achievement, mathematical connection ability, and self-esteem. This study is experimental research; the sample was 30 Grade X students of MIA III MAN Yogyakarta III. The learning materials implemented covered trigonometry and geometry. For the purpose of this study, the researchers designed an achievement test made up of 44 multiple-choice questions, 24 on trigonometry and 20 on geometry. The researchers also designed a mathematical connection test consisting of 7 essay questions and a 30-item self-esteem questionnaire. The learning approach was said to be effective if the proportion of students who achieved the KKM on the achievement test and the proportions of students who achieved at least the high category on the mathematical connection test and the self-esteem questionnaire were each greater than or equal to 70%. Based on hypothesis testing at the 5% significance level, it can be concluded that the learning approach using problem-based learning based on multiple intelligences was effective in terms of students' achievement, mathematical connection ability, and self-esteem.
NASA Astrophysics Data System (ADS)
Ren, Xiaoqiang; Yan, Jiaqi; Mo, Yilin
2018-03-01
This paper studies binary hypothesis testing based on measurements from a set of sensors, a subset of which can be compromised by an attacker. The measurements from a compromised sensor can be manipulated arbitrarily by the adversary. The asymptotic exponential rate, with which the probability of error goes to zero, is adopted to indicate the detection performance of a detector. In practice, we expect the attack on sensors to be sporadic, and therefore the system may operate with all the sensors being benign for extended periods of time. This motivates us to consider the trade-off between the detection performance of a detector, i.e., the probability of error, when the attacker is absent (defined as efficiency) and the worst-case detection performance when the attacker is present (defined as security). We first provide the fundamental limits of this trade-off, and then propose a detection strategy that achieves these limits. We then consider a special case, where there is no trade-off between security and efficiency. In other words, our detection strategy can achieve the maximal efficiency and the maximal security simultaneously. Two extensions of the secure hypothesis testing problem are also studied and fundamental limits and achievability results are provided: 1) a subset of sensors, namely "secure" sensors, are assumed to be equipped with better security countermeasures and hence are guaranteed to be benign, and 2) detection performance with an unknown number of compromised sensors. Numerical examples are given to illustrate the main results.
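For orientation, the classical (non-secure) fusion rule sums per-sensor log-likelihood ratios and compares the sum against a threshold; the paper's contribution is a strategy that additionally bounds the damage an attacker can do by manipulating a subset of these terms. The Gaussian measurement model and parameter values below are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import norm

def llr_fusion_decision(y, mu0=0.0, mu1=1.0, sigma=1.0, threshold=0.0):
    """Decide between H0 (mean mu0) and H1 (mean mu1) from per-sensor
    measurements y by summing Gaussian log-likelihood ratios.
    This baseline is not robust to compromised sensors."""
    y = np.asarray(y, dtype=float)
    llr = norm.logpdf(y, mu1, sigma) - norm.logpdf(y, mu0, sigma)
    return int(llr.sum() > threshold)   # 1 = decide H1, 0 = decide H0
```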
NASA Astrophysics Data System (ADS)
Darma, I. K.
2018-01-01
This research is aimed at determining: 1) the difference in mathematical problem-solving ability between students facilitated with the problem-based learning model and the conventional learning model, 2) the difference in mathematical problem-solving ability between students facilitated with the authentic and the conventional assessment model, and 3) the interaction effect between learning model and assessment model on mathematical problem solving. The research was conducted at Bali State Polytechnic, using a 2x2 factorial experimental design. The sample consisted of 110 students. The data were collected using a theoretically and empirically validated test. Instruments were validated for content validity using Aiken's approach and item analysis, and the data were then analyzed using analysis of variance (ANOVA). The results show that the students facilitated with the problem-based learning and authentic assessment models obtained the highest average scores, in both concept understanding and mathematical problem solving. The hypothesis tests show that, significantly: 1) there is a difference in mathematical problem-solving ability between students facilitated with the problem-based learning model and the conventional learning model, 2) there is a difference in mathematical problem-solving ability between students facilitated with the authentic assessment model and the conventional assessment model, and 3) there is an interaction effect between learning model and assessment model on mathematical problem solving. In order to improve the effectiveness of mathematics learning, the combination of the problem-based learning model and the authentic assessment model can be considered for use in class.
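A minimal sketch of the 2x2 factorial analysis with statsmodels; the data frame, column names, and scores are invented for illustration and do not come from the study.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Two observations per cell of the 2x2 design (hypothetical scores)
df = pd.DataFrame({
    "score":      [72, 80, 65, 70, 85, 90, 60, 68],
    "learning":   ["PBL", "PBL", "conv", "conv"] * 2,
    "assessment": ["authentic"] * 4 + ["conventional"] * 4,
})
model = ols("score ~ C(learning) * C(assessment)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and the interaction term
```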
Psychosocial dimensions of solving an indoor air problem.
Lahtinen, Marjaana; Huuhtanen, Pekka; Kähkönen, Erkki; Reijula, Kari
2002-03-01
This investigation focuses on the psychological and social dimensions of managing and solving indoor air problems. The data were collected in nine workplaces by interviews (n = 85) and questionnaires (n = 375). Indoor air problems in office environments have traditionally been addressed with industrial hygiene or technical expertise. However, indoor air problems at workplaces are often more complex issues to solve. Technical questions are inter-related with the dynamics of the work community, and the cooperation and interaction skills of the parties involved in the solving process are also put to the test. In the present study, the interviewees were very critical of the process of solving the indoor air problem. The responsibility for coordinating the problem-managing process was generally considered vague, as were the roles and functions of the various parties. Communication problems occurred and rumors about the indoor air problem circulated widely. Conflicts were common, complicating the process in several ways. The research focused on examining different ways of managing and resolving an indoor air problem. In addition, reference material on the causal factors of the indoor air problem was also acquired. The study supported the hypothesis that psychosocial factors play a significant role in indoor air problems.
Study designs appropriate for the workplace.
Hogue, C J
1986-01-01
Carlo and Hearn have called for "refinement of old [epidemiologic] methods and an ongoing evaluation of where methods fit in the overall scheme as we address the multiple complexities of reproductive hazard assessment." This review is an attempt to bring together the current state-of-the-art methods for problem definition and hypothesis testing available to the occupational epidemiologist. For problem definition, meta-analysis can be utilized to narrow the field of potential causal hypotheses. Passive and active surveillance may further refine issues for analytic research. Within analytic epidemiology, several methods may be appropriate for the workplace setting. Those discussed here may be used to estimate the risk ratio in either a fixed or dynamic population.
NASA Astrophysics Data System (ADS)
Wisniewski, Nicholas Andrew
This dissertation is divided into two parts. First, we present an exact solution to a generalization of the Behrens-Fisher problem by embedding the problem in the Riemannian manifold of Normal distributions. From this we construct a geometric hypothesis testing scheme. Second, we investigate the most commonly used geometric methods employed in tensor field interpolation for DT-MRI analysis and cardiac computer modeling. We computationally investigate a class of physiologically motivated orthogonal tensor invariants, both at the full tensor field scale and at the scale of a single interpolation, by performing a decimation/interpolation experiment. We show that Riemannian-based methods give the best results in preserving desirable physiological features.
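As background for the first part, the Fisher-Rao (information-geometric) distance between two univariate normal distributions has a standard closed form; the sketch below is that textbook result, not the dissertation's own derivation or code.

```python
import numpy as np

def fisher_rao_normal(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao geodesic distance between N(mu1, sigma1^2) and N(mu2, sigma2^2),
    i.e., the distance induced by the Fisher information metric on the manifold
    of univariate normals (a scaled hyperbolic half-plane)."""
    num = (mu1 - mu2) ** 2 / 2 + (sigma1 - sigma2) ** 2
    return np.sqrt(2) * np.arccosh(1 + num / (2 * sigma1 * sigma2))

print(fisher_rao_normal(0.0, 1.0, 1.0, 1.5))   # distance between two example normals
```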
Mancini, Vincent O.; Rigoli, Daniela; Cairney, John; Roberts, Lynne D.; Piek, Jan P.
2016-01-01
Poor motor skills have been shown to be associated with a range of psychosocial issues, including internalizing problems (anxiety and depression). Although this association is well documented empirically, our understanding of why it occurs remains theoretically underdeveloped. The Elaborated Environmental Stress Hypothesis by Cairney et al. (2013) provides a promising framework that seeks to explain the association between motor skills and internalizing problems, specifically in children with developmental coordination disorder (DCD). The framework posits that poor motor skills predispose individuals to the development of internalizing problems via interactions with intermediary environmental stressors. At the time the model was proposed, limited direct evidence was available to support or refute the framework. Several studies and developments related to the framework have since been published. This mini-review seeks to provide an up-to-date overview of recent developments related to the Elaborated Environmental Stress Hypothesis. We briefly discuss the past research that led to its development, before moving to studies that have investigated the framework since it was proposed. Although the model was originally developed within the context of DCD in childhood, recent studies have found support for it in community samples. Through the reviewed literature, this article provides support for the Elaborated Environmental Stress Hypothesis as a promising theoretical framework that explains the psychosocial correlates across the broader spectrum of motor ability. However, given its recent conceptualization, ongoing evaluation of the Elaborated Environmental Stress Hypothesis is recommended. PMID:26941690
P value and the theory of hypothesis testing: an explanation for new researchers.
Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël
2010-03-01
In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability to obtain an effect equal to or more extreme than the one observed presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
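A toy numerical contrast between the two viewpoints (an assumed one-sample z-test; the numbers are not from the article): Fisher reports the p value as graded evidence, while Neyman-Pearson fixes alpha in advance and checks whether the statistic falls in the critical region.

```python
from scipy.stats import norm

z_obs = 2.1                                    # observed standardized test statistic
p_value = 2 * (1 - norm.cdf(abs(z_obs)))       # two-sided p value (Fisher's evidence measure)

alpha = 0.05
z_crit = norm.ppf(1 - alpha / 2)               # boundary of the Neyman-Pearson critical region
reject_h0 = abs(z_obs) > z_crit                # decision at the pre-set Type I error rate

print(round(p_value, 4), reject_h0)            # ~0.0357, True
```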
Reflections on the "gesture-first" hypothesis of language origins.
Kendon, Adam
2017-02-01
The main lines of evidence taken as support for the "gesture-first" hypothesis of language origins are briefly evaluated, and the problem that speech poses for this hypothesis is discussed. I conclude that language must have evolved in the oral-aural and kinesic modalities together, with neither modality taking precedence over the other.
Linking Family Characteristics with Poor Peer Relations: The Mediating Role of Conduct Problems
Bierman, Karen Linn; Smoot, David L.
2012-01-01
Parent, teacher, and peer ratings were collected for 75 grade school boys to test the hypothesis that certain family interaction patterns would be associated with poor peer relations. Path analyses provided support for a mediational model, in which punitive and ineffective discipline was related to child conduct problems in home and school settings which, in turn, predicted poor peer relations. Further analyses suggested that distinct subgroups of boys could be identified who exhibited conduct problems at home only, at school only, in both settings, or in neither setting. Boys who exhibited cross-situational conduct problems were more likely to experience multiple concurrent problems (e.g., in both home and school settings) and were more likely than any other group to experience poor peer relations. However, only about one-third of the boys with poor peer relations in this sample exhibited problem profiles consistent with the proposed model (e.g., experienced high rates of punitive/ineffective home discipline and exhibited conduct problems in home and school settings), suggesting that the proposed model reflects one common (but not exclusive) pathway to poor peer relations. PMID:1865049
Teaching Hypothesis Testing by Debunking a Demonstration of Telepathy.
ERIC Educational Resources Information Center
Bates, John A.
1991-01-01
Discusses a lesson designed to demonstrate hypothesis testing to introductory college psychology students. Explains that a psychology instructor demonstrated apparent psychic abilities to students. Reports that students attempted to explain the instructor's demonstrations through hypothesis testing and revision. Provides instructions on performing…
Lash, Ayhan Aytekin; Plonczynski, Donna J; Sehdev, Amikar
2011-01-01
To compare the inclusion and the influences of selected variables on hypothesis testing during the 1980s and 1990s. In spite of the emphasis on conducting inquiry consistent with the tenets of logical positivism, there have been no studies investigating the frequency and patterns of hypothesis testing in nursing research. The sample was obtained from the journal Nursing Research, which had the highest circulation of any research journal during the period under study. All quantitative studies published during the two decades, including briefs and historical studies, were included in the analyses. A retrospective design was used to select the sample. Five years from each of the 1980s and 1990s were randomly selected from the journal Nursing Research. Of the 582 studies, 517 met inclusion criteria. Findings suggest that there has been a decline in the use of hypothesis testing in the last decades of the 20th century. Further research is needed to identify the factors that influence the conduct of research with hypothesis testing. Hypothesis testing in nursing research showed a steady decline from the 1980s to the 1990s. Research purposes of explanation and prediction/control increased the likelihood of hypothesis testing. Hypothesis testing strengthens the quality of quantitative studies, increases the generality of findings, and provides dependable knowledge. This is particularly true for quantitative studies that aim to explore, explain, and predict/control phenomena and/or test theories. The findings also have implications for doctoral programmes, the research preparation of nurse-investigators, and theory testing.
A shift from significance test to hypothesis test through power analysis in medical research.
Singh, G
2006-01-01
Until recently, the medical research literature exhibited substantial dominance of Fisher's significance test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test approach, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant on the basis of a P value. The Neyman-Pearson approach talks of acceptance or rejection of the null hypothesis. Based on the same underlying theory, these two approaches address the same objective and reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance test approach, when it incorporates power analysis, contains the essence of the hypothesis test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
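As an example of the power calculations mentioned above, the snippet below uses statsmodels to find the sample size per group for a two-sample t test; the effect size, alpha, and target power are assumed values for illustration.

```python
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # assumed standardized (Cohen's d) effect
    alpha=0.05,               # Type I error rate
    power=0.80,               # 1 - Type II error rate
    alternative="two-sided",
)
print(round(n_per_group))     # roughly 64 subjects per group
```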
2011-01-01
Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high throughput genomic hypothesis testing requires both the capability of obtaining semantically relevant experimental data and that of performing relevant statistical testing on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses waiting for high throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases with semantic web technologies. Data were represented in the Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQL queries to reflect the semantic structures of the hypotheses, and (3) performing statistical tests with the result sets returned by the SPARQL queries. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high throughput biological hypothesis testing. We believe that preliminary investigation before performing highly controlled experiments can benefit from this approach. PMID:21342584
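A hedged sketch of the pattern described (SPARQL retrieval followed by an ordinary statistical test), using rdflib in Python. The graph file, namespace, predicate names, and the choice of a Mann-Whitney test are all hypothetical; Xperanto-RDF's actual schema and tests may differ.

```python
import rdflib
from scipy.stats import mannwhitneyu

g = rdflib.Graph()
g.parse("tma_experiments.rdf")          # hypothetical RDF export of the TMA database

query = """
PREFIX ex: <http://example.org/tma#>
SELECT ?score ?status WHERE {
    ?core ex:stainingScore ?score ;
          ex:nodalStatus   ?status .
}
"""
rows = [(float(score), str(status)) for score, status in g.query(query)]
positive = [s for s, st in rows if st == "positive"]
negative = [s for s, st in rows if st == "negative"]
print(mannwhitneyu(positive, negative))  # test the hypothesis on the retrieved data
```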
The evolution of galaxies. III - Metal-enhanced star formation
NASA Technical Reports Server (NTRS)
Talbot, R. J., Jr.; Arnett, W. D.
1973-01-01
The problem of the paucity of low-metal-abundance low-mass stars is discussed. One alternative to the variable-initial-mass-function (VIMF) solution is proposed. It is shown that this solution - metal-enhanced star formation - satisfies the classical test which prompted the VIMF hypothesis. Furthermore, with no additional parameters it provides improved fits to other tests - e.g., inhomogeneities in the abundances in young stars, concordance of all nucleo-cosmochronologies, and a required yield of heavy-element production which is consistent with current stellar evolution theory. In this model the age of the Galaxy is 18.6 plus or minus 5.7 b.y.
Brouwers, Livia A M; Engels, Josephine A; Heerkens, Yvonne F; van der Beek, Allard J
2015-06-16
Most validated sustainable employability questionnaires are extensive and difficult to obtain. Our objective was to develop a usable and valid tool, a Vitality Scan, to determine possible signs of stagnation in one's functioning related to sustainable employability and to establish the instrument's internal consistency and construct validity. A literature review was performed and expert input was obtained to develop an online survey of 31 items. A sample of 1722 Dutch employees was recruited. Internal consistency was assessed by Cronbach's alpha. The underlying theoretical concepts were extracted by factor analysis using a principal component method. For construct validity, a priori hypotheses were defined for expected differences between known subgroups: 1) older workers would report more stagnation than younger workers, and 2) less educated workers would report more problems than the highly educated ones. Both hypotheses were statistically tested using ANOVA. Internal consistency measures and factor analysis resulted in five subscales with acceptable to good reliability (Cronbach's alpha 0.72-0.87). These subscales included: balance and competence, motivation and involvement, resilience, mental and physical health, and social support at work. Three items were removed following these analyses. In accordance with our a priori hypothesis 1, the ANOVA showed that older workers reported the most problems, while younger workers reported the least problems. However, hypothesis 2 was not confirmed: no significant differences were found for education level. The developed Vitality Scan, with the 28 remaining items, showed good measurement properties. It is applicable as a user-friendly, evaluative instrument for workers' sustainable employability. The scan's value for determining whether or not the employee is at risk for a decrease in functioning during present and future work should be further tested.
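For reference, the internal-consistency measure used above can be computed directly; the sketch below implements the standard Cronbach's alpha formula for one subscale (the item matrix and its shape are assumptions, not the survey data).

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of scores for one subscale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)
```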
The resolution of point sources of light as analyzed by quantum detection theory
NASA Technical Reports Server (NTRS)
Helstrom, C. W.
1972-01-01
The resolvability of point sources of incoherent light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.
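For context, the minimum error probability for deciding between two quantum states with prior probabilities p0 and p1 is given by the standard Helstrom bound; the snippet computes it for arbitrary density matrices and is a textbook result, not the specific aperture-field calculation summarized above.

```python
import numpy as np

def helstrom_error(rho0, rho1, p0=0.5):
    """Minimum error probability P_e = (1 - ||p1*rho1 - p0*rho0||_1) / 2,
    where ||.||_1 is the trace norm (sum of |eigenvalues| for Hermitian matrices)."""
    gamma = (1 - p0) * np.asarray(rho1) - p0 * np.asarray(rho0)
    trace_norm = np.abs(np.linalg.eigvalsh(gamma)).sum()
    return 0.5 * (1 - trace_norm)

# Two nearly overlapping pure states => error probability close to 1/2
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
psi = np.array([0.9, np.sqrt(1 - 0.81)])
rho1 = np.outer(psi, psi)
print(helstrom_error(rho0, rho1))
```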
Brody, Gene H; Ge, Xiaojia; Kim, Su Yeong; Murry, Velma McBride; Simons, Ronald L; Gibbons, Frederick X; Gerrard, Meg; Conger, Rand D
2003-04-01
Data from 296 sibling pairs (mean ages 10 and 13 years), their primary caregivers, and census records were used to test the hypothesis that African American children's likelihood of developing conduct problems associated with harsh parenting, a lack of nurturant-involved parenting, and exposure to an older sibling's deviance-prone attitudes and behavior would be amplified among families residing in disadvantaged neighborhoods. A latent construct representing harsh-inconsistent parenting and low levels of nurturant-involved parenting was positively associated with younger siblings' conduct disorder symptoms, as were older siblings' problematic attitudes and behavior. These associations were strongest among families residing in the most disadvantaged neighborhoods. Future research and prevention programs should focus on the specific neighborhood processes associated with increased vulnerability for behavior problems.
Towards a Validation of the Three Pathways Model of Pathological Gambling.
Valleur, Marc; Codina, Irène; Vénisse, Jean-Luc; Romo, Lucia; Magalon, David; Fatséas, Mélina; Chéreau-Boudet, Isabelle; Gorsane, Mohamed-Ali; Guilleux, Alice; Grall-Bronnec, Marie; Challet-Bouju, Gaëlle
2016-06-01
With the aim of validating the three pathways hypothesis of pathological gambling (Blaszczynski and Nower in Addiction 97:487-499, 2002), 372 pathological gamblers meeting DSM-IV (2000) criteria were assessed via a structured clinical interview as well as being subjected to personality tests and evaluation of their gambling practices. Our results show that it is possible to identify three subgroups corresponding to the three pathways: behaviourally conditioned problem gamblers, emotionally vulnerable problem gamblers and antisocial impulsivist problem gamblers. Our results particularly demonstrate that impulsivist gamblers preferentially choose semi-skilful gambling (horse racing and sports gambling) whereas emotionally vulnerable gamblers are significantly more attracted to games of chance (one-armed bandits, scratch cards, etc.). This led us to propose a functional presentation of the three pathways model which differs somewhat from the Blaszczynski and Nower presentation.
The adaptive value of tool-aided defense against wild animal attacks.
Crabb, Peter B; Elizaga, Andrew
2008-01-01
Throughout history humans have faced the persistent threat of attacks by wild animals, and how humans respond to this problem can make the difference between survival and death. In theory, the use of tools to fend off animal attacks would be more effective than resisting bare-handed, yet evidence for the advantage of tool-aided defense is scarce and equivocal. Two studies of news accounts of wild animal attacks against humans were conducted to test the hypothesis that tool-aided defense is indeed associated with reductions in injuries and deaths. Results of both Study 1 (N=172) and Study 2 (N=370) supported the hypothesis. The observed survival advantage of tool-aided defense for modern humans suggests that this tactic also would have worked for human ancestors who lived more closely to dangerous wild animals. 2008 Wiley-Liss, Inc.
Extended target recognition in cognitive radar networks.
Wei, Yimin; Meng, Huadong; Liu, Yimin; Wang, Xiqin
2010-01-01
We address the problem of adaptive waveform design for extended target recognition in cognitive radar networks. A closed-loop active target recognition radar system is extended to the case of a centralized cognitive radar network, in which a generalized likelihood ratio (GLR) based sequential hypothesis testing (SHT) framework is employed. Using Doppler velocities measured by multiple radars, the target aspect angle for each radar is calculated. The joint probability of each target hypothesis is then updated using observations from different radar line of sights (LOS). Based on these probabilities, a minimum correlation algorithm is proposed to adaptively design the transmit waveform for each radar in an amplitude fluctuation situation. Simulation results demonstrate performance improvements due to the cognitive radar network and adaptive waveform design. Our minimum correlation algorithm outperforms the eigen-waveform solution and other non-cognitive waveform design approaches.
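As a minimal illustration of the sequential hypothesis testing idea underlying such a framework, the sketch below implements Wald's SPRT for two Gaussian mean hypotheses in Python; it is a generic toy, not the paper's GLR-based multi-radar recognition scheme, and all parameter names and values are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald sequential probability ratio test between two Gaussian means.

    Accumulates the log-likelihood ratio sample by sample and stops as soon
    as it crosses one of the Wald thresholds. Returns the decision
    ('H0', 'H1', or 'undecided') and the number of samples consumed.
    """
    a = np.log(beta / (1 - alpha))   # lower threshold -> accept H0
    b = np.log((1 - beta) / alpha)   # upper threshold -> accept H1
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return "undecided", len(samples)

rng = np.random.default_rng(0)
decision, n_used = sprt(rng.normal(1.0, 1.0, size=200), mu0=0.0, mu1=1.0, sigma=1.0)
print(decision, n_used)
```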
Inference for local autocorrelations in locally stationary models.
Zhao, Zhibiao
2015-04-01
For non-stationary processes, the time-varying correlation structure provides useful insights into the underlying model dynamics. We study estimation and inferences for local autocorrelation process in locally stationary time series. Our constructed simultaneous confidence band can be used to address important hypothesis testing problems, such as whether the local autocorrelation process is indeed time-varying and whether the local autocorrelation is zero. In particular, our result provides an important generalization of the R function acf() to locally stationary Gaussian processes. Simulation studies and two empirical applications are developed. For the global temperature series, we find that the local autocorrelations are time-varying and have a "V" shape during 1910-1960. For the S&P 500 index, we conclude that the returns satisfy the efficient-market hypothesis whereas the magnitudes of returns show significant local autocorrelations.
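A crude way to visualize a time-varying autocorrelation is a rolling-window estimate, sketched below in Python; the paper's estimator and simultaneous confidence band are substantially more refined, so this is only a rough stand-in with assumed window and lag settings.

```python
import numpy as np

def local_acf(x, lag=1, window=201):
    """Rolling-window lag-`lag` autocorrelation as a crude local estimate."""
    half = window // 2
    x = np.asarray(x, dtype=float)
    out = np.full(x.size, np.nan)
    for t in range(half, x.size - half):
        seg = x[t - half:t + half + 1]
        out[t] = np.corrcoef(seg[:-lag], seg[lag:])[0, 1]
    return out

rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(size=2000)) * 0.01 + rng.normal(size=2000)
rho_t = local_acf(y, lag=1, window=201)   # inspect how the lag-1 correlation drifts over time
```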
Attention Problems and Stability of WISC-IV Scores Among Clinically Referred Children.
Green Bartoi, Marla; Issner, Jaclyn Beth; Hetterscheidt, Lesley; January, Alicia M; Kuentzel, Jeffrey Garth; Barnett, Douglas
2015-01-01
We examined the stability of Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) scores among 51 diverse, clinically referred 8- to 16-year-olds (M(age) = 11.24 years, SD = 2.36). Children were referred to and tested at an urban, university-based training clinic; 70% of eligible children completed follow-up testing 12 months to 40 months later (M = 22.05, SD = 5.94). Stability for index scores ranged from .58 (Processing Speed) to .81 (Verbal Comprehension), with a stability of .86 for Full-Scale IQ. Subtest score stability ranged from .35 (Letter-Number Sequencing) to .81 (Vocabulary). Indexes believed to be more susceptible to concentration (Processing Speed and Working Memory) had lower stability. We also examined attention problems as a potential moderating factor of WISC-IV index and subtest score stability. Children with attention problems had significantly lower stability for Digit Span and Matrix Reasoning subtests compared with children without attention problems. These results provide support for the temporal stability of the WISC-IV and also provide some support for the idea that attention problems contribute to children producing less stable IQ estimates when completing the WISC-IV. We hope our report encourages further examination of this hypothesis and its implications.
Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dale; Selby, Neil
2012-08-14
Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The commonly used standard error in the Ms:mb event screening hypothesis test is not fully consistent with the physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
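For readers unfamiliar with the combination rules named here, the sketch below implements Fisher's and Tippett's methods for combining independent p-values in Python; the numeric p-values are illustrative only and are not taken from the seismic screening analysis.

```python
import numpy as np
from scipy.stats import chi2

def fisher_combined(pvals):
    """Fisher's method: X = -2 * sum(log p_i) ~ chi-square with 2k df under H0."""
    pvals = np.asarray(pvals, dtype=float)
    stat = -2.0 * np.sum(np.log(pvals))
    return stat, chi2.sf(stat, df=2 * pvals.size)

def tippett_combined(pvals):
    """Tippett's method: base the decision on the smallest p-value;
    the combined p-value is 1 - (1 - min p)^k under independence."""
    pvals = np.asarray(pvals, dtype=float)
    return pvals.min(), 1.0 - (1.0 - pvals.min()) ** pvals.size

# e.g. combining p-values from two single-phenomenology screens (illustrative numbers)
print(fisher_combined([0.03, 0.20]))
print(tippett_combined([0.03, 0.20]))
```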
Daws, Richard E.; Hampshire, Adam
2017-01-01
It is well established that religiosity correlates inversely with intelligence. A prominent hypothesis states that this correlation reflects behavioral biases toward intuitive problem solving, which causes errors when intuition conflicts with reasoning. We tested predictions of this hypothesis by analyzing data from two large-scale Internet-cohort studies (combined N = 63,235). We report that atheists surpass religious individuals in terms of reasoning but not working-memory performance. The religiosity effect is robust across sociodemographic factors including age, education and country of origin. It varies significantly across religions and this co-occurs with substantial cross-group differences in religious dogmatism. Critically, the religiosity effect is strongest for tasks that explicitly manipulate conflict; more specifically, atheists outperform the most dogmatic religious group by a substantial margin (0.6 standard deviations) during a color-word conflict task but not during a challenging matrix-reasoning task. These results support the hypothesis that behavioral biases rather than impaired general intelligence underlie the religiosity effect. PMID:29312057
Inflexible Minds: Impaired Attention Switching in Recent-Onset Schizophrenia
Smid, Henderikus G. O. M.; Martens, Sander; de Witte, Marc R.; Bruggeman, Richard
2013-01-01
Impairment of sustained attention is assumed to be a core cognitive abnormality in schizophrenia. However, this seems inconsistent with a recent hypothesis that in schizophrenia the implementation of selection (i.e., sustained attention) is intact but the control of selection (i.e., switching the focus of attention) is impaired. Mounting evidence supports this hypothesis, indicating that switching of attention is a bigger problem in schizophrenia than maintaining the focus of attention. To shed more light on this hypothesis, we tested whether schizophrenia patients are impaired relative to controls in sustaining attention, switching attention, or both. Fifteen patients with recent-onset schizophrenia and fifteen healthy volunteers, matched on age and intelligence, performed sustained attention and attention switching tasks, while performance and brain potential measures of selective attention were recorded. In the sustained attention task, patients did not differ from the controls on these measures. In the attention switching task, however, patients showed worse performance than the controls, and early selective attention related brain potentials were absent in the patients while clearly present in the controls. These findings support the hypothesis that schizophrenia is associated with an impairment of the mechanisms that control the direction of attention (attention switching), while the mechanisms that implement a direction of attention (sustained attention) are intact. PMID:24155980
Environmental Kuznets Curve Hypothesis: A Perspective of Sustainable Development in Indonesia
NASA Astrophysics Data System (ADS)
Nuansa, Citrasmara Galuh; Widodo, Wahyu
2018-02-01
Sustainable development, with its three main pillars of environment, economy, and society, is a concept of national development aimed at achieving inclusive economic growth, good environmental quality, and improved welfare. However, the dominance of economic factors causes various environmental problems. This phenomenon occurs in most developing countries, including Indonesia. The relationship between economic activity and environmental quality has been widely discussed and empirically tested by scholars. This descriptive study analysed the Environmental Kuznets Curve (EKC) hypothesis from a perspective of sustainable development in Indonesia. The EKC hypothesis posits that the relationship between economic growth and environmental degradation forms an inverted U-curve: at the beginning of development, environmental quality decreases as economic growth increases, and after a certain point environmental quality gradually improves. This paper discusses how the relationship between environmental quality and economic growth in Indonesia has been investigated. The preliminary results show that most empirical studies use the conventional approach, in which CO2 emissions are used as the proxy for environmental degradation. The existence of an inverted U-curve is also inconclusive. Therefore, extended research on the relationship between economic growth and environmental quality in Indonesia using the EKC hypothesis is required.
Benbassat, Jochanan
2018-02-24
Undergraduate clinical education follows the "bedside" tradition that exposes students to inpatients. However, the hospital learning environment has two main limitations. First, most inpatients require acute care, and students may complete their training without seeing patients with frequent non-emergent and chronic diseases that are managed in outpatient settings. Second, students rarely cope with diagnostic problems, because most inpatients are diagnosed in the community or the emergency room. These limitations have led some medical schools to offer longitudinal integrated clerkships in community settings instead of hospital block clerkship rotations. In this paper, I propose the hypothesis that the hospital learning environment has a third limitation: it causes students' distress and delays their development of reflectivity and medical professionalism. This hypothesis is supported by evidence that (a) the clinical learning environment, rather than students' personality traits, is the major driver of students' distress, and (b) the development of attributes, such as moral reasoning, empathy, emotional intelligence and tolerance of uncertainty that are included in the definitions of both reflectivity and medical professionalism, is arrested during undergraduate medical training. Future research may test the proposed hypothesis by comparing students' development of these attributes during clerkships in hospital wards with that during longitudinal clerkships in community settings.
Cummings, E Mark; Schermerhorn, Alice C; Merrilees, Christine E; Goeke-Morey, Marcie C; Shirlow, Peter; Cairns, Ed
2010-07-01
Moving beyond simply documenting that political violence negatively impacts children, we tested a social-ecological hypothesis for relations between political violence and child outcomes. Participants were 700 mother-child (M = 12.1 years, SD = 1.8) dyads from 18 working-class, socially deprived areas in Belfast, Northern Ireland, including single- and two-parent families. Sectarian community violence was associated with elevated family conflict and children's reduced security about multiple aspects of their social environment (i.e., family, parent-child relations, and community), with links to child adjustment problems and reductions in prosocial behavior. By comparison, and consistent with expectations, links with negative family processes, child regulatory problems, and child outcomes were less consistent for nonsectarian community violence. Support was found for a social-ecological model for relations between political violence and child outcomes among both single- and two-parent families, with evidence that emotional security and adjustment problems were more negatively affected in single-parent families. The implications for understanding social ecologies of political violence and children's functioning are discussed.
Nature's style: Naturally trendy
Cohn, T.A.; Lins, H.F.
2005-01-01
Hydroclimatological time series often exhibit trends. While trend magnitude can be determined with little ambiguity, the corresponding statistical significance, sometimes cited to bolster scientific and political argument, is less certain because significance depends critically on the null hypothesis which in turn reflects subjective notions about what one expects to see. We consider statistical trend tests of hydroclimatological data in the presence of long-term persistence (LTP). Monte Carlo experiments employing FARIMA models indicate that trend tests which fail to consider LTP greatly overstate the statistical significance of observed trends when LTP is present. A new test is presented that avoids this problem. From a practical standpoint, however, it may be preferable to acknowledge that the concept of statistical significance is meaningless when discussing poorly understood systems.
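The inflation effect can be reproduced in miniature with a Monte Carlo experiment like the sketch below, which uses an AR(1) process as a simple short-memory stand-in for the FARIMA/LTP models in the paper and checks how often a naive OLS trend t-test rejects on trend-free but persistent series; all settings are assumptions chosen for illustration.

```python
import numpy as np
from scipy import stats

def ar1_series(n, phi, rng):
    """Trend-free AR(1) series, a simplified stand-in for long-term persistence."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def rejection_rate(n=100, phi=0.6, trials=2000, alpha=0.05, seed=0):
    """Fraction of trend-free persistent series a naive OLS trend t-test calls 'significant'."""
    rng = np.random.default_rng(seed)
    t_axis = np.arange(n)
    hits = 0
    for _ in range(trials):
        y = ar1_series(n, phi, rng)
        hits += stats.linregress(t_axis, y).pvalue < alpha
    return hits / trials

print(rejection_rate())   # typically well above the nominal 0.05 level
```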
Zhang, Fanghong; Miyaoka, Etsuo; Huang, Fuping; Tanaka, Yutaka
2015-01-01
The problem of establishing noninferiority of a new treatment relative to a standard (control) treatment with ordinal categorical data is discussed. A measure of treatment effect is used, and a method of specifying a noninferiority margin for the measure is provided. Two Z-type test statistics are proposed in which the variance estimate is constructed under the shifted null hypothesis using U-statistics. Furthermore, the confidence interval and the sample size formula are given based on the proposed test statistics. The proposed procedure is applied to a dataset from a clinical trial. A simulation study is conducted to compare the performance of the proposed test statistics with that of existing ones, and the results show that the proposed test statistics are better in terms of deviation from the nominal level and power.
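The "shifted null" logic can be illustrated in a simpler setting than the paper's ordinal effect measure: the sketch below runs a one-sided noninferiority Z test for a difference in proportions with a hypothetical margin; the paper's statistics instead use U-statistic variance estimates under the shifted null.

```python
import numpy as np
from scipy.stats import norm

def noninferiority_z(p_new, n_new, p_ctl, n_ctl, margin):
    """One-sided Z test of H0: p_new - p_ctl <= -margin
    against H1: p_new - p_ctl > -margin (new treatment not unacceptably worse).
    A small p-value supports the noninferiority claim."""
    diff = p_new - p_ctl
    se = np.sqrt(p_new * (1 - p_new) / n_new + p_ctl * (1 - p_ctl) / n_ctl)
    z = (diff + margin) / se
    return z, norm.sf(z)

z, p = noninferiority_z(p_new=0.78, n_new=150, p_ctl=0.80, n_ctl=150, margin=0.10)
print(z, p)
```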
ON THE SUBJECT OF HYPOTHESIS TESTING
Ugoni, Antony
1993-01-01
In this paper, the definition of a statistical hypothesis is discussed, along with the considerations that need to be addressed when testing a hypothesis. In particular, the p-value, significance level, and power of a test are reviewed. Finally, the often quoted confidence interval is given a brief introduction. PMID:17989768
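As a small companion to the quantities reviewed there, the sketch below computes the power of a two-sided one-sample Z test for an assumed effect size, standard deviation, and sample size; the numbers are arbitrary examples, not values from the paper.

```python
from scipy.stats import norm

def z_test_power(delta, sigma, n, alpha=0.05):
    """Power of a two-sided one-sample Z test when the true mean shift is `delta`."""
    z_crit = norm.ppf(1 - alpha / 2)
    shift = delta / (sigma / n ** 0.5)
    return norm.cdf(-z_crit + shift) + norm.cdf(-z_crit - shift)

print(z_test_power(delta=0.5, sigma=1.0, n=30))   # roughly 0.78
```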
Some consequences of using the Horsfall-Barratt scale for hypothesis testing
USDA-ARS?s Scientific Manuscript database
Comparing treatment effects by hypothesis testing is a common practice in plant pathology. Nearest percent estimates (NPEs) of disease severity were compared to Horsfall-Barratt (H-B) scale data to explore whether there was an effect of assessment method on hypothesis testing. A simulation model ba...
Hypothesis Testing in Task-Based Interaction
ERIC Educational Resources Information Center
Choi, Yujeong; Kilpatrick, Cynthia
2014-01-01
Whereas studies show that comprehensible output facilitates L2 learning, hypothesis testing has received little attention in Second Language Acquisition (SLA). Following Shehadeh (2003), we focus on hypothesis testing episodes (HTEs) in which learners initiate repair of their own speech in interaction. In the context of a one-way information gap…
Classroom-Based Strategies to Incorporate Hypothesis Testing in Functional Behavior Assessments
ERIC Educational Resources Information Center
Lloyd, Blair P.; Weaver, Emily S.; Staubitz, Johanna L.
2017-01-01
When results of descriptive functional behavior assessments are unclear, hypothesis testing can help school teams understand how the classroom environment affects a student's challenging behavior. This article describes two hypothesis testing strategies that can be used in classroom settings: structural analysis and functional analysis. For each…
Hypothesis Testing in the Real World
ERIC Educational Resources Information Center
Miller, Jeff
2017-01-01
Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…
Hegarty, Peter
2017-01-01
Drawing together social psychologists' concerns with equality and cognitive psychologists' concerns with scientific inference, 6 studies (N = 841) showed how implicit category norms make the generation and testing of hypotheses about race highly asymmetric. Having shown that Whiteness is the default race of celebrity actors (Study 1), Study 2 used a variant of Wason's (1960) rule discovery task to demonstrate greater difficulty in discovering rules that require specifying that race is shared by White celebrity actors than by Black celebrity actors. Clues to the Whiteness of White actors from analogous problems had little effect on hypothesis formation or rule discovery (Studies 3 and 4). Rather, across Studies 2 and 4, feedback about negative cases (non-White celebrities) facilitated the discovery that White actors shared a race, whether participants or experimenters generated the negative cases. These category norms were little affected by making White actors' Whiteness more informative (Study 5). Although participants understood that discovering that White actors are White would be harder than discovering that Black actors are Black, they showed limited insight into the information contained in negative cases (Study 6). Category norms render some identities as implicit defaults, making hypothesis formation and generalization about real social groups asymmetric in ways that have implications for scientific reasoning and social equality. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
PTSD, alcohol dependence, and conduct problems: Distinct pathways via lability and disinhibition.
Simons, Jeffrey S; Simons, Raluca M; O'Brien, Carol; Stoltenberg, Scott F; Keith, Jessica A; Hudson, Jaime A
2017-01-01
This study tested the role of affect lability and disinhibition in mediating associations between PTSD symptoms and two forms of alcohol-related problems, dependence syndrome symptoms (e.g., impaired control over consumption) and conduct problems (e.g., assault, risk behaviors). Genotype at the serotonin transporter linked polymorphic region (5-HTTLPR) was hypothesized to moderate associations between traumatic stress and PTSD symptoms. In addition, the study tested whether childhood traumatic stress moderated associations between combat trauma and PTSD symptoms. Participants were 270 OIF/OEF/OND veterans. The hypothesized model was largely supported. Participants with the low expression alleles of 5-HTTLPR (S or LG) exhibited stronger associations between childhood (but not combat) traumatic stress and PTSD symptoms. Affect lability mediated the associations between PTSD symptoms and alcohol dependence symptoms. Behavioral disinhibition mediated associations between PTSD symptoms and conduct-related problems. Conditional indirect effects indicated stronger associations between childhood traumatic stress and lability, behavioral disinhibition, alcohol consumption, AUD symptoms, and associated conduct problems via PTSD symptoms among those with the low expression 5-HTTLPR alleles. However, interactions between combat trauma and either childhood trauma or genotype were not significant. The results support the hypothesis that affect lability and behavioral disinhibition are potential intermediate traits with distinct associations with AUD and associated externalizing problems. Copyright © 2016 Elsevier Ltd. All rights reserved.
Executive functions, parental punishment, and aggression: Direct and moderated relations.
Fatima, Shameem; Sharif, Imran
2017-12-01
The main focus of the current study was to assess whether executive functions (EFs) moderate the effect of parental punishment on adolescent aggression. The sample comprised 370 participants (53% girls, 47% boys) enrolled at secondary and higher secondary levels, ranging in age from 13 to 19 years (M = 15.5, SD = 1.3). Participants were assessed on a self-report measure of aggression and two punishment measures, in addition to a demographic sheet. They were then individually assessed on four tests taken from the Delis-Kaplan Executive Function System (D-KEFS), namely the Trail Making Test (TMT), Design Fluency Test (DFT), Color Word Interference Test (CWIT), and Card Sorting Test (CST), to assess cognitive flexibility, nonverbal fluency, inhibition, and problem-solving ability, respectively. Correlation coefficients indicated that all four executive functioning measures and the two punishment measures were significantly correlated with aggression. Moderation analysis indicated that all EFs moderated the relationship between physical punishment and aggression, and that only inhibition and problem-solving ability, but not cognitive flexibility and nonverbal fluency, moderated the relationship between symbolic punishment and aggression. The findings support the hypothesis that EFs are protective personal factors that promote healthy adolescent adjustment in the presence of challenging environmental factors.
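Moderation of this kind is commonly tested by adding an interaction term to a regression model. The sketch below, on synthetic data with hypothetical variable names (punishment, inhibition, aggression), shows the general form of such a test; it is not a reanalysis of the study's D-KEFS data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data; coefficients and variable names are assumptions for illustration.
rng = np.random.default_rng(2)
n = 370
df = pd.DataFrame({
    "punishment": rng.normal(size=n),
    "inhibition": rng.normal(size=n),
})
df["aggression"] = (0.4 * df["punishment"]
                    - 0.3 * df["punishment"] * df["inhibition"]
                    + rng.normal(size=n))

# 'punishment * inhibition' expands to both main effects plus their product;
# a significant product (interaction) term is what "EF moderates the effect" means.
model = smf.ols("aggression ~ punishment * inhibition", data=df).fit()
print(model.summary())
```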
Robust operative diagnosis as problem solving in a hypothesis space
NASA Technical Reports Server (NTRS)
Abbott, Kathy H.
1988-01-01
This paper describes an approach that formulates diagnosis of physical systems in operation as problem solving in a hypothesis space. Such a formulation increases robustness by: (1) incremental hypotheses construction via dynamic inputs, (2) reasoning at a higher level of abstraction to construct hypotheses, and (3) partitioning the space by grouping fault hypotheses according to the type of physical system representation and problem solving techniques used in their construction. It was implemented for a turbofan engine and hydraulic subsystem. Evaluation of the implementation on eight actual aircraft accident cases involving engine faults provided very promising results.
ERIC Educational Resources Information Center
Kwon, Yong-Ju; Jeong, Jin-Su; Park, Yun-Bok
2006-01-01
The purpose of the present study was to test the hypothesis that student's abductive reasoning skills play an important role in the generation of hypotheses on pendulum motion tasks. To test the hypothesis, a hypothesis-generating test on pendulum motion, and a prior-belief test about pendulum motion were developed and administered to a sample of…
Brophy-Herb, Holly E; Bocknek, Erika London; Vallotton, Claire D; Stansbury, Kathy E; Senehi, Neda; Dalimonte-Merckling, Danielle; Lee, Young-Eun
2015-09-01
To test the hypothesis that toddlers at highest risk for behavioral problems from the most economically vulnerable families will benefit most from maternal talk about emotions. This study included 89 toddlers and mothers from low-income families. Behavioral problems were rated at 2 time points by masters-level trained Early Head Start home visiting specialists. Maternal emotion talk was coded from a wordless book-sharing task. Coding focused on mothers' emotion bridging, which included labeling emotions, explaining the context of emotions, noting the behavioral cues of emotions, and linking emotions to toddlers' own experiences. Maternal demographic risk reflected a composite score of 5 risk factors. A significant 3-way interaction between Time 1 toddler behavior problems, maternal emotion talk, and maternal demographic risk (p = .001) and examination of slope difference tests revealed that when maternal demographic risk was greater, more maternal emotion talk buffered associations between earlier and later behavior problems. Greater demographic risk and lower maternal emotion talk intensified Time 1 behavior problems as a predictor of Time 2 behavior problems. The model explained 54% of the variance in toddlers' Time 2 behavior problems. Analyses controlled for maternal warmth to better examine the unique contributions of emotion bridging to toddlers' behaviors. Toddlers at highest risk, those with more early behavioral problems from higher demographic-risk families, benefit the most from mothers' emotion talk. Informing parents about the use of emotion talk may be a cost-effective, simple strategy to support at-risk toddlers' social-emotional development and reduce behavioral problems.
Dimensions of disinhibited personality and their relation with alcohol use and problems
Gunn, Rachel L.; Finn, Peter R.; Endres, Michael J.; Gerst, Kyle R.; Spinola, Suzanne
2013-01-01
Although alcohol use disorders (AUDs) have been associated with different aspects of disinhibited personality and antisociality, less is known about the specific relationships among different domains of disinhibited personality, antisociality, alcohol use, and alcohol problems. The current study was designed to address three goals, (i) to provide evidence of a three-factor model of disinhibited personality (comprised of impulsivity [IMP], risk taking/ low harm avoidance [RTHA], excitement seeking [ES]), (ii) to test hypotheses regarding the association between each dimension and alcohol use and problems, and (iii) to test the hypothesis that antisociality (social deviance proneness [SDP]) accounts for the direct association between IMP and alcohol problems, while ES is directly related to alcohol use. Measures of disinhibited personality IMP, RTHA, ES and SDP and alcohol use and problems were assessed in a sample of young adults (N=474), which included a high proportion of individuals with AUDs. Confirmatory factor analyses supported a three-factor model of disinhibited personality reflecting IMP, RTHA, and ES. A structural equation model (SEM) showed that IMP was specifically associated with alcohol problems, while ES was specifically associated with alcohol use. In a second SEM, SDP accounted for the majority of the variance in alcohol problems associated with IMP. The results suggest aspects of IMP associated with SDP represent a direct vulnerability to alcohol problems. In addition, the results suggest that ES reflects a specific vulnerability to excessive alcohol use, which is then associated with alcohol problems, while RTHA is not specifically associated with alcohol use or problems when controlling for IMP and ES. PMID:23588138
Lu, Wei-Hsin; Lee, Kun-Hua; Ko, Chih-Hung; Hsiao, Ray C; Hu, Huei-Fan; Yen, Cheng-Fang
2017-09-01
Aim: To examine the relationship between borderline personality symptoms and Internet addiction as well as the mediating role of mental health problems between them. Methods: A total of 500 college students from Taiwan were recruited and assessed for symptoms of Internet addiction using the Chen Internet Addiction Scale, borderline personality symptoms using the Taiwanese version of the Borderline Symptom List, and mental health problems using four subscales from the Symptom Checklist-90-Revised Scale (interpersonal sensitivity, depression, anxiety, and hostility). Structural equation modeling (SEM) was used to test our hypothesis that borderline personality symptoms are associated with the severity of Internet addiction directly and also through the mediation of mental health problems. Results: SEM analysis revealed that all paths in the hypothesized model were significant, indicating that borderline personality symptoms were directly related to the severity of Internet addiction as well as indirectly related to the severity of Internet addiction by increasing the severity of mental health problems. Conclusion: Borderline personality symptoms and mental health problems should be taken into consideration when designing intervention programs for Internet addiction.
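A stripped-down version of the mediation logic tested here with SEM is the product-of-coefficients indirect effect with a bootstrap confidence interval. The Python sketch below uses synthetic data and hypothetical variable names (borderline, distress, internet); it is a simplified regression-based illustration, not the study's full SEM.

```python
import numpy as np
import statsmodels.api as sm

def indirect_effect(x, m, y):
    """Product-of-coefficients indirect effect a*b for the path x -> m -> y."""
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]
    return a * b

# Synthetic data; all effect sizes are assumptions for illustration only.
rng = np.random.default_rng(3)
n = 500
borderline = rng.normal(size=n)
distress = 0.5 * borderline + rng.normal(size=n)          # "mental health problems"
internet = 0.3 * borderline + 0.4 * distress + rng.normal(size=n)

point = indirect_effect(borderline, distress, internet)
boot = [indirect_effect(*(v[idx] for v in (borderline, distress, internet)))
        for idx in (rng.integers(0, n, n) for _ in range(2000))]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(point, (lo, hi))   # a CI excluding zero is consistent with mediation
```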
Are smokers rational addicts? Empirical evidence from the Indonesian Family Life Survey.
Hidayat, Budi; Thabrany, Hasbullah
2011-02-23
Indonesia is one of the world's largest consumers of tobacco; however, there has been little work done on the addiction economics of tobacco. This study provides an empirical test of the rational addiction (henceforth RA) hypothesis of cigarette demand in Indonesia. Four estimators (OLS, 2SLS, GMM, and System-GMM) were explored to test the RA hypothesis. The author adopted several diagnostic tests to select the best estimator to overcome the econometric problems posed by past and future cigarette consumption (suspected endogenous variables). Short-run and long-run price elasticities of cigarette demand were then calculated. The model was applied to pooled individual data derived from a three-wave panel of the Indonesian Family Life Survey spanning the period 1993-2000. The past cigarette consumption coefficients turned out to be positive with a p-value < 1%, implying that cigarettes are indeed an addictive good. The rational addiction hypothesis was rejected in favour of the myopic one. The short-run cigarette price elasticity for males and females was estimated to be -0.38 and -0.57, respectively, and the long-run one was -0.4 and -3.85, respectively. Health policymakers should redesign the current public health campaign against cigarette smoking in the country. Given that the demand for cigarettes is more price sensitive in the long run (and for females) than in the short run (and for males), an increase in the price of cigarettes could lead to a significant fall in cigarette consumption in the long run rather than serving as a constant source of government revenue.
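The rational addiction specification regresses current consumption on both past and future consumption, which are endogenous and are typically instrumented with lagged and lead prices. The hand-rolled two-stage least squares sketch below, on synthetic data with assumed coefficients, shows the mechanics; it is not a re-estimation of the IFLS models (OLS/2SLS/GMM/System-GMM) reported in the paper.

```python
import numpy as np

def tsls(y, X_endog, X_exog, Z):
    """Two-stage least squares by hand: regress the endogenous regressors on the
    instruments plus exogenous regressors, then use the fitted values in the
    second-stage regression."""
    W = np.column_stack([X_exog, Z])                      # first-stage design
    fitted = W @ np.linalg.lstsq(W, X_endog, rcond=None)[0]
    X2 = np.column_stack([X_exog, fitted])                # second-stage design
    return np.linalg.lstsq(X2, y, rcond=None)[0]

# Synthetic illustration of C_t ~ const + P_t + C_{t-1} + C_{t+1},
# instrumenting past/future consumption with lagged and lead prices.
rng = np.random.default_rng(4)
T = 5000
price = rng.normal(size=T + 2)
c = np.zeros(T + 2)
for t in range(1, T + 2):
    c[t] = 0.4 * c[t - 1] - 0.3 * price[t] + rng.normal()

y = c[1:T + 1]
X_endog = np.column_stack([c[0:T], c[2:T + 2]])           # C_{t-1}, C_{t+1}
X_exog = np.column_stack([np.ones(T), price[1:T + 1]])    # const, P_t
Z = np.column_stack([price[0:T], price[2:T + 2]])         # instruments P_{t-1}, P_{t+1}
print(tsls(y, X_endog, X_exog, Z))                        # [const, P_t, C_{t-1}, C_{t+1}]
```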
Tampubolon, Gindo
2015-01-01
The ageing population poses a tremendous challenge in understanding the sources of inequalities in health. Though they appear to be far removed, childhood conditions are known to be inextricably linked with adult health, and in turn on health in later life. The long arm of childhood conditions hypothesis is often tested using recollection of childhood circumstances, but such subjective recall can yield potentially inaccurate or possibly biased inferences. We tested the long arm hypothesis on three outcomes in later life, arrayed from objective to subjective health, namely: gait speed, episodic memory and mental health. We used the English Longitudinal Study of Ageing 2006 enriched with retrospective life history (N = 5,913). To deal with recall problems two solutions, covariate measurement and endogenous treatment models, were applied. Retrospective childhood material lack includes growing up without running hot or cold water, fixed bath, indoor lavatory and central heating. Adjustment is made for an extensive set of confounders including sex, age, adult health, wealth, education, occupation, social support, social connections, chronic conditions, smoking, drinking, and physical exercise. It is found that material poverty when growing up shows no association with health when growing old, assuming accurate recall. Once recall problems are controlled, we found that childhood material poverty changes inversely with later life health. A poorer childhood goes with slower gait, poorer memory and more depression in later life. This result provides a further impetus to eliminate child poverty.
De Meeûs, Thierry
2014-03-01
In population genetics data analysis, researchers are often faced with the problem of making decisions from a series of tests of the same null hypothesis. This is the case when one wants to test differentiation between pathogens found on different host species sampled from different locations (as many tests as there are locations). Many procedures are available to date, but not all apply to all situations. Determining which tests are significant, or whether the whole series is significant, and handling independent versus non-independent tests do not require the same procedures. In this note I describe several procedures, among the simplest and easiest to undertake, that should allow decision making in most (if not all) situations population geneticists (or biologists) are likely to meet, in particular in host-parasite systems. Copyright © 2014 Elsevier B.V. All rights reserved.
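For the case of independent tests, the two questions distinguished here (which individual tests are significant, and whether the series as a whole is significant) map onto familiar procedures, sketched below in Python with illustrative p-values; the correction methods shown are common choices, not necessarily the ones recommended in the note.

```python
import numpy as np
from scipy.stats import combine_pvalues
from statsmodels.stats.multitest import multipletests

# Illustrative p-values from a series of differentiation tests, one per location.
pvals = np.array([0.001, 0.012, 0.030, 0.045, 0.20, 0.34, 0.52, 0.81])

# Which individual tests remain significant after correction?
for method in ("bonferroni", "holm", "fdr_bh"):
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method=method)
    print(method, reject)

# Is the whole series significant? Fisher's combination (independent tests only).
print(combine_pvalues(pvals, method="fisher"))
```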
Ream, Geoffrey L.; Elliott, Luther C.; Dunlap, Eloise
2011-01-01
This study tested the hypothesis that playing video games while using or feeling the effects of a substance—referred to herein as “concurrent use”—is related to substance use problems after controlling for substance use frequency, video gaming as an enthusiastic hobby, and demographic factors. Data were drawn from a nationally representative online survey of adult video gamers conducted by Knowledge Networks, valid n = 2,885. Problem video game playing behavior was operationalized using Tejeiro Salguero and Bersabé Morán’s 2002 problem video game play (PVP) measure, and measures for substance use problems were taken from the National Survey of Drug Use and Health (NSDUH). Separate structural equation modeling analyses were conducted for users of caffeine, tobacco, alcohol, and marijuana. In all four models, concurrent use was directly associated with substance use problems, but not with PVP. Video gaming as an enthusiastic hobby was associated with substance use problems via two indirect paths: through PVP for all substances, and through concurrent use for caffeine, tobacco, and alcohol only. Results illustrate the potential for “drug interaction” between self-reinforcing behaviors and addictive substances, with implications for the development of problem use. PMID:22073023
Associations between content types of early media exposure and subsequent attentional problems.
Zimmerman, Frederick J; Christakis, Dimitri A
2007-11-01
Television and video/DVD viewing among very young children has become both pervasive and heavy. Previous studies have reported an association between early media exposure and problems with attention regulation but did not have data on the content type that children watched. We tested the hypothesis that early television viewing of 3 content types is associated with subsequent attentional problems. The 3 different content types are educational, nonviolent entertainment, and violent entertainment. Participants were children in a nationally representative sample collected in 1997 and reassessed in 2002. The analysis was a logistic regression in which a high score on a validated parent-reported measure of attentional problems was regressed on early television exposure by content type and several important sociodemographic control variables. Viewing of educational television before age 3 was not associated with attentional problems 5 years later. However, viewing of either violent or non-violent entertainment television before age 3 was significantly associated with subsequent attentional problems, and the magnitude of the association was large. Viewing of any content type at ages 4 to 5 was not associated with subsequent problems. The association between early television viewing and subsequent attentional problems is specific to noneducational viewing and to viewing before age 3.
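The analysis described is a standard logistic regression with exposure-by-content predictors and sociodemographic controls. The sketch below fits such a model to synthetic data with hypothetical variable names (educational_tv, entertainment_tv, maternal_educ); it mirrors only the general form of the analysis, not the study's dataset or full set of controls.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; all names and coefficients are assumptions.
rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "educational_tv": rng.poisson(1.0, n),     # hours/day viewed before age 3
    "entertainment_tv": rng.poisson(1.0, n),
    "maternal_educ": rng.integers(10, 18, n),
})
logit_p = -2.0 + 0.3 * df["entertainment_tv"] - 0.05 * df["maternal_educ"]
df["attention_problem"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "attention_problem ~ educational_tv + entertainment_tv + maternal_educ",
    data=df,
).fit(disp=0)
print(np.exp(model.params))   # odds ratios per additional hour of viewing
```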
Making Knowledge Delivery Failsafe: Adding Step Zero in Hypothesis Testing
ERIC Educational Resources Information Center
Pan, Xia; Zhou, Qiang
2010-01-01
Knowledge of statistical analysis is increasingly important for professionals in modern business. For example, hypothesis testing is one of the critical topics for quality managers and team workers in Six Sigma training programs. Delivering the knowledge of hypothesis testing effectively can be an important step for the incapable learners or…
Personality change at the intersection of autonomic arousal and stress.
Hart, Daniel; Eisenberg, Nancy; Valiente, Carlos
2007-06-01
We hypothesized that personality change in children can be predicted by the interaction of family risk with susceptibility to autonomic arousal and that children characterized by both high-risk families and highly reactive autonomic nervous systems tend to show maladaptive change. This hypothesis was tested in a 6-year longitudinal study in which personality-type prototypicality, problem behavior, and negative emotional intensity were measured at 2-year intervals. The results indicated that children who both had exaggerated skin conductance responses (a measure of autonomic reactivity) and were living in families with multiple risk factors were most likely to develop an undercontrolled personality type and to exhibit increases in problem behavior and negative emotional intensity. The implications of the results for understanding personality change are discussed.
Dynamic sensor management of dispersed and disparate sensors for tracking resident space objects
NASA Astrophysics Data System (ADS)
El-Fallah, A.; Zatezalo, A.; Mahler, R.; Mehra, R. K.; Donatelli, D.
2008-04-01
Dynamic sensor management of dispersed and disparate sensors for space situational awareness presents daunting scientific and practical challenges as it requires optimal and accurate maintenance of all Resident Space Objects (RSOs) of interest. We demonstrate an approach to the space-based sensor management problem by extending a previously developed and tested sensor management objective function, the Posterior Expected Number of Targets (PENT), to disparate and dispersed sensors. This PENT extension together with observation models for various sensor platforms, and a Probability Hypothesis Density Particle Filter (PHD-PF) tracker provide a powerful tool for tackling this challenging problem. We demonstrate the approach using simulations for tracking RSOs by a Space Based Visible (SBV) sensor and ground based radars.
A new role for Women Health Volunteers in urban Islamic Republic of Iran.
Behdjat, H; Rifkin, S B; Tarin, E; Sheikh, M R
2009-01-01
An action research project was carried out by a team from the National Public Health Management Centre in Tabriz, Iran to test the following hypothesis: Health Volunteers are more able to support health improvements by focusing on community participation and empowerment through facilitating communities to define and solve their own problems than by only providing information on health problems. Training on participatory approaches was given to Women Health Volunteers (WHV) in a pilot area. The results gave evidence that local people could identify and act upon their own health needs and request more information from professionals to improve their own health. Further research is needed however to assess how the pilot can be scaled up and how initial enthusiasm can be sustained.
The inverse problem of brain energetics: ketone bodies as alternative substrates
NASA Astrophysics Data System (ADS)
Calvetti, D.; Occhipinti, R.; Somersalo, E.
2008-07-01
Little is known about brain energy metabolism under ketosis, although there is evidence that ketone bodies have a neuroprotective role in several neurological disorders. We investigate the inverse problem of estimating reaction fluxes and transport rates in the different cellular compartments of the brain, when the data amounts to a few measured arterial venous concentration differences. By using a recently developed methodology to perform Bayesian Flux Balance Analysis and a new five compartment model of the astrocyte-glutamatergic neuron cellular complex, we are able to identify the preferred biochemical pathways during shortage of glucose and in the presence of ketone bodies in the arterial blood. The analysis is performed in a minimally biased way, therefore revealing the potential of this methodology for hypothesis testing.
The Relation between Cosmological Redshift and Scale Factor for Photons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Shuxun, E-mail: tshuxun@mail.bnu.edu.cn; Department of Physics, Wuhan University, Wuhan 430072
The cosmological constant problem has become one of the most important ones in modern cosmology. In this paper, we try to construct a model that can avoid the cosmological constant problem and has the potential to explain the apparent late-time accelerating expansion of the universe in both the luminosity distance and angular diameter distance measurement channels. The core of our model is to modify the relation between cosmological redshift and scale factor for photons. We point out three ways to test our hypothesis: supernova time dilation; gravitational waves and their electromagnetic counterparts emitted by binary neutron star systems; and the Sandage–Loeb effect. All of these methods are feasible now or in the near future.
Speech Recognition in Noise by Children with and without Dyslexia: How is it Related to Reading?
Nittrouer, Susan; Krieg, Letitia M; Lowenstein, Joanna H
2018-06-01
Developmental dyslexia is commonly viewed as a phonological deficit that makes it difficult to decode written language. But children with dyslexia typically exhibit other problems, as well, including poor speech recognition in noise. The purpose of this study was to examine whether the speech-in-noise problems of children with dyslexia are related to their reading problems, and if so, if a common underlying factor might explain both. The specific hypothesis examined was that a spectral processing disorder results in these children receiving smeared signals, which could explain both the diminished sensitivity to phonological structure - leading to reading problems - and the speech recognition in noise difficulties. The alternative hypothesis tested in this study was that children with dyslexia simply have broadly based language deficits. Ninety-seven children between the ages of 7 years; 10 months and 12 years; 9 months participated: 46 with dyslexia and 51 without dyslexia. Children were tested on two dependent measures: word reading and recognition in noise with two types of sentence materials: as unprocessed (UP) signals, and as spectrally smeared (SM) signals. Data were collected for four predictor variables: phonological awareness, vocabulary, grammatical knowledge, and digit span. Children with dyslexia showed deficits on both dependent and all predictor variables. Their scores for speech recognition in noise were poorer than those of children without dyslexia for both the UP and SM signals, but by equivalent amounts across signal conditions indicating that they were not disproportionately hindered by spectral distortion. Correlation analyses on scores from children with dyslexia showed that reading ability and speech-in-noise recognition were only mildly correlated, and each skill was related to different underlying abilities. No substantial evidence was found to support the suggestion that the reading and speech recognition in noise problems of children with dyslexia arise from a single factor that could be defined as a spectral processing disorder. The reading and speech recognition in noise deficits of these children appeared to be largely independent. Copyright © 2018 Elsevier Ltd. All rights reserved.
The bright side of being blue: Depression as an adaptation for analyzing complex problems
Andrews, Paul W.; Thomson, J. Anderson
2009-01-01
Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression. PMID:19618990
Testing of Hypothesis in Equivalence and Non Inferiority Trials-A Concept.
Juneja, Atul; Aggarwal, Abha R; Adhikari, Tulsi; Pandey, Arvind
2016-04-01
Establishing the appropriate hypothesis is one of the important steps in carrying out statistical tests and analyses, and understanding it is important for interpreting the results. The current communication presents the concept of hypothesis testing in non-inferiority and equivalence trials, where the null hypothesis is the reverse of the one set up for conventional superiority trials. As in superiority trials, the null hypothesis is the one the researcher seeks to reject in order to establish the claim of interest. It is important to note that equivalence or non-inferiority cannot be proved by accepting the null hypothesis of no difference. Hence, establishing the appropriate statistical hypothesis is extremely important for arriving at meaningful conclusions for the stated research objectives.
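For a mean difference between a test treatment (T) and a control (C) with prespecified margin δ > 0, one common way to write the reversed hypotheses is shown below; this is standard textbook notation rather than a formulation taken from the communication itself.

```latex
\begin{aligned}
\text{Superiority:}     &\quad H_0:\ \mu_T - \mu_C \le 0        &&\text{vs.}\quad H_1:\ \mu_T - \mu_C > 0,\\
\text{Non-inferiority:} &\quad H_0:\ \mu_T - \mu_C \le -\delta  &&\text{vs.}\quad H_1:\ \mu_T - \mu_C > -\delta,\\
\text{Equivalence:}     &\quad H_0:\ |\mu_T - \mu_C| \ge \delta &&\text{vs.}\quad H_1:\ |\mu_T - \mu_C| < \delta.
\end{aligned}
```

Rejecting the non-inferiority or equivalence null is what establishes the claim; failing to reject a conventional null of no difference does not.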
NASA Astrophysics Data System (ADS)
Audenaert, Koenraad M. R.; Mosonyi, Milán
2014-10-01
We consider the multiple hypothesis testing problem for symmetric quantum state discrimination between r given states σ1, …, σr. By splitting up the overall test into multiple binary tests in various ways we obtain a number of upper bounds on the optimal error probability in terms of the binary error probabilities. These upper bounds allow us to deduce various bounds on the asymptotic error rate, for which it has been hypothesized that it is given by the multi-hypothesis quantum Chernoff bound (or Chernoff divergence) C(σ1, …, σr), as recently introduced by Nussbaum and Szkoła in analogy with Salikhov's classical multi-hypothesis Chernoff bound. This quantity is defined as the minimum of the pairwise binary Chernoff divergences, min_{j<k} C(σj, σk).
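A direct numerical sketch of these quantities for small matrices is given below in Python: it computes the binary quantum Chernoff divergence C(ρ, σ) = -log min_{0≤s≤1} Tr(ρ^s σ^(1-s)) by eigendecomposition and takes the minimum over all pairs, as in the Nussbaum-Szkoła definition; the example states are arbitrary full-rank qubit density matrices chosen purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mat_power(rho, s):
    """Fractional power of a Hermitian positive semidefinite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)
    return (v * w**s) @ v.conj().T

def chernoff_divergence(rho, sigma):
    """Binary quantum Chernoff divergence C(rho, sigma) = -log min_s Tr(rho^s sigma^(1-s))."""
    q = lambda s: float(np.real(np.trace(mat_power(rho, s) @ mat_power(sigma, 1.0 - s))))
    res = minimize_scalar(q, bounds=(0.0, 1.0), method="bounded")
    return -np.log(res.fun)

def multi_chernoff(states):
    """Minimum pairwise binary Chernoff divergence over all pairs of states."""
    return min(chernoff_divergence(states[j], states[k])
               for j in range(len(states)) for k in range(j + 1, len(states)))

# Arbitrary full-rank qubit states, for illustration only.
theta = 0.3
psi0 = np.array([1.0, 0.0])
psi1 = np.array([np.cos(theta), np.sin(theta)])
rho1 = 0.95 * np.outer(psi0, psi0) + 0.025 * np.eye(2)
rho2 = 0.95 * np.outer(psi1, psi1) + 0.025 * np.eye(2)
rho3 = np.eye(2) / 2
print(multi_chernoff([rho1, rho2, rho3]))
```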
Matrix analysis and risk management to avert depression and suicide among workers
2010-01-01
Suicide is among the most tragic outcomes of all mental disorders, and the prevalence of suicide has risen dramatically during the last decade, particularly among workers. This paper reviews and proposes strategies to avert suicide and depression with regard to the mind body medicine equation hypothesis, metrics analysis of mental health problems from a public health and clinical medicine view. In occupational fields, the mind body medicine hypothesis has to deal with working environment, working condition, and workers' health. These three factors chosen in this paper were based on the concept of risk control, called San-kanri, which has traditionally been used in Japanese companies, and the causation concepts of host, agent, and environment. Working environment and working condition were given special focus with regard to tackling suicide problems. Matrix analysis was conducted by dividing the problem of working conditions into nine cells: three prevention levels (primary, secondary, and tertiary) were proposed for each of the three factors of the mind body medicine hypothesis (working environment, working condition, and workers' health). After using these main strategies (mind body medicine analysis and matrix analysis) to tackle suicide problems, the paper talks about the versatility of case-method teaching, "Hiyari-Hat activity," routine inspections by professionals, risk assessment analysis, and mandatory health check-up focusing on sleep and depression. In the risk assessment analysis, an exact assessment model was suggested using a formula based on multiplication of the following three factors: (1) severity, (2) frequency, and (3) possibility. Mental health problems, including suicide, are rather tricky to deal with because they involve evaluation of individual cases. The mind body medicine hypothesis and matrix analysis would be appropriate tactics for suicide prevention because they would help the evaluation of this issue as a tangible problem. PMID:21054837
Distributed resource allocation under communication constraints
NASA Astrophysics Data System (ADS)
Dodin, Pierre; Nimier, Vincent
2001-03-01
This paper deals with the multi-sensor management problem for multi-target tracking. Collaboration between many sensors observing the same target means that they are able to fuse their data during the information process, and this possibility must be taken into account when computing the optimal sensor-target association at each time step. To solve this problem for a real large-scale system, one must consider both the information aspect and the control aspect of the problem. One way to unify these problems is to use a decentralized filtering algorithm locally driven by an assignment algorithm. The decentralized filtering algorithm we use in our model is Grime's algorithm, which relaxes the usual fully connected hypothesis. By fully connected, one means that information in the system is distributed everywhere at the same moment, which is unacceptable for a real large-scale system. We model the distributed assignment decision with a greedy algorithm, in which each sensor performs a global optimization in order to estimate the other sensors' information sets. A consequence of relaxing the fully connected hypothesis is that the sensors' information sets are not the same at each time step, producing an information asymmetry in the system. The assignment algorithm uses local knowledge of this asymmetry. By testing the reactions and the coherence of our system's local assignment decisions against maneuvering targets, we show that decentralized assignment control remains feasible even though the system is not fully connected.
Skalická, Věra; Belsky, Jay; Stenseng, Frode; Wichstrøm, Lars
2015-01-01
The hypothesis was tested that the new open-group Norwegian day-care centers would more than traditionally organized centers negatively affect (a) current and (b) future teacher-child relationships, and (c) the developmental legacy of preschool problem behavior. The focus was on eight hundred and fifty 4-year-olds from 153 centers who were followed up in first grade. Results of this natural quasi-experiment revealed that children from open-group centers (a) experienced less teacher-child closeness in preschool and (b) more teacher-child conflict in first grade, and (c) that high levels of preschool problem behavior forecast especially high levels of future teacher-child conflict, but only for children from open-group centers. Results highlight the importance of spatial and social organization of day care and their translational implications. © 2015 The Authors. Child Development © 2015 Society for Research in Child Development, Inc.
Siffert, Andrea; Schwarz, Beate
2011-01-01
Guided by the emotional security hypothesis and the cognitive-contextual framework, the authors investigated whether the associations between negative parental conflict resolution styles and children's internalizing and externalizing problems were mediated by children's appraisals of threat and self-blame and their emotion regulation. Participants were 192 Swiss 2-parent families with children aged 9-12 years (M age = 10.62 years, SD = 0.41 years). Structural equation modeling was used to test the empirical validity of the theoretical model. Results indicated that children's maladaptive emotion regulation mediated the association between negative parental conflict resolution styles and children's internalizing as well as externalizing problems. Whereas perceived threat was related only to children's internalizing problems, self-blame did not mediate the links between negative parental conflict resolution styles and children's adjustment. Implications for understanding the mechanisms by which exposure to interparental conflict could lead to children's maladjustment and limitations of the study are discussed.
Considerations for multiple hypothesis correlation on tactical platforms
NASA Astrophysics Data System (ADS)
Thomas, Alan M.; Turpen, James E.
2013-05-01
Tactical platforms benefit greatly from the fusion of tracks from multiple sources in terms of increased situation awareness. As a necessary precursor to this track fusion, track-to-track association, or correlation, must first be performed. The related measurement-to-track fusion problem has been well studied, with multiple hypothesis tracking and multiple frame assignment methods showing the most success. The track-to-track problem differs from this one in that the measurements themselves are not available, but rather track state update reports from the measuring sensors. Multiple hypothesis, multiple frame correlation systems have previously been considered; however, their practical implementation under the constraints imposed by tactical platforms is daunting. The situation is further exacerbated by the inconvenient nature of reports from legacy sensor systems on bandwidth-limited communications networks. In this paper, consideration is given to the special difficulties encountered when attempting the correlation of tracks from legacy sensors on tactical aircraft. Those difficulties include the following: covariance information from reporting sensors is frequently absent or incomplete; system latencies can create temporal uncertainty in data; and computational processing is severely limited by hardware and architecture. Moreover, consideration is given to practical solutions for dealing with these problems in a multiple hypothesis correlator.
Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G
2012-10-10
Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.
The Association between Preschool Children's Social Functioning and Their Emergent Academic Skills.
Arnold, David H; Kupersmidt, Janis B; Voegler-Lee, Mary Ellen; Marshall, Nastassja
2012-01-01
This study examined the relationship between social functioning and emergent academic development in a sample of 467 preschool children (M = 55.9 months old, SD = 3.8). Teachers reported on children's aggression, attention problems, and prosocial skills. Preliteracy, language, and early mathematics skills were assessed with standardized tests. Better social functioning was associated with stronger academic development. Attention problems were related to poorer academic development controlling for aggression and social skills, pointing to the importance of attention in these relations. Children's social skills were related to academic development controlling for attention and aggression problems, consistent with models suggesting that children's social strengths and difficulties are independently related to their academic development. Support was not found for the hypothesis that these relationships would be stronger in boys than in girls. Some relationships were stronger in African American than Caucasian children. Children's self-reported feelings about school moderated several relationships, consistent with the idea that positive feelings about school may be a protective factor against co-occurring academic and social problems.
D'Lima, Gabrielle Maria; Pearson, Matthew R; Kelley, Michelle L
2012-06-01
This study examined protective behavioral strategies (PBS) as a potential mediator and moderator of the relationship between self-regulation and alcohol-related consequences. Participants were 249 first-year undergraduate men and women. The use of PBS partially mediated the relationship between self-regulation and alcohol-related problems (i.e., supporting the "self-control equals drinking control" hypothesis). However, use of PBS appeared more important for those with poorer self-regulation abilities (supporting the "PBS protect the impaired" hypothesis). Because both mediation and moderation were supported, a moderated mediation model was tested. The moderated mediation model demonstrated that the negative relationship between self-regulation and alcohol-related consequences could be explained by use of PBS for individuals with poor-to-average self-regulation but not for individuals with above-average self-regulation abilities. Implications of the study's findings are discussed.
An Exercise for Illustrating the Logic of Hypothesis Testing
ERIC Educational Resources Information Center
Lawton, Leigh
2009-01-01
Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data
2014-12-01
Report fragment: "…consistent with the null hypothesis of linearity and can be used to estimate the distribution of a test statistic that can discriminate between the null…" [Figure: test for nonlinearity. The histogram is generated using the surrogate data; the statistic of the original time series is represented by the solid line.]
Narasingharao, Kumar; Pradhan, Balaram; Navaneetham, Janardhana
2017-03-01
Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that appears in early childhood, between 18 and 36 months of age. Apart from behaviour problems, ASD children also suffer from sleep and gastrointestinal (GI) problems. Major behaviour problems of ASD children include lack of social communication and interaction, short attention span, repetitive and restrictive behaviour, lack of eye-to-eye contact, aggressive and self-injurious behaviours, sensory integration problems, motor problems, deficiencies in academic activities, and anxiety and depression. Our hypothesis is that a structured yoga intervention will bring significant changes in the problems of ASD children. The aim of this study was to determine the efficacy of a structured yoga intervention for the sleep problems, gastrointestinal problems and behaviour problems of ASD children. It was an exploratory study with a pre-test and post-test control design. Three sets of questionnaires comprising 61 questions, developed by the researchers, were used to collect data before and after the yoga intervention. The questionnaires were based on the three problem areas of ASD children mentioned above and were administered to parents by teachers under the supervision of the researcher and clinical psychologists. The experimental group was given the yoga intervention for a period of 90 days, while the control group continued with the school curriculum. Both children and parents participated in this intervention. Significant changes were seen post yoga intervention in the three problem areas mentioned above. Statistical analysis also showed a significance value of 0.001. A structured yoga intervention can be conducted for a large group of ASD children with parents' involvement. Yoga can be used as an alternative therapy to reduce the severity of symptoms of ASD children.
Pradhan, Balaram; Navaneetham, Janardhana
2017-01-01
Introduction Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that appears in early childhood, between 18 and 36 months of age. Apart from behaviour problems, ASD children also suffer from sleep and gastrointestinal (GI) problems. Major behaviour problems of ASD children include lack of social communication and interaction, short attention span, repetitive and restrictive behaviour, lack of eye-to-eye contact, aggressive and self-injurious behaviours, sensory integration problems, motor problems, deficiencies in academic activities, and anxiety and depression. Our hypothesis is that a structured yoga intervention will bring significant changes in the problems of ASD children. Aim The aim of this study was to determine the efficacy of a structured yoga intervention for the sleep problems, gastrointestinal problems and behaviour problems of ASD children. Materials and Methods It was an exploratory study with a pre-test and post-test control design. Three sets of questionnaires comprising 61 questions, developed by the researchers, were used to collect data before and after the yoga intervention. The questionnaires were based on the three problem areas of ASD children mentioned above and were administered to parents by teachers under the supervision of the researcher and clinical psychologists. The experimental group was given the yoga intervention for a period of 90 days, while the control group continued with the school curriculum. Results Both children and parents participated in this intervention. Significant changes were seen post yoga intervention in the three problem areas mentioned above. Statistical analysis also showed a significance value of 0.001. Conclusion A structured yoga intervention can be conducted for a large group of ASD children with parents' involvement. Yoga can be used as an alternative therapy to reduce the severity of symptoms of ASD children. PMID:28511484
Upper Ottawa street landfill site health study.
Hertzman, C; Hayes, M; Singer, J; Highland, J
1987-01-01
This report describes the design and conduct of two sequential historical prospective morbidity surveys of workers and residents from the Upper Ottawa Street Landfill Site in Hamilton, Ontario. The workers study was carried out first and was a hypothesis-generating study. Workers and controls were administered a health questionnaire, which was followed by an assessment of recall bias through medical chart abstraction. Multiple criteria were used to identify health problems associated with landfill site exposure. Those problems with highest credibility included clusters of respiratory, skin, narcotic, and mood disorders. These formed the hypothesis base in the subsequent health study of residents living adjacent to the landfill site. In that study, the association of mood, narcotic, skin, and respiratory conditions with landfill site exposure was confirmed using the following criteria: strength of association; consistency with the workers study; risk gradient by duration of residence and proximity to the landfill; absence of evidence that less healthy people moved to the area; specificity; and the absence of recall bias. The validity of these associations was reduced by three principal problems: the high refusal rate among the control population; socioeconomic status differences between the study groups; and the fact that the conditions found in excess were imprecisely defined and potentially interchangeable with other conditions. Offsetting these problems were the multiple criteria used to assess each hypothesis, which were applied according to present rules. Evidence is presented that supports the hypothesis that vapors, fumes, or particulate matter emanating from the landfill site, as well as direct skin exposure, may have led to the health problems found in excess. Evidence is also presented supporting the hypothesis that perception of exposure and, therefore, of risk, may explain the results of the study. However, based on the analyses performed, it is the conclusion of the authors that the adverse effects seen were more likely the result of chemical exposure than of perception of risk. PMID:3691438
The role of responsibility and fear of guilt in hypothesis-testing.
Mancini, Francesco; Gangemi, Amelia
2006-12-01
Recent theories argue that both perceived responsibility and fear of guilt increase obsessive-like behaviours. We propose that hypothesis-testing might account for this effect. Both perceived responsibility and fear of guilt would influence subjects' hypothesis-testing by inducing a prudential style. This style implies focusing on and confirming the worst hypothesis, and reiterating the testing process. In our experiment, we manipulated the responsibility and fear of guilt of 236 normal volunteers who executed a deductive task. The results show that perceived responsibility was the main factor that influenced individuals' hypothesis-testing. Fear of guilt, however, had a significant additive effect. Guilt-fearing participants preferred to carry on with the diagnostic process even when faced with initial favourable evidence, whereas participants in the responsibility condition only did so when confronted with unfavourable evidence. Implications for the understanding of obsessive-compulsive disorder (OCD) are discussed.
Dreaming is not controlled by hippocampal mechanisms.
Solms, Mark
2013-12-01
Links with the Humanities are to be welcomed, but they cannot be exempted from normal scientific criteria. Any hypothesis regarding the function of dreams that is premised on rapid eye movement (REM)/dream isomorphism is unsupportable on empirical grounds. Llewellyn's hypothesis has the further problem of counter-evidence in respect of its claim that dreaming relies upon hippocampal functions. The hypothesis also lacks face validity.
Organizational justice and sleeping problems: The Whitehall II study.
Elovainio, Marko; Ferrie, Jane E; Gimeno, David; De Vogli, Roberto; Shipley, Martin; Brunner, Eric J; Kumari, Meena; Vahtera, Jussi; Marmot, Michael G; Kivimäki, Mika
2009-04-01
To test the hypothesis that organizational injustice contributes to sleeping problems. Poor sleep quality can be a marker of prolonged emotional stress and has been shown to have serious effects on the immune system and metabolism. Data were from the prospective Whitehall II study of white-collar British civil servants (3143 women and 6895 men, aged 35-55 years at baseline). Age, employment grade, health behaviors, and depressive symptoms were measured at Phase 1 (1985-1988) and baseline sleeping problems were assessed at Phase 2 (1989-1990). Organizational justice was assessed twice, at Phases 1 and 2. The outcome was mean of sleeping problems during Phases 5 (1997-1999) and 7 (2003-2004). In men, low organizational justice at Phase 1 and Phase 2 were associated with overall sleeping problems, sleep maintenance problems, sleep onset problems, and nonrefreshing sleep at Phases 5 and 7. In women, a significant association was observed between low organizational justice and overall sleeping problems and sleep onset problems. These associations were robust to adjustments for age, employment grade, health behaviors, job strain, depressive symptoms, and sleeping problems at baseline. This study shows that perceived unfair treatment at workplace is associated with increased risk of poor sleep quality in men and women, one potential mechanism through which justice at work may affect health.
A Classic Test of the Hubbert-Rubey Weakening Mechanism: M7.6 Thrust-Belt Earthquake Taiwan
NASA Astrophysics Data System (ADS)
Yue, L.; Suppe, J.
2005-12-01
The Hubbert-Rubey (1959) fluid-pressure hypothesis has long been accepted as a classic solution to the problem of the apparent weakness of long thin thrust sheets. This hypothesis, in its classic form, argues that ambient high pore-fluid pressures, which are common in sedimentary basins, reduce the normalized shear traction on the fault, τ_b/(ρgH) = μ_b(1 − λ_b), where λ_b = P_f/(ρgH) is the normalized pore-fluid pressure and μ_b is the coefficient of friction. Remarkably, there have been few large-scale tests of this classic hypothesis. Here we document ambient pore-fluid pressures surrounding the active frontal thrusts of western Taiwan, including the Chulungpu thrust that slipped in the 1999 Mw7.6 Chi-Chi earthquake. We show from 3-D mapping of these thrusts that they flatten to a shallow detachment at about 5 km depth in the Pliocene Chinshui Shale. Using critical-taper wedge theory and the dip of the detachment and surface slope, we constrain the basal shear traction τ_b/(ρgH) ≈ 0.1, which is substantially weaker than common lab friction values of Byerlee's law (μ_b = 0.6-0.85). We have determined the pore-fluid pressures as a function of depth in 76 wells, based on in-situ formation tests, sonic logs and mud densities. Fluid pressures are regionally controlled stratigraphically by sedimentary facies. The top of overpressures is everywhere below the base of the Chinshui Shale; therefore the entire Chinshui thrust system is at ambient hydrostatic pore-fluid pressures (λ_b ≈ 0.4). According to the classic Hubbert-Rubey hypothesis the required basal coefficient of friction is therefore μ_b ≈ 0.1-0.2. Therefore the classic Hubbert & Rubey mechanism involving static ambient excess fluid pressures is not the cause of extreme fault weakening in this western Taiwan example. We must look to other mechanisms of large-scale fault weakening, many of which are difficult to test.
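As a quick arithmetic check of the values quoted above (an illustrative calculation, not part of the original abstract), rearranging the Hubbert-Rubey relation gives μ_b = (τ_b/ρgH)/(1 − λ_b):

```python
# Back-of-the-envelope check of the quoted values (illustrative only).
tau_over_rho_g_H = 0.1   # normalized basal shear traction from critical-taper analysis
lam_b = 0.4              # hydrostatic normalized pore-fluid pressure
mu_b = tau_over_rho_g_H / (1.0 - lam_b)
print(f"implied basal friction coefficient: {mu_b:.2f}")  # ~0.17, within the quoted 0.1-0.2
```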
Chiba, Yasutaka
2017-09-01
Fisher's exact test is commonly used to compare two groups when the outcome is binary in randomized trials. In the context of causal inference, this test explores the sharp causal null hypothesis (i.e. the causal effect of treatment is the same for all subjects), but not the weak causal null hypothesis (i.e. the causal risks are the same in the two groups). Therefore, in general, rejection of the null hypothesis by Fisher's exact test does not mean that the causal risk difference is not zero. Recently, Chiba (Journal of Biometrics and Biostatistics 2015; 6: 244) developed a new exact test for the weak causal null hypothesis when the outcome is binary in randomized trials; the new test is not based on any large sample theory and does not require any assumption. In this paper, we extend the new test; we create a version of the test applicable to a stratified analysis. The stratified exact test that we propose is general in nature and can be used in several approaches toward the estimation of treatment effects after adjusting for stratification factors. The stratified Fisher's exact test of Jung (Biometrical Journal 2014; 56: 129-140) tests the sharp causal null hypothesis. This test applies a crude estimator of the treatment effect and can be regarded as a special case of our proposed exact test. Our proposed stratified exact test can be straightforwardly extended to analysis of noninferiority trials and to construct the associated confidence interval. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
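For context, here is a minimal example of the standard (unstratified) Fisher's exact test, which, as the abstract notes, addresses the sharp causal null rather than the weak causal null; the 2x2 counts are hypothetical, and this is not the stratified exact test proposed in the paper.

```python
# Standard (unstratified) Fisher's exact test on a 2x2 table, for context only.
from scipy.stats import fisher_exact

#                outcome=1  outcome=0
table = [[12,        18],      # treatment arm (hypothetical counts)
         [ 5,        25]]      # control arm
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(odds_ratio, p_value)
```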
Avoiding false discoveries in association studies.
Sabatti, Chiara
2007-01-01
We consider the problem of controlling false discoveries in association studies. We assume that the design of the study is adequate so that the "false discoveries" are potentially only because of random chance, not to confounding or other flaws. Under this premise, we review the statistical framework for hypothesis testing and correction for multiple comparisons. We consider in detail the currently accepted strategies in linkage analysis. We then examine the underlying similarities and differences between linkage and association studies and document some of the most recent methodological developments for association mapping.
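As a concrete illustration of the multiple-comparison corrections such a review covers, the sketch below applies Bonferroni and Benjamini-Hochberg adjustments to a hypothetical vector of association p-values; the values and the choice of these two procedures are assumptions made for the example.

```python
# Illustrative multiple-testing corrections for a vector of association p-values.
import numpy as np

pvals = np.array([0.0004, 0.0021, 0.012, 0.04, 0.26, 0.61])  # hypothetical
alpha = 0.05
m = len(pvals)

# Bonferroni: control the family-wise error rate.
bonferroni_reject = pvals < alpha / m

# Benjamini-Hochberg step-up: control the false discovery rate.
order = np.argsort(pvals)
thresholds = alpha * (np.arange(1, m + 1) / m)
passed = pvals[order] <= thresholds
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
bh_reject = np.zeros(m, dtype=bool)
bh_reject[order[:k]] = True   # reject the k smallest p-values

print(bonferroni_reject, bh_reject)
```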
A systems approach to the physiology of weightlessness
NASA Technical Reports Server (NTRS)
White, Ronald J.; Leonard, Joel I.; Rummel, John A.; Leach, Carolyn S.
1991-01-01
A general systems approach to conducting and analyzing research on the human adaptation to weightlessness is presented. The research is aimed at clarifying the role that each of the major components of the human system plays following the transition to and from space. The approach utilizes a variety of mathematical models in order to pose and test alternative hypotheses concerned with the adaptation process. Certain aspects of the problem of fluid and electrolyte shifts in weightlessness are considered, and an integrated hypothesis based on numerical simulation studies and experimental data is presented.
Monotonicity of fitness landscapes and mutation rate control.
Belavkin, Roman V; Channon, Alastair; Aston, Elizabeth; Aston, John; Krašovec, Rok; Knight, Christopher G
2016-12-01
A common view in evolutionary biology is that mutation rates are minimised. However, studies in combinatorial optimisation and search have shown a clear advantage of using variable mutation rates as a control parameter to optimise the performance of evolutionary algorithms. Much biological theory in this area is based on the work of Ronald Fisher, who used Euclidean geometry to study the relation between mutation size and expected fitness of the offspring in infinite phenotypic spaces. Here we reconsider this theory based on the alternative geometry of discrete and finite spaces of DNA sequences. First, we consider the geometric case of fitness being isomorphic to distance from an optimum, and show how problems of optimal mutation rate control can be solved exactly or approximately depending on additional constraints of the problem. Then we consider the general case of fitness communicating only partial information about the distance. We define weak monotonicity of fitness landscapes and prove that this property holds in all landscapes that are continuous and open at the optimum. This theoretical result motivates our hypothesis that optimal mutation rate functions in such landscapes will increase when fitness decreases in some neighbourhood of an optimum, resembling the control functions derived in the geometric case. We test this hypothesis experimentally by analysing approximately optimal mutation rate control functions in 115 complete landscapes of binding scores between DNA sequences and transcription factors. Our findings support the hypothesis and find that the increase of mutation rate is more rapid in landscapes that are less monotonic (more rugged). We discuss the relevance of these findings to living organisms.
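The toy sketch below, which is not the authors' model, only illustrates the general idea of a fitness-dependent mutation rate in a simple (1+1) evolutionary algorithm on bit strings; the particular rate function, string length, and acceptance rule are assumptions made for the example.

```python
# Toy (1+1) EA on bit strings with a mutation rate that increases as fitness
# decreases (illustration only; all parameters are hypothetical).
import numpy as np

rng = np.random.default_rng(1)
L = 50
target = rng.integers(0, 2, L)           # hypothetical optimum
x = rng.integers(0, 2, L)                # random start

def fitness(v):
    return np.sum(v == target) / L       # fraction of matching positions

def mutation_rate(f):
    # Higher rate far from the optimum, approaching 1/L near it (an assumption).
    return max(1.0 / L, 0.5 * (1.0 - f))

for step in range(2000):
    f = fitness(x)
    if f == 1.0:
        break
    mu = mutation_rate(f)
    flips = rng.random(L) < mu
    y = np.where(flips, 1 - x, x)        # mutate each bit independently
    if fitness(y) >= f:                  # accept if not worse
        x = y

print(step, fitness(x))
```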
A statistical test to show negligible trend
Philip M. Dixon; Joseph H.K. Pechmann
2005-01-01
The usual statistical tests of trend are inappropriate for demonstrating the absence of trend. This is because failure to reject the null hypothesis of no trend does not prove that null hypothesis. The appropriate statistical method is based on an equivalence test. The null hypothesis is that the trend is not zero, i.e., outside an a priori specified equivalence region...
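A minimal sketch of how such an equivalence test for negligible trend might be set up as two one-sided tests on a regression slope against an a priori margin δ; the abstract does not give the authors' exact formulation, and the data, margin, and helper function below are hypothetical.

```python
# Sketch of an equivalence (TOST-style) test for "negligible trend":
# H0: |slope| >= delta  vs  H1: |slope| < delta, with delta set a priori.
import numpy as np
from scipy import stats

def negligible_trend_test(years, values, delta):
    n = len(years)
    slope, intercept, r, p, se = stats.linregress(years, values)
    df = n - 2
    t_lower = (slope + delta) / se         # against H0: slope <= -delta
    t_upper = (slope - delta) / se         # against H0: slope >= +delta
    p_lower = 1 - stats.t.cdf(t_lower, df)
    p_upper = stats.t.cdf(t_upper, df)
    return slope, max(p_lower, p_upper)    # small p-value => declare trend negligible

# Hypothetical counts over 15 years, with delta chosen purely for illustration.
rng = np.random.default_rng(2)
years = np.arange(15)
counts = 100 + rng.normal(0, 5, size=15)
print(negligible_trend_test(years, counts, delta=1.0))
```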
Deciphering the crowd: modeling and identification of pedestrian group motion.
Yücel, Zeynep; Zanlungo, Francesco; Ikeda, Tetsushi; Miyashita, Takahiro; Hagita, Norihiro
2013-01-14
Associating attributes to pedestrians in a crowd is relevant for various areas like surveillance, customer profiling and service providing. The attributes of interest greatly depend on the application domain and might involve such social relations as friends or family as well as the hierarchy of the group including the leader or subordinates. Nevertheless, the complex social setting inherently complicates this task. We attack this problem by exploiting the small group structures in the crowd. The relations among individuals and their peers within a social group are reliable indicators of social attributes. To that end, this paper identifies social groups based on explicit motion models integrated through a hypothesis testing scheme. We develop two models relating positional and directional relations. A pair of pedestrians is identified as belonging to the same group or not by utilizing the two models in parallel, which defines a compound hypothesis testing scheme. By testing the proposed approach on three datasets with different environmental properties and group characteristics, it is demonstrated that we achieve an identification accuracy of 87% to 99%. The contribution of this study lies in its definition of positional and directional relation models, its description of compound evaluations, and the resolution of ambiguities with our proposed uncertainty measure based on the local and global indicators of group relation.
Deciphering the Crowd: Modeling and Identification of Pedestrian Group Motion
Yücel, Zeynep; Zanlungo, Francesco; Ikeda, Tetsushi; Miyashita, Takahiro; Hagita, Norihiro
2013-01-01
Associating attributes to pedestrians in a crowd is relevant for various areas like surveillance, customer profiling and service providing. The attributes of interest greatly depend on the application domain and might involve such social relations as friends or family as well as the hierarchy of the group including the leader or subordinates. Nevertheless, the complex social setting inherently complicates this task. We attack this problem by exploiting the small group structures in the crowd. The relations among individuals and their peers within a social group are reliable indicators of social attributes. To that end, this paper identifies social groups based on explicit motion models integrated through a hypothesis testing scheme. We develop two models relating positional and directional relations. A pair of pedestrians is identified as belonging to the same group or not by utilizing the two models in parallel, which defines a compound hypothesis testing scheme. By testing the proposed approach on three datasets with different environmental properties and group characteristics, it is demonstrated that we achieve an identification accuracy of 87% to 99%. The contribution of this study lies in its definition of positional and directional relation models, its description of compound evaluations, and the resolution of ambiguities with our proposed uncertainty measure based on the local and global indicators of group relation. PMID:23344382
NASA Astrophysics Data System (ADS)
Zakiya, Hanifah; Sinaga, Parlindungan; Hamidah, Ida
2017-05-01
Field studies showed that students' science literacy was still low. One root of the problem is that the books used in learning are not oriented toward the components of science literacy. This study focused on the effectiveness of a textbook designed to foster science literacy through multi-modal representation. The textbook was developed using the Design Representational Approach Learning to Write (DRALW) method. The textbook design, applied to the topic of "Kinetic Theory of Gases," was implemented with grade XI high school students. Effectiveness was assessed using the effect size and the normalized percentage gain, while the hypothesis was tested using an independent t-test. The results showed that the textbook developed with multi-modal representation can improve students' science literacy skills; based on the effect size, it was found effective in improving those skills, with improvement occurring in all the competences and knowledge of scientific literacy. Hypothesis testing showed a significant difference in science literacy between the class that used the multi-modal representation textbook and the class that used the regular textbook used in schools.
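The abstract does not define its effectiveness measures, so the snippet below assumes the commonly used formulas, Hake's normalized gain and Cohen's d, purely for illustration; the pre/post percentages and group scores are hypothetical.

```python
# Assumed formulas for the measures named above (illustration only).
import numpy as np

def normalized_gain(pre_pct, post_pct):
    # Hake's normalized gain: fraction of the possible improvement achieved.
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def cohens_d(x_treat, x_ctrl):
    nx, ny = len(x_treat), len(x_ctrl)
    pooled_sd = np.sqrt(((nx - 1) * np.var(x_treat, ddof=1) +
                         (ny - 1) * np.var(x_ctrl, ddof=1)) / (nx + ny - 2))
    return (np.mean(x_treat) - np.mean(x_ctrl)) / pooled_sd

treat = np.array([72, 68, 75, 80, 77], dtype=float)   # hypothetical post-test scores
ctrl = np.array([65, 60, 70, 66, 68], dtype=float)
print(normalized_gain(40.0, 70.0), cohens_d(treat, ctrl))
```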
Winking, Jeffrey; Stieglitz, Jonathan; Kurten, Jenna; Kaplan, Hillard; Gurven, Michael
2013-01-01
The polygyny–fertility hypothesis states that polygyny is associated with reduced fertility for women and is supported by a large body of literature. This finding is important, because theoretical models of polygyny often differentiate systems based on the degree to which women are forced or willingly choose to enter polygynous marriages. The fact that polygyny tends to be associated with reduced fertility has been presented as evidence that polygyny is often less favourable for women, and that women must, therefore, be pressured into accepting such arrangements. Previous studies, however, have been hampered by the non-random assignment of women into monogamous and polygynous unions (i.e. self-selection), as differences between these groups of women might explain some of the effects. Furthermore, the vast majority of such studies focus on sub-Saharan populations. We address these problems in our analysis of women's fertility in polygynous marriages among the Tsimane of Bolivia. We offer a more robust method for assessing the impact of polygynous marriage on reproductive outcomes by testing for intra-individual fertility effects among first wives as they transition from monogamous to polygynous marriage. We report a significant link between polygyny and reduced fertility when including all cases of polygyny; however, this association disappears when testing only for intra-individual effects. PMID:23407840
Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.
Vetter, Thomas R; Mascha, Edward J
2018-01-01
Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including their so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal, thereby accepting the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing independent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) intragroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction, and post hoc comparisons, as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate. The Wilcoxon-Mann-Whitney test is instead preferred. When making paired comparisons on data that are ordinal, or continuous but nonnormally distributed, the Wilcoxon signed-rank test can be used. In analyzing their data, researchers should consider the continued merits of these simple yet equally valid unadjusted bivariate statistical tests. However, the appropriate use of an unadjusted bivariate test still requires a solid understanding of its utility, assumptions (requirements), and limitations. This understanding will mitigate the risk of misleading findings, interpretations, and conclusions.
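As a quick companion to the tutorial's recommendations, the sketch below maps the tests named above to common SciPy calls on small hypothetical samples; it is an illustration, not the tutorial's own code.

```python
# Quick mapping of the tests discussed above to common SciPy calls.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
g1, g2 = rng.normal(10, 2, 30), rng.normal(11, 2, 30)       # two independent groups
pre, post = rng.normal(10, 2, 30), rng.normal(10.5, 2, 30)  # paired measurements
table = np.array([[20, 10], [15, 15]])                      # 2x2 categorical counts

print(stats.ttest_ind(g1, g2))        # unpaired (independent-samples) t test
print(stats.ttest_rel(pre, post))     # paired t test
print(stats.chi2_contingency(table))  # Pearson chi-square test of independence
print(stats.mannwhitneyu(g1, g2))     # Wilcoxon-Mann-Whitney (ordinal/nonnormal)
print(stats.wilcoxon(pre, post))      # Wilcoxon signed-rank (paired nonnormal)
```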
Kairaluoma, L; Närhi, V; Ahonen, T; Westerholm, J; Aro, M
2009-01-01
There are claims that dietary supplementation of unsaturated fatty acids could help children with dyslexia to overcome their reading problems. However, these claims have not yet been empirically tested. This study was designed to test whether dietary supplementation was superior to placebo in treating reading, spelling or other reading-related skills of children with dyslexia. The experimental group (eicosapentaenoic acid, EPA, n = 30) ate dietary supplements and the control group (placebo, n = 31) placebos during the 90-day treatment period. The supplements contained omega-3 fatty acid (ethyl-EPA, 500 mg/day) and carnosine (400 mg/day). The groups were matched for reading skills, grade, gender, attention problems, intelligence and amount of special education. The literacy-related skills of the two groups were assessed before and after the treatment period. No group differences were observed between EPA and placebo in measures of reading accuracy or speed, spelling, decoding fluency, arithmetical skills, reading-related language skills, attention or behavioural problems. The present findings do not support the hypothesis that omega-3 fatty acid (ethyl-EPA) or carnosine has a role in the treatment of reading and spelling problems in children with dyslexia.
Appleyard, Karen; Berlin, Lisa J.; Rosanbalm, Katherine D.; Dodge, Kenneth A.
2013-01-01
In the interest of improving child maltreatment prevention science, this longitudinal, community based study of 499 mothers and their infants tested the hypothesis that mothers’ childhood history of maltreatment would predict maternal substance use problems, which in turn would predict offspring victimization. Mothers (35% White/non-Latina, 34% Black/non-Latina, 23% Latina, 7% other) were recruited and interviewed during pregnancy, and child protective services records were reviewed for the presence of the participants’ target infants between birth and age 26 months. Mediating pathways were examined through structural equation modeling and tested using the products of the coefficients approach. The mediated pathway from maternal history of sexual abuse to substance use problems to offspring victimization was significant (standardized mediated path [ab]=.07, 95% CI [.02, .14]; effect size=.26), as was the mediated pathway from maternal history of physical abuse to substance use problems to offspring victimization (standardized mediated path [ab]=.05, 95% CI [.01, .11]; effect size =.19). There was no significant mediated pathway from maternal history of neglect. Findings are discussed in terms of specific implications for child maltreatment prevention, including the importance of assessment and early intervention for maternal history of maltreatment and substance use problems, targeting women with maltreatment histories for substance use services, and integrating child welfare and parenting programs with substance use treatment. PMID:21240556
Appleyard, Karen; Berlin, Lisa J; Rosanbalm, Katherine D; Dodge, Kenneth A
2011-06-01
In the interest of improving child maltreatment prevention science, this longitudinal, community based study of 499 mothers and their infants tested the hypothesis that mothers' childhood history of maltreatment would predict maternal substance use problems, which in turn would predict offspring victimization. Mothers (35% White/non-Latina, 34% Black/non-Latina, 23% Latina, 7% other) were recruited and interviewed during pregnancy, and child protective services records were reviewed for the presence of the participants' target infants between birth and age 26 months. Mediating pathways were examined through structural equation modeling and tested using the products of the coefficients approach. The mediated pathway from maternal history of sexual abuse to substance use problems to offspring victimization was significant (standardized mediated path [ab] = .07, 95% CI [.02, .14]; effect size = .26), as was the mediated pathway from maternal history of physical abuse to substance use problems to offspring victimization (standardized mediated path [ab] = .05, 95% CI [.01, .11]; effect size = .19). There was no significant mediated pathway from maternal history of neglect. Findings are discussed in terms of specific implications for child maltreatment prevention, including the importance of assessment and early intervention for maternal history of maltreatment and substance use problems, targeting women with maltreatment histories for substance use services, and integrating child welfare and parenting programs with substance use treatment.
Larsen, Randy J; Kasimatis, Margaret; Frey, Kurt
1992-09-01
We examined the hypothesis that muscle contractions in the face influence subjective emotional experience. Previously, researchers have been critical of experiments designed to test this facial feedback hypothesis, particularly in terms of methodological problems that may lead to demand characteristics. In an effort to surmount these methodological problems Strack, Martin, and Stepper (1988) developed an experimental procedure whereby subjects were induced to contract facial muscles involved in the production of an emotional pattern, without being asked to actually simulate an emotion. Specifically, subjects were required to hold a pen in their teeth, which unobtrusively creates a contraction of the zygomaticus major muscles, the muscles involved in the production of a human smile. This manipulation minimises the likelihood that subjects are able to interpret their zygomaticus contractions as representing a particular emotion, thereby preventing subjects from determining the purpose of the experiment. Strack et al. (1988) found support for the facial feedback hypothesis applied to pleasant affect, in that subjects in the pen-in-teeth condition rated humorous cartoons as being funnier than subjects in the control condition (in which zygomaticus contractions were inhibited). The present study represents an extension of this nonobtrusive methodology to an investigation of the facial feedback of unpleasant affect. Consistent with the Strack et al. procedure, we wanted to have subjects furrow their brow without actually instructing them to do so and without asking them to produce any emotional facial pattern at all. This was achieved by attaching two golf tees to the subject's brow region (just above the inside corner of each eye) and then instructing them to touch the tips of the golf tees together as part of a "divided-attention" experiment. Touching the tips of the golf tees together could only be achieved by a contraction of the corrugator supercilii muscles, the muscles involved in the production of a sad emotional facial pattern. Subjects reported significantly more sadness in response to aversive photographs while touching the tips of the golf tees together than under conditions which inhibited corrugator contractions. These results provide evidence, using a new and unobtrusive manipulation, that facial feedback operates for unpleasant affect to a degree similar to that previously found for pleasant affect.
Longitudinal Dimensionality of Adolescent Psychopathology: Testing the Differentiation Hypothesis
ERIC Educational Resources Information Center
Sterba, Sonya K.; Copeland, William; Egger, Helen L.; Costello, E. Jane; Erkanli, Alaattin; Angold, Adrian
2010-01-01
Background: The differentiation hypothesis posits that the underlying liability distribution for psychopathology is of low dimensionality in young children, inflating diagnostic comorbidity rates, but increases in dimensionality with age as latent syndromes become less correlated. This hypothesis has not been adequately tested with longitudinal…
A large scale test of the gaming-enhancement hypothesis.
Przybylski, Andrew K; Wang, John C
2016-01-01
A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what the gaming-enhancement hypothesis predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
Wagner, Matthias Oliver; Bös, Klaus; Jascenoka, Julia; Jekauc, Darko; Petermann, Franz
2012-01-01
The aim of this study was to gain insights into the relationship between developmental coordination disorder, peer problems, and behavioral problems in school-aged children where both internalizing and externalizing behavioral problems were considered. We assumed that the relationship between developmental coordination disorder and internalizing/externalizing problems in school-aged children is mediated by peer problems and tested the hypothesis that a greater degree of motor impairment causes a greater degree of peer problems and thus a greater degree of internalizing or externalizing problems. Seventy boys and girls aged between 5 and 11 years were examined using the Movement Assessment Battery for Children 2 and the Intelligence and Developmental Scales. The results of path analysis showed that the relationship between developmental coordination disorder and internalizing/externalizing problems in school-aged children is mediated at least in part by peer problems. However, the cross-sectional design of the study does not provide conclusive evidence for a cause-effect relationship and only allows for the conservative prognosis that a greater degree of motor impairment may cause a greater degree of peer problems and thus a greater degree of internalizing/externalizing problems. Nevertheless, the results of this study emphasize the importance of being well-integrated in their peer group especially for children with developmental coordination disorder. Copyright © 2012 Elsevier Ltd. All rights reserved.
Measurement of air contamination in different wards of public sector hospital, Sukkur.
Memon, Badaruddin AllahDino; Bhutto, Gul Hassan; Rizvi, Wajid Hussain
2016-11-01
The aim of this study was to evaluate and assess the index of bacterial contamination in different wards of the Public Sector (Teaching) Hospital of Sukkur, Pakistan, and to determine whether or not the air contamination differed statistically from the acceptable level, using active and passive sampling. In addition to this main hypothesis, other investigations included the occurrence of the most common bacteria, whether or not the bacterial contamination in the wards was a persistent problem, and the identification of effective antibiotics against the identified bacteria. A one-sample t-test indicated that the observed contamination was significantly higher than the acceptable level (p < 0.01); a one-way ANOVA suggested that the contamination problem was persistent, as there was no significant difference among the contamination levels observed across the three visits (p > 0.01); and the antibiotic susceptibility test highlighted the sensitivity and resistance levels of antibiotics for the identified bacteria.
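Below is a minimal sketch of the two analyses described above on hypothetical colony counts; the threshold, sample sizes, and values are all assumptions, since the study's data are not reproduced here.

```python
# Sketch of the one-sample t test and one-way ANOVA on hypothetical counts.
import numpy as np
from scipy import stats

acceptable_level = 500.0                          # assumed threshold (e.g., CFU/m^3)
rng = np.random.default_rng(4)
visit1 = rng.normal(750, 80, 12)                  # counts per ward, visit 1
visit2 = rng.normal(760, 80, 12)
visit3 = rng.normal(740, 80, 12)

# One-sample t test: is mean contamination different from the acceptable level?
all_counts = np.concatenate([visit1, visit2, visit3])
print(stats.ttest_1samp(all_counts, acceptable_level))

# One-way ANOVA: does contamination differ across the three visits?
print(stats.f_oneway(visit1, visit2, visit3))
```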
Effect of climate-related mass extinctions on escalation in molluscs
NASA Astrophysics Data System (ADS)
Hansen, Thor A.; Kelley, Patricia H.; Melland, Vicky D.; Graham, Scott E.
1999-12-01
We test the hypothesis that escalated species (e.g., those with antipredatory adaptations such as heavy armor) are more vulnerable to extinctions caused by changes in climate. If this hypothesis is valid, recovery faunas after climate-related extinctions should include significantly fewer species with escalated shell characteristics, and escalated species should undergo greater rates of extinction than nonescalated species. This hypothesis is tested for the Cretaceous-Paleocene, Eocene-Oligocene, middle Miocene, and Pliocene-Pleistocene mass extinctions. Gastropod and bivalve molluscs from the U.S. coastal plain were evaluated for 10 shell characters that confer resistance to predators. Of 40 tests, one supported the hypothesis; highly ornamented gastropods underwent greater levels of Pliocene-Pleistocene extinction than did nonescalated species. All remaining tests were nonsignificant. The hypothesis that escalated species are more vulnerable to climate-related mass extinctions is not supported.
Simulation-based hypothesis testing of high dimensional means under covariance heterogeneity.
Chang, Jinyuan; Zheng, Chao; Zhou, Wen-Xin; Zhou, Wen
2017-12-01
In this article, we study the problem of testing the mean vectors of high dimensional data in both one-sample and two-sample cases. The proposed testing procedures employ maximum-type statistics and parametric bootstrap techniques to compute the critical values. Unlike existing tests that heavily rely on structural conditions on the unknown covariance matrices, the proposed tests allow general covariance structures of the data and therefore enjoy a wide scope of applicability in practice. To enhance the power of the tests against sparse alternatives, we further propose two-step procedures with a preliminary feature screening step. Theoretical properties of the proposed tests are investigated. Through extensive numerical experiments on synthetic data sets and a human acute lymphoblastic leukemia gene expression data set, we illustrate the performance of the new tests and how they may provide assistance in detecting disease-associated gene sets. The proposed methods have been implemented in the R package HDtest and are available on CRAN. © 2017, The International Biometric Society.
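The sketch below conveys only the general idea of a maximum-type statistic with a bootstrap null distribution; it uses a simple nonparametric bootstrap rather than the parametric bootstrap of the proposed tests, and it is not the HDtest package.

```python
# Rough sketch of a max-type one-sample mean test with a bootstrap null
# (simplified, nonparametric variant; not the authors' exact procedure).
import numpy as np

def max_type_one_sample_test(X, n_boot=2000, seed=0):
    """Test H0: mean vector = 0 for an (n x p) data matrix X."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    sd = X.std(axis=0, ddof=1)
    T_obs = np.max(np.abs(np.sqrt(n) * X.mean(axis=0) / sd))

    # Bootstrap under H0: recenter the data, resample rows with replacement.
    Xc = X - X.mean(axis=0)
    T_boot = np.empty(n_boot)
    for b in range(n_boot):
        Xb = Xc[rng.integers(0, n, n)]
        T_boot[b] = np.max(np.abs(np.sqrt(n) * Xb.mean(axis=0) /
                                  Xb.std(axis=0, ddof=1)))
    return T_obs, np.mean(T_boot >= T_obs)        # statistic and bootstrap p-value

# Hypothetical high-dimensional example: n = 40 samples, p = 200 genes.
rng = np.random.default_rng(5)
X = rng.normal(0, 1, (40, 200))
print(max_type_one_sample_test(X))
```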
Shi, Yunfei; Yao, Jiang; Young, Jonathan M.; Fee, Judy A.; Perucchio, Renato; Taber, Larry A.
2014-01-01
The morphogenetic process of cardiac looping transforms the straight heart tube into a curved tube that resembles the shape of the future four-chambered heart. Although great progress has been made in identifying the molecular and genetic factors involved in looping, the physical mechanisms that drive this process have remained poorly understood. Recent work, however, has shed new light on this complicated problem. After briefly reviewing the current state of knowledge, we propose a relatively comprehensive hypothesis for the mechanics of the first phase of looping, termed c-looping, as the straight heart tube deforms into a c-shaped tube. According to this hypothesis, differential hypertrophic growth in the myocardium supplies the main forces that cause the heart tube to bend ventrally, while regional growth and cytoskeletal contraction in the omphalomesenteric veins (primitive atria) and compressive loads exerted by the splanchnopleuric membrane drive rightward torsion. A computational model based on realistic embryonic heart geometry is used to test the physical plausibility of this hypothesis. The behavior of the model is in reasonable agreement with available experimental data from control and perturbed embryos, offering support for our hypothesis. The results also suggest, however, that several other mechanisms contribute secondarily to normal looping, and we speculate that these mechanisms play backup roles when looping is perturbed. Finally, some outstanding questions are discussed for future study. PMID:25161623
Shi, Yunfei; Yao, Jiang; Young, Jonathan M; Fee, Judy A; Perucchio, Renato; Taber, Larry A
2014-01-01
The morphogenetic process of cardiac looping transforms the straight heart tube into a curved tube that resembles the shape of the future four-chambered heart. Although great progress has been made in identifying the molecular and genetic factors involved in looping, the physical mechanisms that drive this process have remained poorly understood. Recent work, however, has shed new light on this complicated problem. After briefly reviewing the current state of knowledge, we propose a relatively comprehensive hypothesis for the mechanics of the first phase of looping, termed c-looping, as the straight heart tube deforms into a c-shaped tube. According to this hypothesis, differential hypertrophic growth in the myocardium supplies the main forces that cause the heart tube to bend ventrally, while regional growth and cytoskeletal contraction in the omphalomesenteric veins (primitive atria) and compressive loads exerted by the splanchnopleuric membrane drive rightward torsion. A computational model based on realistic embryonic heart geometry is used to test the physical plausibility of this hypothesis. The behavior of the model is in reasonable agreement with available experimental data from control and perturbed embryos, offering support for our hypothesis. The results also suggest, however, that several other mechanisms contribute secondarily to normal looping, and we speculate that these mechanisms play backup roles when looping is perturbed. Finally, some outstanding questions are discussed for future study.
ERIC Educational Resources Information Center
Marmolejo-Ramos, Fernando; Cousineau, Denis
2017-01-01
The number of articles showing dissatisfaction with the null hypothesis statistical testing (NHST) framework has been progressively increasing over the years. Alternatives to NHST have been proposed and the Bayesian approach seems to have achieved the highest amount of visibility. In this last part of the special issue, a few alternative…
On computation of p-values in parametric linkage analysis.
Kurbasic, Azra; Hössjer, Ola
2004-01-01
Parametric linkage analysis is usually used to find chromosomal regions linked to a disease (phenotype) that is described with a specific genetic model. This is done by investigating the relations between the disease and genetic markers, that is, well-characterized loci of known position with a clear Mendelian mode of inheritance. Assume we have found an interesting region on a chromosome that we suspect is linked to the disease. Then we want to test the hypothesis of no linkage versus the alternative one of linkage. As a measure we use the maximal lod score Zmax. It is well known that the maximal lod score has asymptotically a (2 ln 10)⁻¹ × (½ χ²₀ + ½ χ²₁) distribution under the null hypothesis of no linkage when only one point (one marker) on the chromosome is studied. In this paper, we show, both by simulations and theoretical arguments, that the null hypothesis distribution of Zmax has no simple form when more than one marker is used (multipoint analysis). In fact, the distribution of Zmax depends on the number of families, their structure, the assumed genetic model, marker denseness, and marker informativity. This means that a constant critical limit of Zmax leads to tests associated with different significance levels. Because of the above-mentioned problems, from the statistical point of view the maximal lod score should be supplemented by a p-value when results are reported. Copyright (c) 2004 S. Karger AG, Basel.
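For the single-marker case quoted above, the asymptotic p-value implied by the mixture distribution can be computed directly; this sketch applies only to the one-point case, since, as the paper shows, the multipoint null distribution has no simple form.

```python
# Asymptotic single-marker p-value implied by the quoted mixture distribution:
# under no linkage, 2*ln(10)*Zmax ~ 0.5*chi2_0 + 0.5*chi2_1.
from math import log
from scipy.stats import chi2

def lod_pvalue(z_max):
    if z_max <= 0:
        return 1.0
    return 0.5 * chi2.sf(2 * log(10) * z_max, df=1)

print(lod_pvalue(3.0))   # the classical lod threshold of 3 gives p of roughly 1e-4
```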
Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis
NASA Astrophysics Data System (ADS)
Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.
As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging in nature. In practice, it is often seen that assumptions of underlying linearity and/or Gaussianity are used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are, at times, detrimental to tracking data and provide statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions of the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided. This is done by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss computational limitations that hinder proper analysis of large breakup events.
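For readers unfamiliar with SMC, the following is a minimal, generic bootstrap particle filter on a one-dimensional toy target; it is not the paper's multi-hypothesis tracker, and the dynamics, noise levels, and resampling scheme are assumptions chosen purely for illustration.

```python
# Minimal bootstrap particle filter sketch (generic SMC, toy 1-D target).
import numpy as np

rng = np.random.default_rng(6)
N, T, dt = 500, 40, 1.0
q, r = 0.5, 2.0                                   # process / measurement noise std

# Simulate a "true" target (position, velocity) and noisy position measurements.
x_true = np.zeros((T, 2)); x_true[0] = [0.0, 1.0]
for t in range(1, T):
    x_true[t, 1] = x_true[t-1, 1] + q * rng.normal() * dt
    x_true[t, 0] = x_true[t-1, 0] + x_true[t, 1] * dt
z = x_true[:, 0] + r * rng.normal(size=T)

# SMC loop: propagate particles, weight by likelihood, resample.
particles = rng.normal([0.0, 1.0], [5.0, 1.0], size=(N, 2))
estimates = []
for t in range(T):
    particles[:, 1] += q * rng.normal(size=N) * dt
    particles[:, 0] += particles[:, 1] * dt
    w = np.exp(-0.5 * ((z[t] - particles[:, 0]) / r) ** 2)
    w /= w.sum()
    estimates.append(w @ particles[:, 0])          # weighted position estimate
    particles = particles[rng.choice(N, N, p=w)]   # multinomial resampling

print(np.round(estimates[-5:], 2), np.round(x_true[-5:, 0], 2))
```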
NASA Astrophysics Data System (ADS)
Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.
2011-01-01
Annual maximum (AM) time series are incomplete (i.e., censored) when no events below the assumed censoring threshold (i.e., magnitude of completeness) are included. We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
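Below is a rough sketch of how Monte Carlo critical values for a left-censored Gumbel PPCC test could be generated; the plotting positions, censoring treatment, and sample sizes here are assumptions and may differ from the authors' procedure.

```python
# Illustrative Monte Carlo critical values for a left-censored Gumbel PPCC test.
import numpy as np
from scipy import stats

def gumbel_ppcc(sample_above, n_total):
    """PPCC using only the k largest (uncensored) observations out of n_total."""
    k = len(sample_above)
    x = np.sort(sample_above)
    # Plotting positions i/(n+1) for the top k ranks, mapped to Gumbel quantiles.
    ranks = np.arange(n_total - k + 1, n_total + 1)
    q = stats.gumbel_r.ppf(ranks / (n_total + 1.0))
    return np.corrcoef(x, q)[0, 1]

def critical_value(n_total, censor_frac, alpha=0.05, n_sim=5000, seed=0):
    rng = np.random.default_rng(seed)
    k = int(round(n_total * (1 - censor_frac)))    # number of uncensored values
    ppcc = np.empty(n_sim)
    for s in range(n_sim):
        x = stats.gumbel_r.rvs(size=n_total, random_state=rng)
        ppcc[s] = gumbel_ppcc(np.sort(x)[-k:], n_total)
    return np.quantile(ppcc, alpha)                # reject Gumbel if observed PPCC is smaller

print(critical_value(n_total=60, censor_frac=0.3))
```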
Saraf, Sanatan; Mathew, Thomas; Roy, Anindya
2015-01-01
For the statistical validation of surrogate endpoints, an alternative formulation is proposed for testing Prentice's fourth criterion, under a bivariate normal model. In such a setup, the criterion involves inference concerning an appropriate regression parameter, and the criterion holds if the regression parameter is zero. Testing such a null hypothesis has been criticized in the literature since it can only be used to reject a poor surrogate, and not to validate a good surrogate. In order to circumvent this, an equivalence hypothesis is formulated for the regression parameter, namely the hypothesis that the parameter is equivalent to zero. Such an equivalence hypothesis is formulated as an alternative hypothesis, so that the surrogate endpoint is statistically validated when the null hypothesis is rejected. Confidence intervals for the regression parameter and tests for the equivalence hypothesis are proposed using bootstrap methods and small sample asymptotics, and their performances are numerically evaluated and recommendations are made. The choice of the equivalence margin is a regulatory issue that needs to be addressed. The proposed equivalence testing formulation is also adopted for other parameters that have been proposed in the literature on surrogate endpoint validation, namely, the relative effect and proportion explained.
Test of association: which one is the most appropriate for my study?
Gonzalez-Chica, David Alejandro; Bastos, João Luiz; Duquia, Rodrigo Pereira; Bonamigo, Renan Rangel; Martínez-Mesa, Jeovany
2015-01-01
Hypothesis tests are statistical tools widely used for assessing whether or not there is an association between two or more variables. These tests provide a probability of the type 1 error (p-value), which is used to accept or reject the null study hypothesis. The aim of this article is to provide a practical guide to help researchers carefully select the most appropriate procedure to answer their research question. We discuss the logic of hypothesis testing and present the prerequisites of each procedure based on practical examples.
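As a small illustration of matching the procedure to the data (not an example from the article itself), the Python snippet below compares a chi-square test of association with Fisher's exact test on a 2x2 table and flags when small expected cell counts make the exact test preferable. The counts are made up.

```python
import numpy as np
from scipy import stats

# A 2x2 table of exposure by outcome; the chi-square test is a common default,
# while Fisher's exact test is preferred when expected cell counts are small.
table = np.array([[18, 7],
                  [11, 14]])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
odds_ratio, p_fisher = stats.fisher_exact(table)
small_counts = (expected < 5).any()
print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}, "
      f"small expected counts: {small_counts}")
```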
Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.
Chalmers, R Philip
2018-06-01
This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.
NASA Astrophysics Data System (ADS)
Sliva, Yekaterina
The purpose of this study was to introduce an instructional technique for teaching complex tasks in physics, test its effectiveness and efficiency, and understand cognitive processes taking place in learners' minds while they are exposed to this technique. The study was based primarily on cognitive load theory (CLT). CLT determines the amount of total cognitive load imposed on a learner by a learning task as combined intrinsic (invested in comprehending task complexity) and extraneous (wasteful) cognitive load. Working memory resources associated with intrinsic cognitive load are defined as germane resources caused by element interactivity that lead to learning, in contrast to extraneous working memory resources that are devoted to dealing with extraneous cognitive load. However, the amount of a learner's working memory resources actually devoted to a task depends on how well the learner is engaged in the learning environment. Since total cognitive load has to stay within limits of working memory capacity, both extraneous and intrinsic cognitive load need to be reduced. In order for effective learning to occur, the use of germane cognitive resources should be maximized. In this study, the use of germane resources was maximized for two experimental groups by providing a learning environment that combined a problem-solving procedure with prompts to self-explain, with and without completion problems. The study tested three hypotheses and answered two research questions. The first hypothesis, predicting that experimental treatments would reduce total cognitive load, was not supported. The second hypothesis, predicting that experimental treatments would increase performance, was supported for the self-explanation group only. The third hypothesis, which tested the efficiency measure adopted from Paas and van Merrienboer (1993), was not supported. As for the research question of whether the quality of self-explanations would change with time for the two experimental conditions, it was determined that time had a positive effect on such quality. The research question that investigated learners' attitudes towards the instructions revealed that experimental groups understood the main idea behind the suggested technique and positively reacted to it. The results of the study support the conclusions that (a) prompting learners to self-explain while independently solving problems can increase performance, especially on far transfer questions; (b) better performance is achieved in combination with increased mental effort; (c) self-explanations do not increase time on task; and (d) quality of self-explanations can be improved with time. Results based on the analyses of learners' attitudes further support that learners in the experimental groups understood the main idea behind the suggested techniques and positively reacted to them. The study also raised concern about the application of the efficiency formula to instructional conditions that increase both performance and mental effort in CLT. As a result, an alternative model was suggested to explain the relationship between performance and mental effort based on the Yerkes-Dodson law (1908). Keywords: instructional design, cognitive load, complex tasks, problem-solving, self-explanation.
Hertzog, Christopher; Smith, R Marit; Ariel, Robert
2018-01-01
Background/Study Context: This study evaluated adult age differences in the original three-item Cognitive Reflection Test (CRT; Frederick, 2005, The Journal of Economic Perspectives, 19, 25-42) and an expanded seven-item version of that test (Toplak et al., 2013, Thinking and Reasoning, 20, 147-168). The CRT is a numerical problem-solving test thought to capture a disposition towards either rapid, intuition-based problem solving (Type I reasoning) or a more thoughtful, analytical problem-solving approach (Type II reasoning). Test items are designed to induce heuristically guided errors that can be avoided if using an appropriate numerical representation of the test problems. We evaluated differences between young adults and old adults in CRT performance and correlates of CRT performance. Older adults (ages 60 to 80) were paid volunteers who participated in experiments assessing age differences in self-regulated learning. Young adults (ages 17 to 35) were students participating for pay as part of a project assessing measures of critical thinking skills or as a young comparison group in the self-regulated learning study. There were age differences in the number of CRT correct responses in two independent samples. Results with the original three-item CRT found older adults to have a greater relative proportion of errors based on providing the intuitive lure. However, younger adults actually had a greater proportion of intuitive errors on the long version of the CRT, relative to older adults. Item analysis indicated a much lower internal consistency of CRT items for older adults. These outcomes do not offer full support for the argument that older adults are higher in the use of a "Type I" cognitive style. The evidence was also consistent with an alternative hypothesis that age differences were due to lower levels of numeracy in the older samples. Alternative process-oriented evaluations of how older adults solve CRT items will probably be needed to determine conditions under which older adults manifest an increase in the Type I dispositional tendency to opt for superficial, heuristically guided problem representations in numerical problem-solving tasks.
The Factors Affecting Definition of Research Problems in Educational Technology Researches
ERIC Educational Resources Information Center
Bahçekapili, Ekrem; Bahçekapili, Tugba; Fis Erümit, Semra; Göktas, Yüksel; Sözbilir, Mustafa
2013-01-01
Research problems in a scientific research are formed after a certain process. This process starts with defining a research topic and transforms into a specific research problem or hypothesis. The aim of this study was to examine the way educational technology researchers identify their research problems. To this end, sources that educational…
ERIC Educational Resources Information Center
McNeil, Nicole M.; Rittle-Johnson, Bethany; Hattikudur, Shanta; Petersen, Lori A.
2010-01-01
This study examined if solving arithmetic problems hinders undergraduates' accuracy on algebra problems. The hypothesis was that solving arithmetic problems would hinder accuracy because it activates an operational view of equations, even in educated adults who have years of experience with algebra. In three experiments, undergraduates (N = 184)…
Baka, Łukasz
2015-01-01
The aim of the study was to investigate the direct and indirect - mediated by job burnout - effects of job demands on mental and physical health problems. The Job Demands-Resources model was the theoretical framework of the study. Three job demands were taken into account - interpersonal conflicts at work, organizational constraints and workload. Indicators of mental and physical health problems included depression and physical symptoms, respectively. Three hundred and sixteen Polish teachers from 8 schools participated in the study. The hypotheses were tested with the use of tools measuring job demands (Interpersonal Conflicts at Work, Organizational Constraints, Quantitative Workload), job burnout (the Oldenburg Burnout Inventory), depression (the Beck Hopelessness Scale), and physical symptoms (the Physical Symptoms Inventory). Regression analysis with bootstrapping, using the PROCESS macro of Hayes, was applied. The results partially support the hypotheses. The indirect effect, and to some extent the direct effect, of job demands turned out to be statistically significant. The negative impact of the 3 job demands on mental (hypothesis 1 - H1) and physical (hypothesis 2 - H2) health was mediated by increasing job burnout. Only organizational constraints were directly associated with mental (and not physical) health. The results partially support the notion of the Job Demands-Resources model and provide further insight into processes leading to the low well-being of teachers in the workplace. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
The Contribution of Adolescent Effortful Control to Early Adult Educational Attainment
Véronneau, Marie-Hélène; Racer, Kristina Hiatt; Fosco, Gregory M.; Dishion, Thomas J.
2014-01-01
Effortful control has been proposed as a set of neurocognitive competencies that is relevant to self-regulation and educational attainment (Posner & Rothbart, 2007). This study tested the hypothesis that a multiagent report of adolescents’ effortful control (age 17) would be predictive of academic persistence and educational attainment (age 23–25), after controlling for other established predictors (family factors, problem behavior, grade point average, and substance use). Participants were 997 students recruited in 6th grade from 3 urban public middle schools (53% males; 42.4% European American; 29.2% African American). Consistent with the hypothesis, the unique association of effortful control with future educational attainment was comparable in strength to that of parental education and students’ past grade point average, suggesting that effortful control contributes to this outcome above and beyond well-established predictors. Path coefficients were equivalent across gender and ethnicity (European Americans and African Americans). Effortful control appears to be a core feature of the self-regulatory competencies associated with achievement of educational success in early adulthood. These findings suggest that the promotion of self-regulation in general and effortful control in particular may be an important focus not only for resilience to stress and avoidance of problem behavior, but also for growth in academic competence. PMID:25308996
Effects of Phasor Measurement Uncertainty on Power Line Outage Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Chen; Wang, Jianhui; Zhu, Hao
2014-12-01
Phasor measurement unit (PMU) technology provides an effective tool to enhance the wide-area monitoring systems (WAMSs) in power grids. Although extensive studies have been conducted to develop several PMU applications in power systems (e.g., state estimation, oscillation detection and control, voltage stability analysis, and line outage detection), the uncertainty aspects of PMUs have not been adequately investigated. This paper focuses on quantifying the impact of PMU uncertainty on power line outage detection and identification, in which a limited number of PMUs installed at a subset of buses are utilized to detect and identify the line outage events. Specifically, the line outage detection problem is formulated as a multi-hypothesis test, and a general Bayesian criterion is used for the detection procedure, in which the PMU uncertainty is analytically characterized. We further apply the minimum detection error criterion for the multi-hypothesis test and derive the expected detection error probability in terms of PMU uncertainty. The framework proposed provides fundamental guidance for quantifying the effects of PMU uncertainty on power line outage detection. Case studies are provided to validate our analysis and show how PMU uncertainty influences power line outage detection.
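To illustrate the minimum-error multi-hypothesis decision described above, the Python sketch below applies a MAP rule among candidate outage hypotheses under a Gaussian model of PMU measurement uncertainty. The signatures, priors, and noise level are hypothetical values, not the paper's power system model.

```python
import numpy as np

rng = np.random.default_rng(3)

def map_outage_detector(y, signatures, priors, sigma):
    """Minimum-error (MAP) decision among outage hypotheses, assuming the PMU
    measurement y equals the hypothesis signature plus i.i.d. Gaussian error."""
    log_post = []
    for mu, pi in zip(signatures, priors):
        loglik = -0.5 * np.sum(((y - mu) / sigma) ** 2)   # Gaussian log-likelihood (constants dropped)
        log_post.append(np.log(pi) + loglik)
    return int(np.argmax(log_post))

# Usage: three hypothetical line-outage signatures observed by 4 PMUs
signatures = [np.array([0.0, 0.0, 0.0, 0.0]),     # H0: no outage
              np.array([0.8, -0.3, 0.1, 0.0]),    # H1: outage on line 1
              np.array([0.1, 0.6, -0.4, 0.2])]    # H2: outage on line 2
priors = [0.9, 0.05, 0.05]
truth = 1
y = signatures[truth] + rng.normal(scale=0.2, size=4)   # sigma models PMU uncertainty
print("decided hypothesis:", map_outage_detector(y, signatures, priors, sigma=0.2))
```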
NASA Astrophysics Data System (ADS)
Datta, Nilanjana; Pautrat, Yan; Rouzé, Cambyse
2016-06-01
Quantum Stein's lemma is a cornerstone of quantum statistics and concerns the problem of correctly identifying a quantum state, given the knowledge that it is one of two specific states (ρ or σ). It was originally derived in the asymptotic i.i.d. setting, in which arbitrarily many (say, n) identical copies of the state (ρ⊗n or σ⊗n) are considered to be available. In this setting, the lemma states that, for any given upper bound on the probability αn of erroneously inferring the state to be σ, the probability βn of erroneously inferring the state to be ρ decays exponentially in n, with the rate of decay converging to the relative entropy of the two states. The second order asymptotics for quantum hypothesis testing, which establishes the speed of convergence of this rate of decay to its limiting value, was derived in the i.i.d. setting independently by Tomamichel and Hayashi, and Li. We extend this result to settings beyond i.i.d. Examples of these include Gibbs states of quantum spin systems (with finite-range, translation-invariant interactions) at high temperatures, and quasi-free states of fermionic lattice gases.
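For reference, the second-order expansion referred to above is usually written as follows in the i.i.d. setting (notation mine; a sketch of the standard statement, with D the quantum relative entropy, V the quantum information variance, and Phi the standard normal CDF):

```latex
\[
  -\log \beta_n(\varepsilon)
    \;=\; n\, D(\rho\,\|\,\sigma)
    \;+\; \sqrt{n\, V(\rho\,\|\,\sigma)}\;\Phi^{-1}(\varepsilon)
    \;+\; O(\log n),
\]
\[
  D(\rho\,\|\,\sigma) = \operatorname{Tr}\!\bigl[\rho(\log\rho - \log\sigma)\bigr],
  \qquad
  V(\rho\,\|\,\sigma) = \operatorname{Tr}\!\bigl[\rho(\log\rho - \log\sigma)^2\bigr] - D(\rho\,\|\,\sigma)^2 .
\]
```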
Bayesian adaptive phase II screening design for combination trials.
Cai, Chunyan; Yuan, Ying; Johnson, Valen E
2013-01-01
Trials of combination therapies for the treatment of cancer are playing an increasingly important role in the battle against this disease. To more efficiently handle the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to simultaneously select among possible treatment combinations involving multiple agents. Our design is based on formulating the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During the trial conduct, we use the current values of the posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Simulation studies show that the proposed design substantially outperforms the conventional multiarm balanced factorial trial design. The proposed design yields a significantly higher probability for selecting the best treatment while allocating substantially more patients to efficacious treatments. The proposed design is most appropriate for the trials combining multiple agents and screening out the efficacious combination to be further investigated. The proposed Bayesian adaptive phase II screening design substantially outperformed the conventional complete factorial design. Our design allocates more patients to better treatments while providing higher power to identify the best treatment at the end of the trial.
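The sketch below illustrates the general idea of using posterior probabilities to adaptively allocate patients, with independent Beta-binomial models per combination. This is a simplified stand-in: the paper's design formulates hypotheses about the superiority of each combination rather than this per-arm model, and all counts here are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def posterior_superiority_probs(successes, failures, n_draws=20000):
    """Monte Carlo estimate of P(arm k has the highest response rate | data),
    using independent Beta(1, 1) priors on each arm's response probability."""
    draws = np.column_stack([
        rng.beta(1 + s, 1 + f, size=n_draws) for s, f in zip(successes, failures)
    ])
    best = np.argmax(draws, axis=1)
    return np.bincount(best, minlength=len(successes)) / n_draws

def adaptive_allocation(successes, failures):
    """Allocate the next patient with probability proportional to each combination's
    posterior probability of being best (one simple adaptive rule)."""
    probs = posterior_superiority_probs(successes, failures)
    return rng.choice(len(probs), p=probs), probs

# Usage: interim data for three hypothetical treatment combinations
successes, failures = [4, 8, 6], [10, 6, 8]
arm, probs = adaptive_allocation(successes, failures)
print("posterior 'best arm' probabilities:", np.round(probs, 3), "-> allocate to arm", arm)
```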
Testing the H2O2-H2O hypothesis for life on Mars with the TEGA instrument on the Phoenix lander.
Schulze-Makuch, Dirk; Turse, Carol; Houtkooper, Joop M; McKay, Christopher P
2008-04-01
In the time since the Viking life-detection experiments were conducted on Mars, many missions have enhanced our knowledge about the environmental conditions on the Red Planet. However, the martian surface chemistry and the Viking lander results remain puzzling. Nonbiological explanations that favor a strong inorganic oxidant are currently favored (e.g., Mancinelli, 1989; Plumb et al., 1989; Quinn and Zent, 1999; Klein, 1999; Yen et al., 2000), but problems remain regarding the lifetime, source, and abundance of that oxidant to account for the Viking observations (Zent and McKay, 1994). Alternatively, a hypothesis that favors the biological origin of a strong oxidizer has recently been advanced (Houtkooper and Schulze-Makuch, 2007). Here, we report on laboratory experiments that simulate the experiments to be conducted by the Thermal and Evolved Gas Analyzer (TEGA) instrument of the Phoenix lander, which is to descend on Mars in May 2008. Our experiments provide a baseline for an unbiased test for chemical versus biological responses, which can be applied at the time the Phoenix lander transmits its first results from the martian surface.
NASA Astrophysics Data System (ADS)
Le, Huy Xuan; Matunaga, Saburo
2014-12-01
This paper presents an adaptive unscented Kalman filter (AUKF) to recover the satellite attitude in a fault detection and diagnosis (FDD) subsystem of microsatellites. The FDD subsystem includes a filter and an estimator with residual generators, hypothesis tests for fault detections and a reference logic table for fault isolations and fault recovery. The recovery process is based on the monitoring of mean and variance values of each attitude sensor behaviors from residual vectors. In the case of normal work, the residual vectors should be in the form of Gaussian white noise with zero mean and fixed variance. When the hypothesis tests for the residual vectors detect something unusual by comparing the mean and variance values with dynamic thresholds, the AUKF with real-time updated measurement noise covariance matrix will be used to recover the sensor faults. The scheme developed in this paper resolves the problem of the heavy and complex calculations during residual generations and therefore the delay in the isolation process is reduced. The numerical simulations for TSUBAME, a demonstration microsatellite of Tokyo Institute of Technology, are conducted and analyzed to demonstrate the working of the AUKF and FDD subsystem.
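As an illustration of the residual-monitoring step (not the paper's dynamic thresholds), the Python sketch below flags a sensor when a window of residuals departs from zero-mean, fixed-variance white noise, using a z-test on the mean and a chi-square test on the variance. The nominal standard deviation and data are assumed values.

```python
import numpy as np
from scipy import stats

def residual_fault_test(residuals, sigma0, alpha=0.01):
    """Flag a sensor when the residual window departs from zero-mean, fixed-variance
    white noise: a z-test on the mean and a chi-square test on the variance."""
    n = len(residuals)
    z = residuals.mean() / (sigma0 / np.sqrt(n))
    chi2 = (n - 1) * residuals.var(ddof=1) / sigma0 ** 2
    mean_fault = abs(z) > stats.norm.ppf(1 - alpha / 2)
    var_fault = (chi2 > stats.chi2.ppf(1 - alpha / 2, n - 1)) or \
                (chi2 < stats.chi2.ppf(alpha / 2, n - 1))
    return mean_fault, var_fault

# Usage: a healthy window versus a biased (faulty) window of attitude-sensor residuals
rng = np.random.default_rng(5)
healthy = rng.normal(0.0, 0.02, size=100)
faulty = rng.normal(0.05, 0.02, size=100)
print("healthy:", residual_fault_test(healthy, sigma0=0.02))
print("faulty: ", residual_fault_test(faulty, sigma0=0.02))
```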
Metin, Baris; Wiersema, Jan R; Verguts, Tom; Gasthuys, Roos; van Der Meere, Jacob J; Roeyers, Herbert; Sonuga-Barke, Edmund
2016-01-01
According to the state regulation deficit (SRD) account, ADHD is associated with a problem using effort to maintain an optimal activation state under demanding task settings such as very fast or very slow event rates. This leads to a prediction of disrupted performance at event rate extremes reflected in higher Gaussian response variability that is a putative marker of activation during motor preparation. In the current study, we tested this hypothesis using ex-Gaussian modeling, which distinguishes Gaussian from non-Gaussian variability. Twenty-five children with ADHD and 29 typically developing controls performed a simple Go/No-Go task under four different event-rate conditions. There was an accentuated quadratic relationship between event rate and Gaussian variability in the ADHD group compared to the controls. The children with ADHD had greater Gaussian variability at very fast and very slow event rates but not at moderate event rates. The results provide evidence for the SRD account of ADHD. However, given that this effect did not explain all group differences (some of which were independent of event rate) other cognitive and/or motivational processes are also likely implicated in ADHD performance deficits.
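Ex-Gaussian modeling of response times can be done with standard tools. The sketch below fits an exponentially modified Gaussian with SciPy and reports mu, sigma (the Gaussian variability of interest above), and tau (the exponential component); the reaction-time data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def fit_ex_gaussian(rt):
    """Fit an ex-Gaussian to response times and return (mu, sigma, tau), where sigma
    captures Gaussian variability and tau the exponential (non-Gaussian) tail."""
    K, loc, scale = stats.exponnorm.fit(rt)
    return loc, scale, K * scale       # mu, sigma, tau

# Usage: synthetic response times (seconds) = Gaussian component + exponential tail
rt = rng.normal(0.45, 0.05, size=500) + rng.exponential(0.12, size=500)
mu, sigma, tau = fit_ex_gaussian(rt)
print(f"mu={mu:.3f}s  sigma={sigma:.3f}s  tau={tau:.3f}s")
```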
How scientific experiments are designed: Problem solving in a knowledge-rich, error-rich environment
NASA Astrophysics Data System (ADS)
Baker, Lisa M.
While theory formation and the relation between theory and data has been investigated in many studies of scientific reasoning, researchers have focused less attention on reasoning about experimental design, even though the experimental design process makes up a large part of real-world scientists' reasoning. The goal of this thesis was to provide a cognitive account of the scientific experimental design process by analyzing experimental design as problem-solving behavior (Newell & Simon, 1972). Three specific issues were addressed: the effect of potential error on experimental design strategies, the role of prior knowledge in experimental design, and the effect of characteristics of the space of alternate hypotheses on alternate hypothesis testing. A two-pronged in vivo/in vitro research methodology was employed, in which transcripts of real-world scientific laboratory meetings were analyzed as well as undergraduate science and non-science majors' design of biology experiments in the psychology laboratory. It was found that scientists use a specific strategy to deal with the possibility of error in experimental findings: they include "known" control conditions in their experimental designs both to determine whether error is occurring and to identify sources of error. The known controls strategy had not been reported in earlier studies with science-like tasks, in which participants' responses to error had consisted of replicating experiments and discounting results. With respect to prior knowledge: scientists and undergraduate students drew on several types of knowledge when designing experiments, including theoretical knowledge, domain-specific knowledge of experimental techniques, and domain-general knowledge of experimental design strategies. Finally, undergraduate science students generated and tested alternates to their favored hypotheses when the space of alternate hypotheses was constrained and searchable. This result may help explain findings of confirmation bias in earlier studies using science-like tasks, in which characteristics of the alternate hypothesis space may have made it unfeasible for participants to generate and test alternate hypotheses. In general, scientists and science undergraduates were found to engage in a systematic experimental design process that responded to salient features of the problem environment, including the constant potential for experimental error, availability of alternate hypotheses, and access to both theoretical knowledge and knowledge of experimental techniques.
Niknafs, Noushin; Beleva-Guthrie, Violeta; Naiman, Daniel Q.; Karchin, Rachel
2015-01-01
Recent improvements in next-generation sequencing of tumor samples and the ability to identify somatic mutations at low allelic fractions have opened the way for new approaches to model the evolution of individual cancers. The power and utility of these models is increased when tumor samples from multiple sites are sequenced. Temporal ordering of the samples may provide insight into the etiology of both primary and metastatic lesions and rationalizations for tumor recurrence and therapeutic failures. Additional insights may be provided by temporal ordering of evolving subclones—cellular subpopulations with unique mutational profiles. Current methods for subclone hierarchy inference tightly couple the problem of temporal ordering with that of estimating the fraction of cancer cells harboring each mutation. We present a new framework that includes a rigorous statistical hypothesis test and a collection of tools that make it possible to decouple these problems, which we believe will enable substantial progress in the field of subclone hierarchy inference. The methods presented here can be flexibly combined with methods developed by others addressing either of these problems. We provide tools to interpret hypothesis test results, which inform phylogenetic tree construction, and we introduce the first genetic algorithm designed for this purpose. The utility of our framework is systematically demonstrated in simulations. For most tested combinations of tumor purity, sequencing coverage, and tree complexity, good power (≥ 0.8) can be achieved and Type 1 error is well controlled when at least three tumor samples are available from a patient. Using data from three published multi-region tumor sequencing studies of (murine) small cell lung cancer, acute myeloid leukemia, and chronic lymphocytic leukemia, in which the authors reconstructed subclonal phylogenetic trees by manual expert curation, we show how different configurations of our tools can identify either a single tree in agreement with the authors, or a small set of trees, which include the authors’ preferred tree. Our results have implications for improved modeling of tumor evolution and the importance of multi-region tumor sequencing. PMID:26436540
Dynamic testing in schizophrenia: does training change the construct validity of a test?
Wiedl, Karl H; Schöttke, Henning; Green, Michael F; Nuechterlein, Keith H
2004-01-01
Dynamic testing typically involves specific interventions for a test to assess the extent to which test performance can be modified, beyond level of baseline (static) performance. This study used a dynamic version of the Wisconsin Card Sorting Test (WCST) that is based on cognitive remediation techniques within a test-training-test procedure. From results of previous studies with schizophrenia patients, we concluded that the dynamic and static versions of the WCST should have different construct validity. This hypothesis was tested by examining the patterns of correlations with measures of executive functioning, secondary verbal memory, and verbal intelligence. Results demonstrated a specific construct validity of WCST dynamic (i.e., posttest) scores as an index of problem solving (Tower of Hanoi) and secondary verbal memory and learning (Auditory Verbal Learning Test), whereas the impact of general verbal capacity and selective attention (Verbal IQ, Stroop Test) was reduced. It is concluded that the construct validity of the test changes with dynamic administration and that this difference helps to explain why the dynamic version of the WCST predicts functional outcome better than the static version.
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
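As a brief worked example of the kind of power consideration the article advocates (not drawn from the article itself), the snippet below uses statsmodels to find the per-group sample size needed to detect a medium effect in a two-sample t test.

```python
from statsmodels.stats.power import TTestIndPower

# Sample size per group to detect a medium effect (Cohen's d = 0.5)
# with alpha = 0.05 and 80% power in a two-sided two-sample t test.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                   alternative='two-sided')
print(round(n_per_group, 1))   # roughly 64 per group
```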
The Relation between Parental Values and Parenting Behavior: A Test of the Kohn Hypothesis.
ERIC Educational Resources Information Center
Luster, Tom; And Others
1989-01-01
Used data on 65 mother-infant dyads to test Kohn's hypothesis concerning the relation between values and parenting behavior. Findings support Kohn's hypothesis that parents who value self-direction would emphasize supportive function of parenting and parents who value conformity would emphasize their obligations to impose restraints. (Author/NB)
Cognitive Biases in the Interpretation of Autonomic Arousal: A Test of the Construal Bias Hypothesis
ERIC Educational Resources Information Center
Ciani, Keith D.; Easter, Matthew A.; Summers, Jessica J.; Posada, Maria L.
2009-01-01
According to Bandura's construal bias hypothesis, derived from social cognitive theory, persons with the same heightened state of autonomic arousal may experience either pleasant or deleterious emotions depending on the strength of perceived self-efficacy. The current study tested this hypothesis by proposing that college students' preexisting…
Overgaard, Morten; Lindeløv, Jonas; Svejstrup, Stinna; Døssing, Marianne; Hvid, Tanja; Kauffmann, Oliver; Mouridsen, Kim
2013-01-01
This paper reports an experiment intended to test a particular hypothesis derived from blindsight research, which we name the “source misidentification hypothesis.” According to this hypothesis, a subject may be correct about a stimulus without being correct about how she had access to this knowledge (whether the stimulus was visual, auditory, or something else). We test this hypothesis in healthy subjects, asking them to report whether a masked stimulus was presented auditorily or visually, what the stimulus was, and how clearly they experienced the stimulus using the Perceptual Awareness Scale (PAS). We suggest that knowledge about perceptual modality may be a necessary precondition in order to issue correct reports of which stimulus was presented. Furthermore, we find that PAS ratings correlate with correctness, and that subjects are at chance level when reporting no conscious experience of the stimulus. To demonstrate that particular levels of reporting accuracy are obtained, we employ a statistical strategy, which operationally tests the hypothesis of non-equality, such that the usual rejection of the null-hypothesis admits the conclusion of equivalence. PMID:23508677
A large scale test of the gaming-enhancement hypothesis
Wang, John C.
2016-01-01
A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work. PMID:27896035
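One simple way to quantify evidence for a null over an effect, in the spirit of the Bayesian comparison described above, is the BIC approximation to the Bayes factor. The regression models and simulated data below are illustrative, not the study's actual analysis.

```python
import numpy as np

def bic_gaussian(y, X):
    """BIC of an ordinary least-squares regression with Gaussian errors."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

def approx_bf01(y, X_null, X_full):
    """Approximate Bayes factor in favour of the null (no-effect) model,
    using the BIC approximation BF01 ~ exp((BIC_full - BIC_null) / 2)."""
    return np.exp((bic_gaussian(y, X_full) - bic_gaussian(y, X_null)) / 2)

# Usage: does 'hours of gaming' add anything to predicting a reasoning score?
rng = np.random.default_rng(7)
n = 1847
gaming = rng.exponential(5.0, size=n)
reasoning = 100 + rng.normal(0, 15, size=n)       # simulated under the null: no gaming effect
X_null = np.ones((n, 1))
X_full = np.column_stack([np.ones(n), gaming])
print("BF01 ~", round(approx_bf01(reasoning, X_null, X_full), 2))
```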
Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Tseytlin, Eugene; Roh, Ellen; Jukic, Drazen
2007-01-01
Objective Determine effects of computer-based tutoring on diagnostic performance gains, meta-cognition, and acceptance using two different problem representations. Describe impact of tutoring on spectrum of diagnostic skills required for task performance. Identify key features of student-tutor interaction contributing to learning gains. Design Prospective, between-subjects study, controlled for participant level of training. Resident physicians in two academic pathology programs spent four hours using one of two interfaces which differed mainly in external problem representation. The case-focused representation provided an open-learning environment in which students were free to explore evidence-hypothesis relationships within a case, but could not visualize the entire diagnostic space. The knowledge-focused representation provided an interactive representation of the entire diagnostic space, which more tightly constrained student actions. Measurements Metrics included results of pretest, post-test and retention-test for multiple choice and case diagnosis tests, ratios of performance to student reported certainty, results of participant survey, learning curves, and interaction behaviors during tutoring. Results Students had highly significant learning gains after one tutoring session. Learning was retained at one week. There were no differences between the two interfaces in learning gains on post-test or retention test. Only students in the knowledge-focused interface exhibited significant metacognitive gains from pretest to post-test and pretest to retention test. Students rated the knowledge-focused interface significantly higher than the case-focused interface. Conclusions Cognitive tutoring is associated with improved diagnostic performance in a complex medical domain. The effect is retained at one-week post-training. Knowledge-focused external problem representation shows an advantage over case-focused representation for metacognitive effects and user acceptance. PMID:17213494
ERIC Educational Resources Information Center
SAW, J.G.
This paper deals with some tests of hypothesis frequently encountered in the analysis of multivariate data. The type of hypothesis considered is that which the statistician can answer in the negative or affirmative. The Doolittle method makes it possible to evaluate the determinant of a matrix of high order, to solve a matrix equation, or to…
Kruschke, John K; Liddell, Torrin M
2018-02-01
In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
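As a minimal illustration of estimation with quantified uncertainty in the Bayesian sense discussed here, the snippet below computes a posterior mean and an equal-tailed credible interval for a proportion under a conjugate Beta prior; the counts are made up.

```python
import numpy as np
from scipy import stats

def beta_binomial_summary(successes, n, ci=0.95, prior=(1.0, 1.0)):
    """Posterior mean and equal-tailed credible interval for a proportion under a
    conjugate Beta prior, i.e. estimation with quantified uncertainty rather than
    a point-null significance test."""
    a, b = prior[0] + successes, prior[1] + n - successes
    lo, hi = stats.beta.ppf([(1 - ci) / 2, 1 - (1 - ci) / 2], a, b)
    return a / (a + b), (lo, hi)

# Usage: 37 'successes' out of 120 trials
mean, (lo, hi) = beta_binomial_summary(37, 120)
print(f"posterior mean = {mean:.3f}, 95% credible interval = ({lo:.3f}, {hi:.3f})")
```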
Investigating the Effects of Exam Length on Performance and Cognitive Fatigue
Jensen, Jamie L.; Berry, Dane A.; Kummer, Tyler A.
2013-01-01
This study examined the effects of exam length on student performance and cognitive fatigue in an undergraduate biology classroom. Exams tested higher order thinking skills. To test our hypothesis, we administered standard- and extended-length high-level exams to two populations of non-majors biology students. We gathered exam performance data between conditions as well as performance on the first and second half of exams within conditions. We showed that lengthier exams led to better performance on assessment items shared between conditions, possibly lending support to the spreading activation theory. It also led to greater performance on the final exam, lending support to the testing effect in creative problem solving. Lengthier exams did not result in lower performance due to fatiguing conditions, although students perceived subjective fatigue. Implications of these findings are discussed with respect to assessment practices. PMID:23950918
SAR-based change detection using hypothesis testing and Markov random field modelling
NASA Astrophysics Data System (ADS)
Cao, W.; Martinis, S.
2015-04-01
The objective of this study is to automatically detect changed areas caused by natural disasters from bi-temporal co-registered and calibrated TerraSAR-X data. The technique in this paper consists of two steps: Firstly, an automatic coarse detection step is applied based on a statistical hypothesis test for initializing the classification. The original analytical formula as proposed in the constant false alarm rate (CFAR) edge detector is reviewed and rewritten in a compact form of the incomplete beta function, which is a built-in routine in commercial scientific software such as MATLAB and IDL. Secondly, a post-classification step is introduced to optimize the noisy classification result in the previous step. Generally, an optimization problem can be formulated as a Markov random field (MRF) on which the quality of a classification is measured by an energy function. The optimal classification based on the MRF is related to the lowest energy value. Previous studies provide methods for the optimization problem using MRFs, such as the iterated conditional modes (ICM) algorithm. Recently, a novel algorithm was presented based on graph-cut theory. This method transforms an MRF to an equivalent graph and solves the optimization problem by a max-flow/min-cut algorithm on the graph. In this study this graph-cut algorithm is applied iteratively to improve the coarse classification. At each iteration the parameters of the energy function for the current classification are set by the logarithmic probability density function (PDF). The relevant parameters are estimated by the method of logarithmic cumulants (MoLC). Experiments are performed on two flood events in Germany and Australia in 2011 and a forest fire on La Palma in 2009, using pre- and post-event TerraSAR-X data. The results show convincing coarse classifications and considerable improvement by the graph-cut post-classification step.
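The incomplete beta function mentioned above is readily available in scientific libraries. As a hedged illustration (a standard gamma-ratio construction, not necessarily the exact CFAR formula of the paper), the snippet below turns a pre/post multilook intensity ratio into a p-value under a no-change null; the intensities and look numbers are assumed values.

```python
import numpy as np
from scipy.special import betainc

def ratio_change_pvalue(mean_a, mean_b, looks_a, looks_b):
    """Two-sided p-value for the intensity ratio of two multilook SAR measurements
    under the no-change null (equal underlying intensity). With gamma-distributed
    multilook intensities of equal scale, X/(X+Y) is Beta(looks_a, looks_b), so the
    CDF is the regularized incomplete beta function."""
    t = (looks_a * mean_a) / (looks_a * mean_a + looks_b * mean_b)
    cdf = betainc(looks_a, looks_b, t)     # regularized incomplete beta function
    return 2.0 * min(cdf, 1.0 - cdf)

# Usage: pre- and post-event pixel intensities averaged over homogeneous windows
print(ratio_change_pvalue(mean_a=0.08, mean_b=0.21, looks_a=9, looks_b=9))
```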
Reconciling Paleomagnetism and Pangea
NASA Astrophysics Data System (ADS)
Domeier, Mathew M.
Although an array of geological and geophysical data support the conventional (Wegenerian) paleogeographic model of Pangea in the Late Triassic–Early Jurassic, its configuration in pre-Late Triassic time has remained controversial for the last half-century. Late Carboniferous to Middle Triassic paleomagnetic data have been repeatedly shown to be incompatible with the conventional model, leading to alternative paleogeographic reconstructions, built to accommodate the paleomagnetic records. However, these models invariably require dubious tectonic transformations that lack supporting evidence in the form of structural relics. An altogether different explanation for the model-data incongruity invokes significant non-dipole geomagnetic fields, but this undermines a core assumption in paleomagnetism: the geocentric axial dipole hypothesis. As paleomagnetic analysis is the only quantitative method for determining paleolatitude in pre-Cretaceous time, this persisting discrepancy between the conventional model and the paleomagnetic data has come to be a first-order problem in tectonics and paleomagnetism. This dissertation explores the third and final hypothetical solution to this problem: that the discrepancy is due to systemic bias in the paleomagnetic data. This hypothesis is tested by collecting new, high-quality Permian and Triassic paleomagnetic data from Laurussia and Gondwana, by conducting tests for quality and bias on the published paleomagnetic data, and by re-evaluating Pangea constructions in light of these findings. It is established that with use of accurate Euler parameters and high-fidelity paleomagnetic data, the conventional paleogeographic model can be reconciled with the Carboniferous–Middle Triassic paleomagnetic record. The findings of this dissertation thus imply that neither alternative reconstructions nor significant non-dipole magnetic fields need to be invoked to resolve this long-standing problem. Furthermore, the documentation of systemic bias in the studied paleomagnetic data has broader implications for paleomagnetism and derivative work; namely, that erroneously shallow inclinations (in sediments), among other forms of bias, are likely to be pervasive in the present paleomagnetic data.
What Causes Birth Order-Intelligence Patterns? The Admixture Hypothesis, Revived.
ERIC Educational Resources Information Center
Rodgers, Joseph Lee
2001-01-01
Describes why birth order interests both parents and researchers, discussing what really causes apparent birth order effects on intelligence, examining problems with using cross-sectional intelligence data, and noting how to move beyond cross-sectional inferences. Explains the admixture hypothesis, which finds that family size is much more…
Observation of genetic relation among new phenomena Geminion, Chiron and mini-Centauro
NASA Technical Reports Server (NTRS)
1985-01-01
The threshold energy problem of exotic type interactions is discussed on the basis of available information from the Chacaltaya emulsion chamber experiment. The genetic hypothesis is proposed as a working hypothesis to explain the discrepancy seen in cosmic ray study and CERN p bar -p collider experiments.
Hypothesis Sampling Systems among Preoperational and Concrete Operational Kindergarten Children
ERIC Educational Resources Information Center
Gholson, Barry; And Others
1976-01-01
Preoperational and concrete operational kindergarten children received stimulus differentiation training, either with or without feedback, and then a series of discrimination learning problems in which a blank trial probe was used to detect a child's hypothesis after each feedback trial. Piagetian stage theory requires elaboration to account…
Young, Anna M.; Cordier, Breanne; Mundry, Roger; Wright, Timothy F.
2014-01-01
In many social species, group members share acoustically similar calls. Functional hypotheses have been proposed for call sharing, but previous studies have been limited by an inability to distinguish among these hypotheses. We examined the function of vocal sharing in female budgerigars with a two-part experimental design that allowed us to distinguish between two functional hypotheses. The social association hypothesis proposes that shared calls help animals mediate affiliative and aggressive interactions, while the password hypothesis proposes that shared calls allow animals to distinguish group identity and exclude nonmembers. We also tested the labeling hypothesis, a mechanistic explanation which proposes that shared calls are used to address specific individuals within the sender–receiver relationship. We tested the social association hypothesis by creating four-member flocks of unfamiliar female budgerigars (Melopsittacus undulatus) and then monitoring the birds' calls, social behaviors, and stress levels via fecal glucocorticoid metabolites. We tested the password hypothesis by moving immigrants into established social groups. To test the labeling hypothesis, we conducted additional recording sessions in which individuals were paired with different group members. The social association hypothesis was supported by the development of multiple shared call types in each cage and a correlation between the number of shared call types and the number of aggressive interactions between pairs of birds. We also found support for calls serving as a labeling mechanism using discriminant function analysis with a permutation procedure. Our results did not support the password hypothesis, as there was no difference in stress or directed behaviors between immigrant and control birds. PMID:24860236
Learning Problem-Solving Rules as Search through a Hypothesis Space
ERIC Educational Resources Information Center
Lee, Hee Seung; Betts, Shawn; Anderson, John R.
2016-01-01
Learning to solve a class of problems can be characterized as a search through a space of hypotheses about the rules for solving these problems. A series of four experiments studied how different learning conditions affected the search among hypotheses about the solution rule for a simple computational problem. Experiment 1 showed that a problem…
Medeiros, Patrícia Muniz de; Ferreira Júnior, Washington Soares; Ramos, Marcelo Alves; Silva, Taline Cristina da; Ladio, Ana Haydée; Albuquerque, Ulysses Paulino
2017-01-01
Efforts have been made to understand the processes that lead to the introduction of exotic species into local pharmacopoeias. Among those efforts, the diversification hypothesis predicts that exotic plants are introduced in local medical systems to amplify the repertoire of knowledge related to the treatment of diseases, filling blanks that were not occupied by native species. Based on such hypothesis, this study aimed to contribute to this discussion using the context of local Brazilian populations. We performed a systematic review of Brazilian studies up to 2011 involving medicinal plants, excluding those studies that presented a high risk of bias (because of sampling or plant identification problems). An analysis of similarities (ANOSIM) was conducted in different scales to test for differences in the repertoire of therapeutic indications treated using native and exotic species. We have found that although there is some overlap between native and exotic plants regarding their therapeutic indications and the body systems (BSs) that they treat, there are clear gaps present, that is, there are therapeutic indications and BSs treated that are exclusive to exotic species. This scenario enables the postulation of two alternative unfoldings of the diversification hypothesis, namely, (1) exotic species are initially introduced to fill gaps and undergo subsequent expansion of their use for medical purposes already addressed using native species and (2) exotic species are initially introduced to address problems already addressed using native species to diversify the repertoire of medicinal plants and to increase the resilience of medical systems. The reasons why exotic species may have a competitive advantage over the native ones, the implications of the introduction of exotic species for the resilience of medical systems, and the contexts in which autochthonous plants can gain strength to remain in pharmacopoeias are also discussed.
Phase II design with sequential testing of hypotheses within each stage.
Poulopoulou, Stavroula; Karlis, Dimitris; Yiannoutsos, Constantin T; Dafni, Urania
2014-01-01
The main goal of a Phase II clinical trial is to decide whether a particular therapeutic regimen is effective enough to warrant further study. The hypothesis tested by Fleming's Phase II design (Fleming, 1982) is H0: p ≤ p0 versus H1: p ≥ p1, with level α and with power 1 − β at p = p1, where p0 is chosen to represent the response probability achievable with standard treatment and p1 is chosen such that the difference p1 − p0 represents a targeted improvement with the new treatment. This hypothesis creates a misinterpretation, mainly among clinicians, that rejection of the null hypothesis is tantamount to accepting the alternative, and vice versa. As mentioned by Storer (1992), this introduces ambiguity in the evaluation of type I and II errors and the choice of the appropriate decision at the end of the study. Instead of testing this hypothesis, an alternative class of designs is proposed in which two hypotheses are tested sequentially. The hypothesis H0: p ≤ p0 versus H1: p > p0 is tested first. If this null hypothesis is rejected, the hypothesis H0: p ≤ p1 versus H1: p > p1 is tested next, in order to examine whether the therapy is effective enough to consider further testing in a Phase III study. For the derivation of the proposed design the exact binomial distribution is used to calculate the decision cut-points. The optimal design parameters are chosen so as to minimize the average sample number (ASN) under specific upper bounds for error levels. The optimal values for the design were found using a simulated annealing method.
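To make the exact-binomial ingredient concrete, the Python sketch below computes a single-stage rejection cut-point of the kind such designs build on; the sample size, p0, and p1 are illustrative, and the paper's sequential two-hypothesis structure and ASN optimization are not reproduced.

```python
from scipy.stats import binom

def min_responses_to_reject(n, p_null, alpha=0.05):
    """Smallest number of responses r such that P(X >= r | p_null) <= alpha,
    i.e. the exact binomial cut-point for rejecting H0: p <= p_null."""
    for r in range(n + 1):
        if binom.sf(r - 1, n, p_null) <= alpha:
            return r
    return None

# Usage: single-stage illustration with 40 patients, p0 = 0.2 and p1 = 0.4
n, p0, p1 = 40, 0.2, 0.4
r = min_responses_to_reject(n, p0)
power = binom.sf(r - 1, n, p1)
print(f"reject H0: p <= {p0} if responses >= {r}; power at p = {p1}: {power:.3f}")
```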
Farkas, L
1981-09-01
This study sought to identify adaptation problems on the part of elderly persons and significant others that were associated with nursing home applications for elderly persons. Within the framework of the Roy Adaptation Nursing Model, an ex post facto research design was utilized to identify these adaptation problems. The four adaptive modes in this model are: physiological, self-concept, role function, and interdependence. The study group (n = 22) and a control group of elderly persons living in Calgary, as well as their significant others, were administered a structured questionnaire. Five hypotheses relating to overall adaptation problems, powerlessness, role reversal, guilt, and knowledge and utilization of services were formulated and tested. Only the hypothesis indicating role reversal on the part of the significant others was accepted. Adaptation problems encountered by the elderly which were associated with nursing home applications occurred in the self-concept and interdependence adaptive modes. The adaptation problems perceived by the significant others, which were associated with nursing home applications for their elderly, occurred in the self-concept, role function, and interdependence modes. Adaptation problems from the significant others' perception rather than from the elderly persons' perception appear to be more significantly associated with nursing home applications.
An Extension of RSS-based Model Comparison Tests for Weighted Least Squares
2012-08-22
use the model comparison test statistic to analyze the null hypothesis. Under the null hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS^H) = 10.3040 × 10^6. Under the alternative hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS) = 8.8394 × 10^6. Thus the model
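For orientation, a weighted least squares cost functional of the kind compared above can be computed directly. The sketch below uses simulated residuals and inverse-variance weights rather than the report's data, and only illustrates the quantity from which an RSS-based model comparison statistic is built.

```python
import numpy as np

def wls_cost(residuals, weights):
    """Weighted least squares cost functional J_WLS = sum_i w_i * r_i^2,
    evaluated at a candidate parameter estimate."""
    return float(np.sum(weights * residuals ** 2))

# Usage: compare the cost under a restricted (null) fit and an unrestricted fit;
# a model comparison statistic is built from the relative reduction in cost.
rng = np.random.default_rng(8)
w = 1.0 / rng.uniform(0.5, 2.0, size=200) ** 2          # inverse-variance weights
r_null = rng.normal(0.0, 1.2, size=200)                  # residuals of the constrained model
r_full = rng.normal(0.0, 1.0, size=200)                  # residuals of the full model
J_null, J_full = wls_cost(r_null, w), wls_cost(r_full, w)
print(J_null, J_full, (J_null - J_full) / J_full)        # RSS-based comparison ingredient
```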
Conduct problems, IQ, and household chaos: a longitudinal multi-informant study
Deater-Deckard, Kirby; Mullineaux, Paula Y.; Beekman, Charles; Petrill, Stephen A.; Schatschneider, Chris; Thompson, Lee A.
2010-01-01
Background We tested the hypothesis that household chaos would be associated with lower child IQ and more child conduct problems concurrently and longitudinally over two years while controlling for housing conditions, parent education/IQ, literacy environment, parental warmth/negativity, and stressful events. Methods The sample included 302 families with same-sex twins (58% female) in Kindergarten/1st grade at the first assessment. Parents’ and observers’ ratings were gathered, with some collected over a two-year period. Results Chaos varied widely. There was substantial mother–father agreement and longitudinal stability. Chaos covaried with poorer housing conditions, lower parental education/IQ, poorer home literacy environment, higher stress, higher negativity and lower warmth. Chaos statistically predicted lower IQ and more conduct problems, beyond the effects of other home environment factors. Conclusions Even with other home environment factors controlled, higher levels of chaos were linked concurrently with lower child IQ, and concurrently and longitudinally with more child conduct problems. Parent self-reported chaos represents an important aspect of housing and family functioning, with respect to children’s cognitive and behavioral functioning. PMID:19527431
ERIC Educational Resources Information Center
Wellenreuther, Martin
1997-01-01
Argues that the usefulness of strictly quantitative research is still questioned in educational studies, primarily due to deficiencies in methodological training. Uses a critique of a recent study by Heitmeyer et al. (1995) to illustrate the requirements of "good" empirical research. Considers the problems of hypothesis testing in field research.…
Verhulst, Brad
2016-01-01
P values have become the scapegoat for a wide variety of problems in science. P values are generally over-emphasized, often incorrectly applied, and in some cases even abused. However, alternative methods of hypothesis testing will likely fall victim to the same criticisms currently leveled at P values if more fundamental changes are not made in the research process. Increasing the general level of statistical literacy and enhancing training in statistical methods provide a potential avenue for identifying, correcting, and preventing erroneous conclusions from entering the academic literature and for improving the general quality of patient care. PMID:28366961
Field Theory in Cultural Capital Studies of Educational Attainment
ERIC Educational Resources Information Center
Krarup, Troels; Munk, Martin D.
2016-01-01
This article argues that there is a double problem in international research in cultural capital and educational attainment: an empirical problem, since few new insights have been gained within recent years; and a theoretical problem, since cultural capital is seen as a simple hypothesis about certain isolated individual resources, disregarding…
Determining linear vibration frequencies of a ferromagnetic shell
NASA Astrophysics Data System (ADS)
Bagdoev, A. G.; Vardanyan, A. V.; Vardanyan, S. V.; Kukudzhanov, V. N.
2007-10-01
The problems of determining the roots of dispersion equations for free bending vibrations of thin magnetoelastic plates and shells are of both theoretical and practical interest, in particular, in studying vibrations of metallic structures used in controlled thermonuclear reactors. These problems were solved on the basis of the Kirchhoff hypothesis in [1-5]. In [6], an exact spatial approach to determining the vibration frequencies of thin plates was suggested, and it was shown that it completely agrees with the solution obtained according to the Kirchhoff hypothesis. In [7-9], this exact approach was used to solve the problem on vibrations of thin magnetoelastic plates, and it was shown by cumbersome calculations that the solutions obtained according to the exact theory and the Kirchhoff hypothesis differ substantially except in a single case. In [10], the equations of the dynamic theory of elasticity in the axisymmetric problem are given. In [11], the equations for the vibration frequencies of thin ferromagnetic plates with arbitrary conductivity were obtained in the exact statement. In [12], the Kirchhoff hypothesis was used to obtain dispersion relations for a magnetoelastic thin shell. In [5, 13-16], the relations for the Maxwell tensor and the ponderomotive force for magnetics were presented. In [17], the dispersion relations for thin ferromagnetic plates in the transverse field in the spatial statement were studied analytically and numerically. In the present paper, on the basis of the exact approach, we study free bending vibrations of a thin ferromagnetic cylindrical shell. We obtain the exact dispersion equation in the form of a sixth-order determinant, which can be solved numerically in the case of a magnetoelastic thin shell. The numerical results are presented in tables and compared with the results obtained by the Kirchhoff hypothesis. We show a large number of differences in the results, even for the least frequency.
Sex ratios in the two Germanies: a test of the economic stress hypothesis.
Catalano, Ralph A
2003-09-01
Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.
Interactions between child and parent temperament and child behavior problems.
Rettew, David C; Stanger, Catherine; McKee, Laura; Doyle, Alicia; Hudziak, James J
2006-01-01
Few studies of temperament have tested goodness-of-fit theories of child behavior problems. In this study, we test the hypothesis that interactions between child and parent temperament dimensions predict levels of child psychopathology after controlling for the effects of these dimensions individually. Temperament and psychopathology were assessed in a total of 175 children (97 boys, 78 girls; mean age, 10.99 years; SD, 3.66 years) using composite scores from multiple informants of the Junior Temperament and Character Inventory and the Achenbach System of Empirically Based Assessment. Parent temperament was assessed using the adult version of the Temperament and Character Inventory. Statistical analyses included multiple regression procedures to assess the contribution of child-parent temperament interactions after controlling for demographic variables, other types of child psychopathology, and the individual Temperament and Character Inventory and Junior Temperament and Character Inventory dimensions. Interactions between child and parent temperament dimensions predicted higher levels of externalizing, internalizing, and attention problems over and above the effects of these dimensions alone. Among others, the combination of high child novelty seeking with high maternal novelty seeking was associated with child attention problems, whereas the combination of high child harm avoidance and high father harm avoidance was associated with increased child internalizing problems. Many child temperament dimensions also exerted significant effects independently. The association between a child temperament trait and psychopathology can be dependent upon the temperament of parents. These data lend support to previous theories of the importance of goodness-of-fit.
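The moderation analysis described above amounts to hierarchical regression with product terms. The sketch below shows that step with statsmodels; the variable names (child_ns, mother_ns) and the simulated data are hypothetical stand-ins for the JTCI/TCI composite scores and covariates.

```python
# Hypothetical variable names standing in for JTCI/TCI composites and covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 175
df = pd.DataFrame({
    "child_ns": rng.normal(size=n),                 # child novelty seeking
    "mother_ns": rng.normal(size=n),                # maternal novelty seeking
    "age": rng.normal(11, 3.7, size=n),
    "sex": rng.integers(0, 2, size=n),
})
# simulated outcome containing a genuine child x mother interaction, for illustration
df["attention_problems"] = (0.3 * df.child_ns + 0.2 * df.mother_ns
                            + 0.4 * df.child_ns * df.mother_ns + rng.normal(size=n))

# main-effects-only model versus a model adding the interaction term
m0 = smf.ols("attention_problems ~ age + sex + child_ns + mother_ns", data=df).fit()
m1 = smf.ols("attention_problems ~ age + sex + child_ns * mother_ns", data=df).fit()
print("interaction beta:", m1.params["child_ns:mother_ns"],
      "p =", m1.pvalues["child_ns:mother_ns"])
print("R^2 gain from the interaction term:", m1.rsquared - m0.rsquared)
```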
Understanding suicide terrorism: premature dismissal of the religious-belief hypothesis.
Liddle, James R; Machluf, Karin; Shackelford, Todd K
2010-07-06
We comment on work by Ginges, Hansen, and Norenzayan (2009), in which they compare two hypotheses for predicting individual support for suicide terrorism: the religious-belief hypothesis and the coalitional-commitment hypothesis. Although we appreciate the evidence provided in support of the coalitional-commitment hypothesis, we argue that their method of testing the religious-belief hypothesis is conceptually flawed, thus calling into question their conclusion that the religious-belief hypothesis has been disconfirmed. In addition to critiquing the methodology implemented by Ginges et al., we provide suggestions on how the religious-belief hypothesis may be properly tested. It is possible that the premature and unwarranted conclusions reached by Ginges et al. may deter researchers from examining the effect of specific religious beliefs on support for terrorism, and we hope that our comments can mitigate this possibility.
Vision and academic performance of learning disabled children.
Wharry, R E; Kirkpatrick, S W
1986-02-01
The purpose of this study was to assess differences in academic performance among myopic, hyperopic, and emmetropic children who were learning disabled. More specifically, it was expected that myopic children would perform better on mathematical and spatial tasks than hyperopic ones and that hyperopic and emmetropic children would perform better on verbal measures than myopic ones. For 439 learning disabled students, visual anomalies were determined via a Generated Retinal Reflex Image Screening System. Test data were obtained from school files. Partial support for the hypothesis was obtained. Myopic learning disabled children outperformed hyperopic and emmetropic children on the Key Math test. Myopic children scored better than hyperopic children on the WRAT Reading subtest and on the Durrell Analysis of Reading Difficulty Oral Reading Comprehension, Oral Rate, Flashword, and Spelling subtests, and on the Key Math Measurement and Total Scores. Severity of refractive error significantly affected the Wechsler Intelligence Scale for Children--Revised Full Scale, Performance Scale, Verbal Scale, and Digit Span scores but did not affect any academic test scores. Several other findings were also reported. Those with nonametropic problems scored higher than those without problems on the Key Math Time subtest. Implications supportive of the theories of Benbow and Benbow and of Geschwind and Behan were stated.
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled resampling method in a nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method corresponding to parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test while maintaining type I error probability for all conditions except for Cauchy and extreme variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test also provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
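A minimal sketch of a two-sample bootstrap test with pooled resampling, in the spirit of the method described above (a generic illustration, not the authors' exact algorithm): under the null hypothesis both groups share one distribution, so bootstrap samples of sizes n1 and n2 are drawn from the pooled data to build the null reference distribution of the t statistic.

```python
import numpy as np
from scipy import stats

def pooled_bootstrap_t_test(x, y, n_boot=10000, seed=0):
    """Two-sided bootstrap p-value for a difference in means, resampling from the pooled data."""
    rng = np.random.default_rng(seed)
    t_obs = stats.ttest_ind(x, y, equal_var=False).statistic
    pooled = np.concatenate([x, y])
    t_star = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(pooled, size=len(x), replace=True)
        yb = rng.choice(pooled, size=len(y), replace=True)
        t_star[b] = stats.ttest_ind(xb, yb, equal_var=False).statistic
    return t_obs, np.mean(np.abs(t_star) >= abs(t_obs))

rng = np.random.default_rng(42)
x = rng.lognormal(mean=0.0, sigma=1.0, size=8)    # small, skewed samples
y = rng.lognormal(mean=0.5, sigma=1.0, size=8)
t_obs, p = pooled_bootstrap_t_test(x, y)
print(f"t = {t_obs:.2f}, pooled bootstrap p = {p:.3f}")
```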
Validation of the thermal challenge problem using Bayesian Belief Networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McFarland, John; Swiler, Laura Painton
The thermal challenge problem has been developed at Sandia National Laboratories as a testbed for demonstrating various types of validation approaches and prediction methods. This report discusses one particular methodology to assess the validity of a computational model given experimental data. This methodology is based on Bayesian Belief Networks (BBNs) and can incorporate uncertainty in experimental measurements, in physical quantities, and in the model itself. The approach uses the prior and posterior distributions of model output to compute a validation metric based on Bayesian hypothesis testing (a Bayes' factor). This report discusses various aspects of the BBN, specifically in the context of the thermal challenge problem. A BBN is developed for a given set of experimental data in a particular experimental configuration. The development of the BBN and the method for "solving" the BBN to develop the posterior distribution of model output through Markov chain Monte Carlo sampling is discussed in detail. The use of the BBN to compute a Bayes' factor is demonstrated.
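A generic illustration of the Bayes-factor validation metric mentioned above, not the report's BBN formulation: each model's evidence p(data | M) is approximated by averaging the measurement likelihood over samples of the model's predicted output, and the Bayes factor is the ratio of the two evidence estimates. The data values, measurement uncertainty, and predictive distributions below are placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = np.array([1.02, 0.97, 1.10])     # hypothetical experimental measurements
sigma_meas = 0.05                       # assumed measurement uncertainty

# placeholder draws of the predicted output under two competing models
pred_m1 = rng.normal(1.00, 0.04, size=20000)
pred_m2 = rng.normal(1.20, 0.04, size=20000)

def evidence(pred_samples):
    """Monte Carlo estimate of p(data | M) = E_pred[prod_i N(y_i | pred, sigma_meas)]."""
    lik = np.ones_like(pred_samples)
    for y in data:
        lik *= stats.norm.pdf(y, loc=pred_samples, scale=sigma_meas)
    return lik.mean()

bayes_factor = evidence(pred_m1) / evidence(pred_m2)
print(f"Bayes factor in favour of model 1: {bayes_factor:.2f}")
```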
Gudiño, Omar G.; Nadeem, Erum; Kataoka, Sheryl H.; Lau, Anna S.
2013-01-01
Urban Latino youth are exposed to high rates of violence, which increases risk for diverse forms of psychopathology. The current study aims to increase specificity in predicting responses by testing the hypothesis that youths’ reinforcement sensitivity–behavioral inhibition (BIS) and behavioral approach (BAS)–is associated with specific clinical outcomes and increases risk for the development of such problems following exposure to violence. Utilizing a short-term longitudinal design, Latino youth (N=168) provided reports of BIS/BAS and emotional/behavioral problems at Time 1, exposure to violence between Time 1 and Time 2, and clinical symptoms at Time 2. Results suggested that reinforcement sensitivity moderated the relation between violence exposure and psychopathology, such that increasing levels of BIS were associated with elevated risk for internalizing and posttraumatic stress symptoms following exposure to violence, whereas BAS increased risk for externalizing problems. The importance of building on existing knowledge to understand minority youth psychopathology is discussed. PMID:22080366
Feldman, Anatol G; Latash, Mark L
2005-02-01
Criticisms of the equilibrium point (EP) hypothesis have recently appeared that are based on misunderstandings of some of its central notions. Starting from such interpretations of the hypothesis, incorrect predictions are made and tested. When the incorrect predictions prove false, the hypothesis is claimed to be falsified. In particular, the hypothesis has been rejected based on the wrong assumptions that it conflicts with empirically defined joint stiffness values or that it is incompatible with violations of equifinality under certain velocity-dependent perturbations. Typically, such attempts use notions describing the control of movements of artificial systems in place of physiologically relevant ones. While appreciating constructive criticisms of the EP hypothesis, we feel that incorrect interpretations have to be clarified by reiterating what the EP hypothesis does and does not predict. We conclude that the recent claims of falsifying the EP hypothesis and the calls for its replacement by EMG-force control hypothesis are unsubstantiated. The EP hypothesis goes far beyond the EMG-force control view. In particular, the former offers a resolution for the famous posture-movement paradox while the latter fails to resolve it.
Better without (lateral) frontal cortex? Insight problems solved by frontal patients.
Reverberi, Carlo; Toraldo, Alessio; D'Agostini, Serena; Skrap, Miran
2005-12-01
A recently proposed theory on frontal lobe functions claims that the prefrontal cortex, particularly its dorso-lateral aspect, is crucial in defining a set of responses suitable for a particular task, and biasing these for selection. This activity is carried out for virtually any kind of non-routine tasks, without distinction of content. The aim of this study is to test the prediction of Frith's 'sculpting the response space' hypothesis by means of an 'insight' problem-solving task, namely the matchstick arithmetic task. Starting from Knoblich et al.'s interpretation for the failure of healthy controls to solve the matchstick problem, and Frith's theory on the role of dorsolateral frontal cortex, we derived the counterintuitive prediction that patients with focal damage to the lateral frontal cortex should perform better than a group of healthy participants on this rather difficult task. We administered the matchstick task to 35 patients (aged 26-65 years) with a single focal brain lesion as determined by a CT or an MRI scan, and to 23 healthy participants (aged 34-62 years). The findings seemed in line with theoretical predictions. While only 43% of healthy participants could solve the most difficult matchstick problems ('type C'), 82% of lateral frontal patients did so (Fisher's exact test, P < 0.05). In conclusion, the combination of Frith's and Knoblich et al.'s theories was corroborated.
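The group comparison reported above can be reproduced in form with Fisher's exact test; the 2x2 counts below are hypothetical reconstructions chosen only to be consistent with the reported percentages (roughly 82% vs. 43% solvers), since the exact subgroup sizes are not stated in the abstract.

```python
from scipy.stats import fisher_exact

#                  solved  not solved
table = [[14, 3],          # lateral frontal patients (hypothetical counts, ~82% solvers)
         [10, 13]]         # healthy controls (hypothetical counts, ~43% solvers)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```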
Application of basic science to clinical problems: traditional vs. hybrid problem-based learning.
Callis, Amber N; McCann, Ann L; Schneiderman, Emet D; Babler, William J; Lacy, Ernestine S; Hale, David Sidney
2010-10-01
It is widely acknowledged that clinical problem-solving is a key skill for dental practitioners. The aim of this study was to determine if students in a hybrid problem-based learning curriculum (h-PBL) were better at integrating basic science knowledge with clinical cases than students in a traditional, lecture-based curriculum (TC). The performance of TC students (n=40) was compared to that of h-PBL students (n=31). Participants read two clinical scenarios and answered a series of questions regarding each. To control for differences in ability, Dental Admission Test (DAT) Academic Average scores and predental grade point averages (GPAs) were compared, and an ANCOVA was used to adjust for the significant differences in DAT (t-test, p=0.002). Results showed that h-PBL students were better at applying basic science knowledge to a clinical case (ANCOVA, p=0.022) based on overall scores on one case. TC students' overall scores were better than h-PBL students on a separate case; however, it was not statistically significant (p=0.107). The h-PBL students also demonstrated greater skills in the areas of hypothesis generation (Mann-Whitney U, p=0.016) and communication (p=0.006). Basic science comprehension (p=0.01) and neurology (p<0.001) were two areas in which the TC students did score significantly higher than h-PBL students.
Case-based statistical learning applied to SPECT image classification
NASA Astrophysics Data System (ADS)
Górriz, Juan M.; Ramírez, Javier; Illán, I. A.; Martínez-Murcia, Francisco J.; Segovia, Fermín.; Salas-Gonzalez, Diego; Ortiz, A.
2017-03-01
Statistical learning and decision theory play a key role in many areas of science and engineering. Some examples include time series regression and prediction, optical character recognition, signal detection in communications or biomedical applications for diagnosis and prognosis. This paper deals with the topic of learning from biomedical image data in the classification problem. In a typical scenario we have a training set that is employed to fit a prediction model or learner and a testing set to which the learner is applied in order to predict the outcome for new unseen patterns. Both processes are usually completely separated to avoid over-fitting and due to the fact that, in practice, the unseen new objects (testing set) have unknown outcomes. However, the outcome yields one of a discrete set of values, i.e. the binary diagnosis problem. Thus, assumptions on these outcome values could be established to obtain the most likely prediction model at the training stage, which could improve the overall classification accuracy on the testing set, or keep its performance at least at the level of the selected statistical classifier. In this sense, a novel case-based learning (c-learning) procedure is proposed which combines hypothesis testing from a discrete set of expected outcomes and a cross-validated classification stage.
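A minimal sketch of the train/test setup the abstract describes, using a cross-validated classifier on stand-in features; the case-based (c-learning) step that additionally tests hypotheses about the discrete outcome of each unseen case is not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# stand-in for SPECT-derived feature vectors with a binary diagnosis label
X, y = make_classification(n_samples=120, n_features=50, n_informative=10, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print("cross-validated accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```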
Current hypotheses for the evolution of sex and recombination.
Hartfield, Matthew; Keightley, Peter D
2012-06-01
The evolution of sex is one of the most important and controversial problems in evolutionary biology. Although sex is almost universal in higher animals and plants, its inherent costs have made its maintenance difficult to explain. The most famous of these is the twofold cost of males, which can greatly reduce the fecundity of a sexual population, compared to a population of asexual females. Over the past century, multiple hypotheses, along with experimental evidence to support these, have been put forward to explain widespread costly sex. In this review, we outline some of the most prominent theories, along with the experimental and observational evidence supporting these. Historically, there have been 4 classes of theories: the ability of sex to fix multiple novel advantageous mutants (Fisher-Muller hypothesis); sex as a mechanism to stop the build-up of deleterious mutations in finite populations (Muller's ratchet); recombination creating novel genotypes that can resist infection by parasites (Red Queen hypothesis); and the ability of sex to purge bad genomes if deleterious mutations act synergistically (mutational deterministic hypothesis). Current theoretical and experimental evidence seems to favor the hypothesis that sex breaks down selection interference between new mutants, or it acts as a mechanism to shuffle genotypes in order to repel parasitic invasion. However, there is still a need to collect more data from natural populations and experimental studies, which can be used to test different hypotheses. © 2012 ISZS, Blackwell Publishing and IOZ/CAS.
Action perception as hypothesis testing.
Donnarumma, Francesco; Costantini, Marcello; Ambrosini, Ettore; Friston, Karl; Pezzulo, Giovanni
2017-04-01
We present a novel computational model that describes action perception as an active inferential process that combines motor prediction (the reuse of our own motor system to predict perceived movements) and hypothesis testing (the use of eye movements to disambiguate amongst hypotheses). The system uses a generative model of how (arm and hand) actions are performed to generate hypothesis-specific visual predictions, and directs saccades to the most informative places of the visual scene to test these predictions - and underlying hypotheses. We test the model using eye movement data from a human action observation study. In both the human study and our model, saccades are proactive whenever context affords accurate action prediction; but uncertainty induces a more reactive gaze strategy, via tracking the observed movements. Our model offers a novel perspective on action observation that highlights its active nature based on prediction dynamics and hypothesis testing. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Religion as a means to assure paternity.
Strassmann, Beverly I; Kurapati, Nikhil T; Hug, Brendan F; Burke, Erin E; Gillespie, Brenda W; Karafet, Tatiana M; Hammer, Michael F
2012-06-19
The sacred texts of five world religions (Buddhism, Christianity, Hinduism, Islam, and Judaism) use similar belief systems to set limits on sexual behavior. We propose that this similarity is a shared cultural solution to a biological problem: namely male uncertainty over the paternity of offspring. Furthermore, we propose the hypothesis that religious practices that more strongly regulate female sexuality should be more successful at promoting paternity certainty. Using genetic data on 1,706 father-son pairs, we tested this hypothesis in a traditional African population in which multiple religions (Islam, Christianity, and indigenous) coexist in the same families and villages. We show that the indigenous religion enables males to achieve a significantly (P = 0.019) lower probability of cuckoldry (1.3% versus 2.9%) by enforcing the honest signaling of menstruation, but that all three religions share tenets aimed at the avoidance of extrapair copulation. Our findings provide evidence for high paternity certainty in a traditional African population, and they shed light on the reproductive agendas that underlie religious patriarchy.
Taussig, Heather N.; Culhane, Sara E.; Garrido, Edward; Knudtson, Michael D.; Petrenko, Christie L. M.
2015-01-01
Physically neglected youth are at increased risk for mental health problems, but there are few interventions that have demonstrated efficacy in reducing mental health symptoms for this vulnerable population. The Fostering Healthy Futures (FHF) program, which consists of mentoring and skills groups, was developed for preadolescent youth in foster care. In a published randomized controlled trial with 156 youth, FHF demonstrated positive impacts on mental health functioning. The current study sought to determine whether FHF might be particularly effective in ameliorating the impact of neglectful family environments. Because it was not possible to isolate a neglected-only subgroup, as most children with physical neglect histories had experienced other types of maltreatment, we tested the hypothesis that intervention effects would be stronger among children with more severe physical neglect. Findings did not support this hypothesis, however, as severity of physical neglect did not significantly moderate the impact of the intervention on psychosocial outcomes PMID:23076837
Linking Cultural Competence to Functional Life Outcomes in Mental Health Care Settings.
Michalopoulou, Georgia; Falzarano, Pamela; Butkus, Michael; Zeman, Lori; Vershave, Judy; Arfken, Cynthia
2014-01-01
Minorities in the United States have well-documented health disparities. Cultural barriers and biases by health care providers may contribute to lower quality of services which may contribute to these disparities. However, evidence linking cultural competency and health outcomes is lacking. This study, part of an ongoing quality improvement effort, tested the mediation hypothesis that patients' perception of provider cultural competency indirectly influences patients' health outcomes through process of care. Data were from patient satisfaction surveys collected in seven mental health clinics (n=94 minority patients). Consistent with our hypothesis, patients' perception of clinicians' cultural competency was indirectly associated with patients' self-reported improvements in social interactions, improvements in performance at work or school, and improvements in managing life problems through the patients' experience of respect, trust, and communication with the clinician. These findings indicate that process of care characteristics during the clinical encounter influence patients' perceptions of clinicians' cultural competency and affect functional outcomes. © 2013 National Medical Association. Published by Elsevier Inc. All rights reserved.
Mechanical energy flow models of rods and beams
NASA Technical Reports Server (NTRS)
Wohlever, J. C.; Bernhard, R. J.
1992-01-01
It has been proposed that the flow of mechanical energy through a structural/acoustic system may be modeled in a manner similar to the flow of thermal energy in a heat conduction problem. If this hypothesis is true, it would result in relatively efficient numerical models of structure-borne energy in large built-up structures. Fewer parameters are required to approximate the energy solution than are required to model the characteristic wave behavior of structural vibration by using traditional displacement formulations. The energy flow hypothesis is tested in this investigation for both longitudinal vibration in rods and transverse flexural vibrations of beams. The rod is shown to behave approximately according to the thermal energy flow analogy. However, the beam solutions behave significantly differently than predicted by the thermal analogy unless locally-space-averaged energy and power are considered. Several techniques for coupling dissimilar rods and beams are also discussed. Illustrations of the solution accuracy of the methods are included.
NASA Astrophysics Data System (ADS)
Noble, Clifford Elliott, II
2002-09-01
The problem. The purpose of this study was to investigate the ability of three single-task instruments---(a) the Test of English as a Foreign Language, (b) the Aviation Test of Spoken English, and (c) the Single Manual-Tracking Test---and three dual-task instruments---(a) the Concurrent Manual-Tracking and Communication Test, (b) the Certified Flight Instructor's Test, and (c) the Simulation-Based English Test---to predict the language performance of 10 Chinese student pilots speaking English as a second language when operating single-engine and multiengine aircraft within American airspace. Method. This research implemented a correlational design to investigate the ability of the six described instruments to predict the mean score of the criterion evaluation, which was the Examiner's Test. This test assessed the oral communication skill of student pilots on the flight portion of the terminal checkride in the Piper Cadet, Piper Seminole, and Beechcraft King Air airplanes. Results. Data from the Single Manual-Tracking Test, as well as the Concurrent Manual-Tracking and Communication Test, were discarded due to performance ceiling effects. Hypothesis 1, which stated that the average correlation between the mean scores of the dual-task evaluations and that of the Examiner's Test would predict the mean score of the criterion evaluation with a greater degree of accuracy than that of single-task evaluations, was not supported. Hypothesis 2, which stated that the correlation between the mean scores of the participants on the Simulation-Based English Test and the Examiner's Test would predict the mean score of the criterion evaluation with a greater degree of accuracy than that of all single- and dual-task evaluations, was also not supported. The findings suggest that single- and dual-task assessments administered after initial flight training are equivalent predictors of language performance when piloting single-engine and multiengine aircraft.
Confidence intervals for single-case effect size measures based on randomization test inversion.
Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick
2017-02-01
In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100 (1 - α) % two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
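A sketch of randomization test inversion for the unstandardized mean difference in a completely randomized design: a candidate effect theta0 is retained in the confidence set whenever the randomization test of "difference equals theta0" is not rejected at level alpha. The scores and grid below are illustrative; the authors provide R code as supplementary material, and this is only a Python paraphrase of the general idea.

```python
import numpy as np
from itertools import combinations

def randomization_p(a, b, theta0):
    """Two-sided randomization p-value for H0: mean(A) - mean(B) = theta0."""
    a_shift = np.asarray(a, dtype=float) - theta0      # remove the hypothesized effect
    b = np.asarray(b, dtype=float)
    pooled = np.concatenate([a_shift, b])
    n_a = len(a_shift)
    obs = a_shift.mean() - b.mean()
    diffs = []
    for idx in combinations(range(len(pooled)), n_a):  # every reassignment of the A labels
        mask = np.zeros(len(pooled), dtype=bool)
        mask[list(idx)] = True
        diffs.append(pooled[mask].mean() - pooled[~mask].mean())
    return np.mean(np.abs(diffs) >= abs(obs))

def rti_confidence_interval(a, b, grid, alpha=0.05):
    """Confidence set: all theta0 values whose randomization test is not rejected."""
    keep = [t for t in grid if randomization_p(a, b, t) > alpha]
    return min(keep), max(keep)

treatment = [8.0, 9.5, 7.8, 9.1, 8.7]    # hypothetical scores under treatment
baseline = [5.2, 6.0, 5.5, 6.3, 5.8]     # hypothetical baseline scores
print(rti_confidence_interval(treatment, baseline, np.linspace(0, 7, 141)))
```

For longer series the full enumeration of assignments would be replaced by a random sample of permutations, as is usual for Monte Carlo randomization tests.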
ERIC Educational Resources Information Center
Besken, Miri
2016-01-01
The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…
Adolescents' Body Image Trajectories: A Further Test of the Self-Equilibrium Hypothesis
ERIC Educational Resources Information Center
Morin, Alexandre J. S.; Maïano, Christophe; Scalas, L. Francesca; Janosz, Michel; Litalien, David
2017-01-01
The self-equilibrium hypothesis underlines the importance of having a strong core self, which is defined as a high and developmentally stable self-concept. This study tested this hypothesis in relation to body image (BI) trajectories in a sample of 1,006 adolescents (M_age = 12.6, including 541 males and 465 females) across a 4-year…
ERIC Educational Resources Information Center
Lee, Jungmin
2016-01-01
This study tested the Bennett hypothesis by examining whether four-year colleges changed listed tuition and fees, the amount of institutional grants per student, and room and board charges after their states implemented statewide merit-based aid programs. According to the Bennett hypothesis, increases in government financial aid make it easier for…
Double-Deficit Hypothesis in a Clinical Sample: Extension beyond Reading
ERIC Educational Resources Information Center
Heikkilä, Riikka; Torppa, Minna; Aro, Mikko; Närhi, Vesa; Ahonen, Timo
2016-01-01
This study explored the double-deficit hypothesis (DDH) in a transparent orthography (Finnish) and extended the view from reading disabilities to comorbidity of learning-related problems in math and attention. Children referred for evaluation of learning disabilities in second through sixth grade (N = 205) were divided into four groups based on…
Human female orgasm as evolved signal: a test of two hypotheses.
Ellsworth, Ryan M; Bailey, Drew H
2013-11-01
We present the results of a study designed to empirically test predictions derived from two hypotheses regarding human female orgasm behavior as an evolved communicative trait or signal. One hypothesis tested was the female fidelity hypothesis, which posits that human female orgasm signals a woman's sexual satisfaction and therefore her likelihood of future fidelity to a partner. The other was sire choice hypothesis, which posits that women's orgasm behavior signals increased chances of fertilization. To test the two hypotheses of human female orgasm, we administered a questionnaire to 138 females and 121 males who reported that they were currently in a romantic relationship. Key predictions of the female fidelity hypothesis were not supported. In particular, orgasm was not associated with female sexual fidelity nor was orgasm associated with male perceptions of partner sexual fidelity. However, faked orgasm was associated with female sexual infidelity and lower male relationship satisfaction. Overall, results were in greater support of the sire choice signaling hypothesis than the female fidelity hypothesis. Results also suggest that male satisfaction with, investment in, and sexual fidelity to a mate are benefits that favored the selection of orgasmic signaling in ancestral females.
Luo, Liqun; Zhao, Wei; Weng, Tangmei
2016-01-01
The Trivers-Willard hypothesis predicts that high-status parents will bias their investment to sons, whereas low-status parents will bias their investment to daughters. Among humans, tests of this hypothesis have yielded mixed results. This study tests the hypothesis using data collected among contemporary peasants in Central South China. We use current family status (rated by our informants) and father's former class identity (assigned by the Chinese Communist Party in the early 1950s) as measures of parental status, and proportion of sons in offspring and offspring's years of education as measures of parental investment. Results show that (i) those families with a higher former class identity such as landlord and rich peasant tend to have a higher socioeconomic status currently, (ii) high-status parents are more likely to have sons than daughters among their biological offspring, and (iii) in higher-status families, the years of education obtained by sons exceed that obtained by daughters to a larger extent than in lower-status families. Thus, the first assumption and the two predictions of the hypothesis are supported by this study. This article contributes a contemporary Chinese case to the testing of the Trivers-Willard hypothesis.
Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.
Ji, Ming; Xiong, Chengjie; Grundman, Michael
2003-10-01
In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative by implementing our hypothesis testing method to analyze Mini Mental Status Exam scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that, despite a large amount of missing data, accelerated decline did occur for MMSE among AD patients. Our finding supports the clinical belief of the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
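A simplified sketch of the testing logic described above, applied to a single trajectory rather than the paper's random-effects model: the null model is a straight line, the alternative is a bilinear (broken-stick) model with the change point found by grid search, and the null distribution of the fit improvement is simulated by parametric bootstrap. The data are simulated MMSE-like scores.

```python
import numpy as np

def rss_linear(t, y):
    X = np.column_stack([np.ones_like(t), t])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.sum((y - X @ beta) ** 2)

def rss_bilinear(t, y):
    best = np.inf
    for tau in t[2:-2]:                                       # candidate change points
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - tau, 0.0)])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        best = min(best, np.sum((y - X @ beta) ** 2))
    return best

def change_point_test(t, y, n_boot=500, seed=0):
    """Parametric bootstrap p-value for the reduction in RSS achieved by the bilinear model."""
    rng = np.random.default_rng(seed)
    stat_obs = rss_linear(t, y) - rss_bilinear(t, y)
    X = np.column_stack([np.ones_like(t), t])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    sigma = np.sqrt(rss_linear(t, y) / (len(t) - 2))
    stats_null = np.empty(n_boot)
    for b in range(n_boot):                                   # simulate under the straight-line null
        yb = X @ beta + rng.normal(0, sigma, size=len(t))
        stats_null[b] = rss_linear(t, yb) - rss_bilinear(t, yb)
    return stat_obs, np.mean(stats_null >= stat_obs)

t = np.arange(10, dtype=float)                                # hypothetical visit times
y = 28 - 0.3 * t - 1.2 * np.maximum(t - 5, 0)                 # decline accelerating at t = 5
y = y + np.random.default_rng(1).normal(0, 0.5, t.size)
print(change_point_test(t, y))
```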
NASA Astrophysics Data System (ADS)
van Aalsvoort, Joke
In a previous article, the problem of chemistry's lack of relevance in secondary chemical education was analysed using logical positivism as a tool. This article starts with the hypothesis that the problem can be addressed by means of activity theory, one of the important theories within the sociocultural school. The reason for this expectation is that, while logical positivism creates a divide between science and society, activity theory offers a model of society in which science and society are related. With the use of this model, a new course for grade nine has been constructed. This results in a confirmation of the hypothesis, at least at a theoretical level. A comparison with the Salters' approach is made in order to demonstrate the relative merits of a mediated way of dealing with the problem of the lack of relevance of chemistry in chemical education.
NASA Astrophysics Data System (ADS)
Menne, Matthew J.; Williams, Claude N., Jr.
2005-10-01
An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite. In a comparison of single test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from the serially complete and homogeneous component series. However, the evaluated composite series are not all equally susceptible to the presence of changepoints in their components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on the averaging of the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated. A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.
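A generic illustration of the consensus idea, not the paper's exact procedure: two shift statistics (a maximal two-sample t and an SNHT-type statistic) are computed on a candidate difference series, their critical values are simulated under a no-changepoint null, and a break is flagged only when both tests reject.

```python
import numpy as np

def max_t_stat(x):
    """Largest absolute two-sample t statistic over all candidate split points."""
    n = len(x)
    best = 0.0
    for k in range(2, n - 1):
        a, b = x[:k], x[k:]
        s2 = ((k - 1) * a.var(ddof=1) + (n - k - 1) * b.var(ddof=1)) / (n - 2)
        best = max(best, abs(a.mean() - b.mean()) / np.sqrt(s2 * (1 / k + 1 / (n - k))))
    return best

def snht_stat(x):
    """Standard normal homogeneity test statistic on the standardized series."""
    z = (x - x.mean()) / x.std(ddof=1)
    n = len(z)
    return max(k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2 for k in range(1, n))

def critical_value(stat_fn, n, alpha=0.05, n_sim=2000, seed=0):
    rng = np.random.default_rng(seed)
    return np.quantile([stat_fn(rng.normal(size=n)) for _ in range(n_sim)], 1 - alpha)

rng = np.random.default_rng(3)
series = rng.normal(size=100)          # stand-in for a target-minus-reference difference series
series[60:] += 0.8                     # artificial undocumented step change

n = len(series)
rejects = [max_t_stat(series) > critical_value(max_t_stat, n),
           snht_stat(series) > critical_value(snht_stat, n, seed=1)]
print("Consensus changepoint detected:", all(rejects))
```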
Sworn testimony of the model evidence: Gaussian Mixture Importance (GAME) sampling
NASA Astrophysics Data System (ADS)
Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.
2017-07-01
What is the "best" model? The answer to this question lies in part in the eyes of the beholder, nevertheless a good model must blend rigorous theory with redeeming qualities such as parsimony and quality of fit. Model selection is used to make inferences, via weighted averaging, from a set of K candidate models, M_k, k = 1, …, K, and help identify which model is most supported by the observed data, Ỹ = (ỹ_1, …, ỹ_n). Here, we introduce a new and robust estimator of the model evidence, p(Ỹ | M_k), which acts as normalizing constant in the denominator of Bayes' theorem and provides a single quantitative measure of relative support for each hypothesis that integrates model accuracy, uncertainty, and complexity. However, p(Ỹ | M_k) is analytically intractable for most practical modeling problems. Our method, coined GAussian Mixture importancE (GAME) sampling, uses bridge sampling of a mixture distribution fitted to samples of the posterior model parameter distribution derived from MCMC simulation. We benchmark the accuracy and reliability of GAME sampling by application to a diverse set of multivariate target distributions (up to 100 dimensions) with known values of p(Ỹ | M_k) and to hypothesis testing using numerical modeling of the rainfall-runoff transformation of the Leaf River watershed in Mississippi, USA. These case studies demonstrate that GAME sampling provides robust and unbiased estimates of the evidence at a relatively small computational cost, outperforming commonly used estimators. The GAME sampler is implemented in the MATLAB package of DREAM and simplifies considerably scientific inquiry through hypothesis testing and model selection.
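A sketch in the spirit of GAME sampling, using plain importance sampling rather than the bridge-sampling estimator of the paper: a Gaussian mixture is fitted to posterior samples and used as the importance distribution for the evidence integral. The toy model is a normal mean with a conjugate normal prior, so the exact evidence is available for comparison.

```python
import numpy as np
from scipy import stats
from scipy.special import logsumexp
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=20)          # data with known unit variance
mu0, tau0 = 0.0, 2.0                       # normal prior on the unknown mean

# conjugate posterior, standing in for MCMC output
tau_n = 1.0 / np.sqrt(1.0 / tau0**2 + y.size)
mu_n = tau_n**2 * (mu0 / tau0**2 + y.sum())
post_samples = rng.normal(mu_n, tau_n, size=5000)

# fit a Gaussian mixture to the posterior samples and use it as importance distribution
gm = GaussianMixture(n_components=2, random_state=0).fit(post_samples.reshape(-1, 1))
theta = gm.sample(20000)[0].ravel()
log_q = gm.score_samples(theta.reshape(-1, 1))
log_prior = stats.norm.logpdf(theta, mu0, tau0)
log_lik = stats.norm.logpdf(y[None, :], loc=theta[:, None], scale=1.0).sum(axis=1)

log_evidence_hat = logsumexp(log_prior + log_lik - log_q) - np.log(theta.size)

# exact log evidence for this conjugate toy problem, for comparison
log_evidence_exact = (-0.5 * y.size * np.log(2 * np.pi) + np.log(tau_n / tau0)
                      - 0.5 * (np.sum(y**2) + mu0**2 / tau0**2 - mu_n**2 / tau_n**2))
print(log_evidence_hat, log_evidence_exact)
```

The variance of this kind of estimator depends on how well the mixture covers the integrand, which is the motivation for fitting it to the posterior samples in the first place.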
Solving the Dark Matter Problem
Baltz, Ted
2018-05-11
Cosmological observations have firmly established that the majority of matter in the universe is of an unknown type, called 'dark matter'. A compelling hypothesis is that the dark matter consists of weakly interacting massive particles (WIMPs) in the mass range around 100 GeV. If the WIMP hypothesis is correct, such particles could be created and studied at accelerators. Furthermore they could be directly detected as the primary component of our galaxy. Solving the dark matter problem requires that the connection be made between the two. We describe some theoretical and experimental avenues that might lead to this connection.
Order-restricted inference for means with missing values.
Wang, Heng; Zhong, Ping-Shou
2017-09-01
Missing values appear very often in many applications, but the problem of missing values has not received much attention in testing order-restricted alternatives. Under the missing at random (MAR) assumption, we impute the missing values nonparametrically using kernel regression. For data with imputation, the classical likelihood ratio test designed for testing the order-restricted means is no longer applicable since the likelihood does not exist. This article proposes a novel method for constructing test statistics for assessing means with an increasing order or a decreasing order based on jackknife empirical likelihood (JEL) ratio. It is shown that the JEL ratio statistic evaluated under the null hypothesis converges to a chi-bar-square distribution, whose weights depend on missing probabilities and nonparametric imputation. Simulation study shows that the proposed test performs well under various missing scenarios and is robust for normally and nonnormally distributed data. The proposed method is applied to an Alzheimer's disease neuroimaging initiative data set for finding a biomarker for the diagnosis of the Alzheimer's disease. © 2017, The International Biometric Society.
López-Valcárcel, Beatriz G; González-Martel, Christian; Peiro, Salvador
2018-01-01
Objective: Newcomb-Benford’s Law (NBL) proposes a regular distribution for first digits, second digits and digit combinations applicable to many different naturally occurring sources of data. Testing deviations from NBL is used in many datasets as a screening tool for identifying data trustworthiness problems. This study aims to compare publicly available waiting list (WL) data from Finland and Spain for testing NBL as an instrument to flag up potential manipulation in WLs. Design: Analysis of the frequency of first digits in Finnish and Spanish WLs to determine if their distribution is similar to the pattern documented by NBL. Deviations from the expected first digit frequency were analysed using Pearson’s χ2, mean absolute deviation and Kuiper tests. Setting/participants: Publicly available WL data from Finland and Spain, two countries with universal health insurance and National Health Systems but characterised by different levels of transparency and good governance standards. Main outcome measures: Adjustment of the observed distribution of the numbers reported in Finnish and Spanish WL data to the expected distribution according to NBL. Results: WL data reported by the Finnish health system fits first-digit NBL according to all statistical tests used (p=0.6519 in χ2 test). For Spanish data, this hypothesis was rejected in all tests (p<0.0001 in χ2 test). Conclusions: Testing deviations from the NBL distribution can be a useful tool to identify problems with WL data trustworthiness and signal the need for further testing. PMID:29743333
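The first-digit screening described above can be sketched as follows: tabulate the leading digits of the reported counts, compare them with the Newcomb-Benford proportions log10(1 + 1/d), and apply a chi-square goodness-of-fit test. The input numbers below are simulated placeholders for the published waiting-list figures.

```python
import numpy as np
from scipy.stats import chisquare

def first_digits(values):
    return np.array([int(str(abs(v)).lstrip("0.")[0]) for v in values if float(v) != 0])

benford_p = np.log10(1 + 1 / np.arange(1, 10))      # P(first digit = d), d = 1..9

waiting_lists = np.random.default_rng(0).lognormal(6, 1.5, size=500).astype(int) + 1
digits = first_digits(waiting_lists)
observed = np.array([(digits == d).sum() for d in range(1, 10)])
expected = benford_p * observed.sum()

chi2, p = chisquare(observed, f_exp=expected)
print(f"chi-square = {chi2:.1f}, p = {p:.4f}")
```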
Feature theory and the two-step hypothesis of Müllerian mimicry evolution.
Balogh, Alexandra Catherine Victoria; Gamberale-Stille, Gabriella; Tullberg, Birgitta Sillén; Leimar, Olof
2010-03-01
The two-step hypothesis of Müllerian mimicry evolution states that mimicry starts with a major mutational leap between adaptive peaks, followed by gradual fine-tuning. The hypothesis was suggested to solve the problem of apostatic selection producing a valley between adaptive peaks, and appears reasonable for a one-dimensional phenotype. Extending the hypothesis to the realistic scenario of multidimensional phenotypes controlled by multiple genetic loci can be problematic, because it is unlikely that major mutational leaps occur simultaneously in several traits. Here we consider the implications of predator psychology on the evolutionary process. According to feature theory, single prey traits may be used by predators as features to classify prey into discrete categories. A mutational leap in such a trait could initiate mimicry evolution. We conducted individual-based evolutionary simulations in which virtual predators both categorize prey according to features and generalize over total appearances. We found that an initial mutational leap toward feature similarity in one dimension facilitates mimicry evolution of multidimensional traits. We suggest that feature-based predator categorization together with predator generalization over total appearances solves the problem of applying the two-step hypothesis to complex phenotypes, and provides a basis for a theory of the evolution of mimicry rings.
Multiple-hypothesis multiple-model line tracking
NASA Astrophysics Data System (ADS)
Pace, Donald W.; Owen, Mark W.; Cox, Henry
2000-07-01
Passive sonar signal processing generally includes tracking of narrowband and/or broadband signature components observed on a Lofargram or on a Bearing-Time-Record (BTR) display. Fielded line tracking approaches to date have been recursive and single-hypothesis-oriented Kalman- or alpha-beta filters, with no mechanism for considering tracking alternatives beyond the most recent scan of measurements. While adaptivity is often built into the filter to handle changing track dynamics, these approaches are still extensions of single target tracking solutions to a multiple target tracking environment. This paper describes an application of multiple-hypothesis, multiple target tracking technology to the sonar line tracking problem. A Multiple Hypothesis Line Tracker (MHLT) is developed which retains the recursive minimum-mean-square-error tracking behavior of a Kalman Filter in a maximum-a-posteriori delayed-decision multiple hypothesis context. Multiple line track filter states are developed and maintained using the interacting multiple model (IMM) state representation. Further, the data association and assignment problem is enhanced by considering line attribute information (line bandwidth and SNR) in addition to beam/bearing and frequency fit. MHLT results on real sonar data are presented to demonstrate the benefits of the multiple hypothesis approach. The utility of the system in cluttered environments and particularly in crossing line situations is shown.
Bayesian Methods for Determining the Importance of Effects
USDA-ARS?s Scientific Manuscript database
Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from the Fisher Significance Test and Hypothesis Test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...
Testing for purchasing power parity in the long-run for ASEAN-5
NASA Astrophysics Data System (ADS)
Choji, Niri Martha; Sek, Siok Kun
2017-04-01
For more than a decade, there has been substantial interest in empirically testing the validity of the purchasing power parity (PPP) hypothesis. This paper tests for long-run relative PPP for a group of ASEAN-5 countries over the period 1996-2016 using monthly data. For this purpose, we used the Pedroni co-integration method to test the long-run hypothesis of purchasing power parity. We first tested for the stationarity of the variables and found that the variables are non-stationary at levels but stationary at first difference. Results of the Pedroni test rejected the null hypothesis of no co-integration, meaning that we have enough evidence to support PPP in the long-run for the ASEAN-5 countries over the period 1996-2016. In other words, the rejection of the null hypothesis implies a long-run relation between nominal exchange rates and relative prices.
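The stationarity step described above can be sketched with an Augmented Dickey-Fuller test at levels and at first differences; the Pedroni panel cointegration test itself is not part of statsmodels core and is not reproduced here. The two series below are simulated placeholders for log nominal exchange rates and relative prices.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
rel_price = np.cumsum(rng.normal(0, 0.01, 252))      # random walk, I(1) by construction
log_fx = rel_price + rng.normal(0, 0.02, 252)        # tied to rel_price plus noise

for name, series in [("log exchange rate", log_fx), ("relative price", rel_price)]:
    p_level = adfuller(series)[1]
    p_diff = adfuller(np.diff(series))[1]
    print(f"{name}: ADF p-value at level = {p_level:.3f}, at first difference = {p_diff:.3f}")
```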
UNIFORMLY MOST POWERFUL BAYESIAN TESTS
Johnson, Valen E.
2014-01-01
Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829
Parvin, Darius E; McDougle, Samuel D; Taylor, Jordan A; Ivry, Richard B
2018-05-09
Failures to obtain reward can occur from errors in action selection or action execution. Recently, we observed marked differences in choice behavior when the failure to obtain a reward was attributed to errors in action execution compared with errors in action selection (McDougle et al., 2016). Specifically, participants appeared to solve this credit assignment problem by discounting outcomes in which the absence of reward was attributed to errors in action execution. Building on recent evidence indicating relatively direct communication between the cerebellum and basal ganglia, we hypothesized that cerebellar-dependent sensory prediction errors (SPEs), a signal indicating execution failure, could attenuate value updating within a basal ganglia-dependent reinforcement learning system. Here we compared the SPE hypothesis to an alternative, "top-down" hypothesis in which changes in choice behavior reflect participants' sense of agency. In two experiments with male and female human participants, we manipulated the strength of SPEs, along with the participants' sense of agency in the second experiment. The results showed that, whereas the strength of SPE had no effect on choice behavior, participants were much more likely to discount the absence of rewards under conditions in which they believed the reward outcome depended on their ability to produce accurate movements. These results provide strong evidence that SPEs do not directly influence reinforcement learning. Instead, a participant's sense of agency appears to play a significant role in modulating choice behavior when unexpected outcomes can arise from errors in action execution. SIGNIFICANCE STATEMENT When learning from the outcome of actions, the brain faces a credit assignment problem: Failures of reward can be attributed to poor choice selection or poor action execution. Here, we test a specific hypothesis that execution errors are implicitly signaled by cerebellar-based sensory prediction errors. We evaluate this hypothesis and compare it with a more "top-down" hypothesis in which the modulation of choice behavior from execution errors reflects participants' sense of agency. We find that sensory prediction errors have no significant effect on reinforcement learning. Instead, instructions influencing participants' belief of causal outcomes appear to be the main factor influencing their choice behavior. Copyright © 2018 the authors 0270-6474/18/384521-10$15.00/0.
van Helmond, Noud; Steegers, Monique A.; Filippini-de Moor, Gertie P.; Vissers, Kris C.; Wilder-Smith, Oliver H.
2016-01-01
Background Persistent pain is a challenging clinical problem after breast cancer treatment. After surgery, inflammatory pain and nociceptive input from nerve injury induce central sensitization which may play a role in the genesis of persistent pain. Using quantitative sensory testing, we tested the hypothesis that adding COX-2 inhibition to standard treatment reduces hyperalgesia after breast cancer surgery. A secondary hypothesis was that patients developing persistent pain would exhibit more postoperative hyperalgesia. Methods 138 women scheduled for lumpectomy/mastectomy under general anesthesia with paravertebral block were randomized to COX-2 inhibition (2x40mg parecoxib on day of surgery, thereafter 2x200mg celecoxib/day until day five) or placebo. Preoperatively and 1, 5, 15 days and 1, 3, 6, 12 months postoperatively, we determined electric and pressure pain tolerance thresholds in dermatomes C6/T4/L1 and a 100mm VAS score for pain. We calculated the sum of pain tolerance thresholds and analyzed change in these versus preoperatively using mixed models analysis with factor medication. To assess hyperalgesia in persistent pain patients we performed an additional analysis on patients reporting VAS>30 at 12 months. Results 48 COX-2 inhibition and 46 placebo patients were analyzed in a modified intention to treat analysis. Contrary to our primary hypothesis, change in the sum of tolerance thresholds in the COX-2 inhibition group was not different versus placebo. COX-2 inhibition had an effect on pain on movement at postoperative day 5 (p<0.01). Consistent with our secondary hypothesis, change in sum of pressure pain tolerance thresholds in 11 patients that developed persistent pain was negative versus patients without pain (p<0.01) from day 5 to 1 year postoperatively. Conclusions Perioperative COX-2 inhibition has limited value in preventing sensitization and persistent pain after breast cancer surgery. Central sensitization may play a role in the genesis of persistent postsurgical pain. PMID:27935990
[Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].
Simmer, H H
1980-07-01
Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis, the nerve reflex starts in the ovary by an increase of the intraovarian pressure caused by the growing follicles. In 1884, Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women by bimanual compression from the outside and the vagina. His results were not convincing. Six years later, Strassmann injected fluids into ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prediction derived from Pflüger's hypothesis. Thus, after a long interval, that hypothesis had become a paradigm. Though reasons can be given for the delay, it is little understood why experimental testing started so late.
When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment.
Szucs, Denes; Ioannidis, John P A
2017-01-01
Null hypothesis significance testing (NHST) has several shortcomings that are likely contributing factors behind the widely debated replication crisis of (cognitive) neuroscience, psychology, and biomedical science in general. We review these shortcomings and suggest that, after sustained negative experience, NHST should no longer be the default, dominant statistical practice of all biomedical and psychological research. If theoretical predictions are weak we should not rely on all or nothing hypothesis tests. Different inferential methods may be most suitable for different types of research questions. Whenever researchers use NHST they should justify its use, and publish pre-study power calculations and effect sizes, including negative findings. Hypothesis-testing studies should be pre-registered and optimally raw data published. The current statistics lite educational approach for students that has sustained the widespread, spurious use of NHST should be phased out.
Computing a Comprehensible Model for Spam Filtering
NASA Astrophysics Data System (ADS)
Ruiz-Sepúlveda, Amparo; Triviño-Rodriguez, José L.; Morales-Bueno, Rafael
In this paper, we describe the application of the Decision Tree Boosting (DTB) learning model to spam email filtering. This classification task involves learning in a high-dimensional feature space, so it is an example of how the DTB algorithm performs in such feature space problems. In [1], it has been shown that hypotheses computed by the DTB model are more comprehensible than those computed by other ensemble methods. Hence, this paper tries to show that the DTB algorithm maintains the same comprehensibility of hypotheses in high-dimensional feature space problems while achieving the performance of other ensemble methods. Four traditional evaluation measures (precision, recall, F1 and accuracy) have been considered for performance comparison between DTB and other models usually applied to spam email filtering. The hypothesis computed by DTB is smaller and more comprehensible than the hypotheses computed by AdaBoost and Naïve Bayes.
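The four evaluation measures mentioned above can be computed as follows for any spam classifier; the tiny corpus and the Naïve Bayes baseline below are stand-ins, and the DTB model itself is not reproduced.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

train_msgs = ["win a free prize now", "cheap meds online", "meeting at noon",
              "project deadline tomorrow", "free lottery winner", "lunch on friday?"]
train_labels = [1, 1, 0, 0, 1, 0]                    # 1 = spam, 0 = ham
test_msgs = ["free prize waiting", "see you at the meeting"]
test_labels = [1, 0]

vec = CountVectorizer()
clf = MultinomialNB().fit(vec.fit_transform(train_msgs), train_labels)
pred = clf.predict(vec.transform(test_msgs))

precision, recall, f1, _ = precision_recall_fscore_support(test_labels, pred, average="binary")
print(f"precision={precision:.2f} recall={recall:.2f} f1={f1:.2f} "
      f"accuracy={accuracy_score(test_labels, pred):.2f}")
```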
Future Climate Impacts on Harmful Algal Blooms in an Agriculturally Dominated Ecosystem
NASA Astrophysics Data System (ADS)
Aloysius, N. R.; Martin, J.; Ludsin, S.; Stumpf, R. P.
2015-12-01
Cyanobacteria blooms have become a major problem worldwide in aquatic ecosystems that receive excessive runoff of limiting nutrients from terrestrial drainage. Such blooms are often considered harmful because they degrade ecosystem services, threaten public health, and burden local economies. Owing to changing agricultural land-use practices, Lake Erie, the most biologically productive of the North American Great Lakes, has begun to undergo re-eutrophication in which the frequency and extent of harmful algal blooms (HABs) have increased. Continued climate change has been hypothesized to magnify the HAB problem in Lake Erie in the absence of new agricultural management practices, although this hypothesis has yet to be formally tested. Herein, we tested this hypothesis by predicting how the frequency and extent of potentially harmful cyanobacteria blooms will change in Lake Erie during the 21st century under the Intergovernmental Panel on Climate Change Fifth Assessment climate projections for the region. To do so, we used 80 ensembles of climate projections from 20 Global Climate Models (GCMs) and two greenhouse gas emission scenarios (moderate reduction, RCP4.5; business-as-usual, RCP8.5) to drive a spatiotemporally explicit watershed-hydrology model that was linked to several statistical predictive models of annual cyanobacteria blooms in Lake Erie. Owing to anticipated increases in precipitation during spring and warmer temperatures during summer, our ensemble of predictions revealed that, if current land-management practices continue, the frequency of severe HABs in Lake Erie will increase during the 21st century. These findings identify a real need to consider future climate projections when developing nutrient reduction strategies in the short term, with adaptation also needing to be encouraged under both greenhouse gas emission scenarios in the absence of effective nutrient mitigation strategies.
Strenziok, Maren; Parasuraman, Raja; Clarke, Ellen; Cisler, Dean S; Thompson, James C; Greenwood, Pamela M
2014-01-15
The ultimate goal of cognitive enhancement as an intervention for age-related cognitive decline is transfer to everyday cognitive functioning. Development of training methods that transfer broadly to untrained cognitive tasks (far transfer) requires understanding of the neural bases of training and far transfer effects. We used cognitive training to test the hypothesis that far transfer is associated with altered attentional control demands mediated by the dorsal attention network and trained sensory cortex. In an exploratory study, we randomly assigned 42 healthy older adults to six weeks of training on Brain Fitness (BF-auditory perception), Space Fortress (SF-visuomotor/working memory), or Rise of Nations (RON-strategic reasoning). Before and after training, cognitive performance, diffusion-derived white matter integrity, and functional connectivity of the superior parietal cortex (SPC) were assessed. We found the strongest effects from BF training, which transferred to everyday problem solving and reasoning and selectively changed integrity of occipito-temporal white matter associated with improvement on untrained everyday problem solving. These results show that cognitive gain from auditory perception training depends on heightened white matter integrity in the ventral attention network. In BF and SF (which also transferred positively), a decrease in functional connectivity between SPC and inferior temporal lobe (ITL) was observed compared to RON-which did not transfer to untrained cognitive function. These findings highlight the importance for cognitive training of top-down control of sensory processing by the dorsal attention network. Altered brain connectivity - observed in the two training tasks that showed far transfer effects - may be a marker for training success. © 2013 Elsevier Inc. All rights reserved.
Testing fundamental ecological concepts with a Pythium-Prunus pathosystem
USDA-ARS's Scientific Manuscript database
The study of plant-pathogen interactions has enabled tests of basic ecological concepts on plant community assembly (Janzen-Connell Hypothesis) and plant invasion (Enemy Release Hypothesis). We used a field experiment to (#1) test whether Pythium effects depended on host (seedling) density and/or d...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrer, Brandon Robinson
2011-09-01
Events of interest to data analysts are sometimes difficult to characterize in detail. Rather, they consist of anomalies: events that are unpredicted, unusual, or otherwise incongruent. The purpose of this LDRD was to test the hypothesis that a biologically-inspired anomaly detection algorithm could be used to detect contextual, multi-modal anomalies. There currently is no other solution to this problem, but the existence of a solution would have a great national security impact. The technical focus of this research was the application of a brain-emulating cognition and control architecture (BECCA) to the problem of anomaly detection. One aspect of BECCA in particular was discovered to be critical to improved anomaly detection capabilities: its feature creator. During the course of this project the feature creator was developed and tested against multiple data types. Development direction was drawn from psychological and neurophysiological measurements. Major technical achievements include the creation of hierarchical feature sets created from both audio and imagery data.
Alcohol consumption, beverage prices and measurement error.
Young, Douglas J; Bielinska-Kwapisz, Agnieszka
2003-03-01
Alcohol price data collected by the American Chamber of Commerce Researchers Association (ACCRA) have been widely used in studies of alcohol consumption and related behaviors. A number of problems with these data suggest that they contain substantial measurement error, which biases conventional statistical estimators toward a finding of little or no effect of prices on behavior. We test for measurement error, assess the magnitude of the bias and provide an alternative estimator that is likely to be superior. The study utilizes data on per capita alcohol consumption across U.S. states for the years 1982-1997. State and federal alcohol taxes are used as instrumental variables for prices. Formal tests strongly confirm the hypothesis of measurement error. Instrumental variable estimates of the price elasticity of demand range from -0.53 to -1.24. These estimates are substantially larger in absolute value than ordinary least squares estimates, which sometimes are not significantly different from zero or even positive. The ACCRA price data are substantially contaminated with measurement error, but using state and federal taxes as instrumental variables mitigates the problem.
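The core idea, using taxes as instruments to undo the attenuation bias caused by measurement error in prices, can be illustrated with a hand-rolled two-stage least squares sketch on simulated data; the variable names and magnitudes are assumptions for illustration, not the study's data.

```python
# Hand-rolled two-stage least squares (2SLS) sketch of the instrumental-variable
# idea in the abstract: taxes instrument for an error-ridden price series.
# Data are simulated; names and coefficients are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
tax = rng.normal(size=n)                                      # instrument: alcohol taxes
true_price = 1.0 + 0.8 * tax + rng.normal(scale=0.5, size=n)
consumption = 2.0 - 0.7 * true_price + rng.normal(scale=0.5, size=n)
observed_price = true_price + rng.normal(scale=0.8, size=n)   # measurement error

# OLS on the mismeasured price is biased toward zero (attenuation).
ols = sm.OLS(consumption, sm.add_constant(observed_price)).fit()

# Stage 1: project the observed price on the instrument.
stage1 = sm.OLS(observed_price, sm.add_constant(tax)).fit()
# Stage 2: regress consumption on the fitted price (note: the standard errors
# from this naive second stage are not the correct 2SLS standard errors).
iv = sm.OLS(consumption, sm.add_constant(stage1.fittedvalues)).fit()

print("OLS price coefficient :", round(ols.params[1], 3))
print("2SLS price coefficient:", round(iv.params[1], 3))
```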
Rodríguez-Robles, Desirée; García-González, Julia; Juan-Valdés, Andrés; Morán-Del Pozo, Julia Mª; Guerra-Romero, Manuel I
2014-08-13
Construction and demolition waste (CDW) constitutes an increasingly significant problem in society due to the volume generated, rendering sustainable management and disposal problematic. The aim of this study is to identify a possible reuse option in concrete manufacturing for recycled aggregates with a significant ceramic content: mixed recycled aggregates (MixRA) and ceramic recycled aggregates (CerRA). To do so, several tests are conducted in accordance with the Spanish Code on Structural Concrete (EHE-08) to determine the composition by weight and the physico-mechanical characteristics (particle size distribution, fines content, sand equivalent, density, water absorption, flakiness index, and resistance to fragmentation) of the samples for the partial inclusion of the recycled aggregates in concrete mixes. The results of these tests clearly support the hypothesis that this type of material may be suitable for such partial replacement if a simple pretreatment is carried out. Furthermore, this measure of reuse is in line with European, national, and regional policies on sustainable development, and presents a solution to the environmental problem caused by the generation of CDW.
A checklist to facilitate objective hypothesis testing in social psychology research.
Washburn, Anthony N; Morgan, G Scott; Skitka, Linda J
2015-01-01
Social psychology is not a very politically diverse area of inquiry, something that could negatively affect the objectivity of social psychological theory and research, as Duarte et al. argue in the target article. This commentary offers a number of checks to help researchers uncover possible biases and identify when they are engaging in hypothesis confirmation and advocacy instead of hypothesis testing.
Nan Liu; Hai Ren; Sufen Yuan; Qinfeng Guo; Long Yang
2013-01-01
The relative importance of facilitation and competition between pairwise plants across abiotic stress gradients as predicted by the stress-gradient hypothesis has been confirmed in arid and temperate ecosystems, but the hypothesis has rarely been tested in tropical systems, particularly across nutrient gradients. The current research examines the interactions between a...
Phase II Clinical Trials: D-methionine to Reduce Noise-Induced Hearing Loss
2012-03-01
loss (NIHL) and tinnitus in our troops. Hypotheses: Primary Hypothesis: Administration of oral D-methionine prior to and during weapons...reduce or prevent noise-induced tinnitus. Primary outcome to test the primary hypothesis: Pure tone air-conduction thresholds. Primary outcome to...test the secondary hypothesis: Tinnitus questionnaires. Specific Aims: 1. To determine whether administering oral D-methionine (D-met) can
Constraints on decay plus oscillation solutions of the solar neutrino problem
NASA Astrophysics Data System (ADS)
Joshipura, Anjan S.; Massó, Eduard; Mohanty, Subhendra
2002-12-01
We examine the constraints on the nonradiative decay of neutrinos from the observations of solar neutrino experiments. The standard oscillation hypothesis among three neutrinos solves the solar and atmospheric neutrino problems. The decay of a massive neutrino mixed with the electron neutrino results in the depletion of the solar neutrino flux. We introduce neutrino decay in the oscillation hypothesis and demand that decay does not spoil the successful explanation of solar and atmospheric observations. We obtain a lower bound on the ratio of the lifetime over the mass of ν₂, τ₂/m₂ > 22.7 s/MeV for the Mikheyev-Smirnov-Wolfenstein solution of the solar neutrino problem and τ₂/m₂ > 27.8 s/MeV for the vacuum oscillation solution (at 99% C.L.).
The use of analysis of variance procedures in biological studies
Williams, B.K.
1987-01-01
The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
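How the choice of parametrization surfaces in standard software can be sketched as follows, assuming a small, deliberately unbalanced two-factor layout and statsmodels; the data and coding choices are illustrative, not taken from the paper.

```python
# Sketch of how parametrization and testing procedure interact for an
# unbalanced two-factor fixed-effects design (illustrative data only).
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Deliberately unbalanced layout: unequal cell counts across A x B.
df = pd.DataFrame({
    "A": ["a1"] * 5 + ["a2"] * 3,
    "B": ["b1", "b1", "b2", "b2", "b2", "b1", "b2", "b2"],
    "y": [4.1, 3.9, 5.2, 5.0, 5.4, 6.1, 7.3, 7.0],
})

# Sum-to-zero (effects) coding, so Type III tests correspond to the usual
# main-effect and interaction hypotheses under this parametrization.
model = smf.ols("y ~ C(A, Sum) * C(B, Sum)", data=df).fit()
print(anova_lm(model, typ=3))   # Type III sums of squares
print(anova_lm(model, typ=1))   # sequential (Type I) tests differ when unbalanced
```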
Cramer, D; Kupshik, G
1993-09-01
Ellis's rational-emotive theory postulates that, since irrational statements augment emotional distress, replacing irrational with rational statements should lessen distress. This hypothesis was tested in the initial stages of psychotherapy by having 13 and 14 clinical out-patients, respectively, repeat for one minute either rational or irrational statements about their major presenting psychological problem. The distinction drawn by Ellis & Harper (1975), that 'inappropriate' emotions differ qualitatively from 'appropriate' emotions, was also examined. Although the experimental intervention had no effect on a post-test measure of irrational beliefs, patients repeating rational statements had significantly lower appropriate and inappropriate negative emotions at post-test, suggesting that inappropriate emotions do not differ qualitatively from appropriate emotions and that making rational statements may lower emotional distress in patients. Patients reiterating irrational statements showed no change in emotions, implying that these kinds of irrational cognitions may have already been present.
Device-independent tests of quantum channels
NASA Astrophysics Data System (ADS)
Dall'Arno, Michele; Brandsen, Sarah; Buscemi, Francesco
2017-03-01
We develop a device-independent framework for testing quantum channels. That is, we falsify a hypothesis about a quantum channel based only on an observed set of input-output correlations. Formally, the problem consists of characterizing the set of input-output correlations compatible with any arbitrary given quantum channel. For binary (i.e. two input symbols, two output symbols) correlations, we show that extremal correlations are always achieved by orthogonal encodings and measurements, irrespective of whether or not the channel preserves commutativity. We further provide a full, closed-form characterization of the sets of binary correlations in the case of: (i) any dihedrally covariant qubit channel (such as any Pauli and amplitude-damping channels) and (ii) any universally-covariant commutativity-preserving channel in an arbitrary dimension (such as any erasure, depolarizing, universal cloning and universal transposition channels).
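The kind of input-output correlation the framework characterizes can be made concrete with a small numerical sketch, assuming a depolarizing qubit channel with orthogonal encodings and an orthogonal measurement; the channel and its parameter are illustrative assumptions, not taken from the paper.

```python
# Minimal numerical sketch (illustrative, not from the paper): input-output
# correlations p(y|x) = Tr[M_y Lambda(rho_x)] for a depolarizing qubit channel
# with orthogonal encodings and an orthogonal (Z-basis) measurement.
import numpy as np

I2 = np.eye(2, dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Qubit depolarizing channel: Lambda_p(rho) = (1 - p) rho + p Tr(rho) I/2."""
    return (1 - p) * rho + p * np.trace(rho) * I2 / 2

rho = [(I2 + Z) / 2, (I2 - Z) / 2]   # orthogonal encodings |0><0|, |1><1|
M = [(I2 + Z) / 2, (I2 - Z) / 2]     # orthogonal two-outcome measurement

p = 0.3  # assumed depolarizing strength
for x in range(2):
    out = depolarize(rho[x], p)
    probs = [float(np.real(np.trace(M[y] @ out))) for y in range(2)]
    print(f"x={x}: p(y=0|x)={probs[0]:.3f}, p(y=1|x)={probs[1]:.3f}")
```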
Symbiosis and the origin of eukaryotic motility
NASA Technical Reports Server (NTRS)
Margulis, L.; Hinkle, G.
1991-01-01
Ongoing work to test the hypothesis of the origin of eukaryotic cell organelles by microbial symbioses is discussed. Because of the widespread acceptance of the serial endosymbiotic theory (SET) of the origin of plastids and mitochondria, the idea of the symbiotic origin of the centrioles and axonemes for spirochete bacteria motility symbiosis was tested. Intracellular microtubular systems are purported to derive from symbiotic associations between ancestral eukaryotic cells and motile bacteria. Four lines of approach to this problem are being pursued: (1) cloning the gene of a tubulin-like protein discovered in Spirocheata bajacaliforniesis; (2) seeking axoneme proteins in spirochets by antibody cross-reaction; (3) attempting to cultivate larger, free-living spirochetes; and (4) studying in detail spirochetes (e.g., Cristispira) symbiotic with marine animals. Other aspects of the investigation are presented.
An omnibus test for the global null hypothesis.
Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja
2018-01-01
Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses but in testing whether none of the hypotheses is false. There are several ways to test the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g., the Bonferroni or Simes test). Usually, however, there is no a priori knowledge of the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R package called omnibus.
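The two families of global tests contrasted in this abstract can be sketched as follows; the p-values are illustrative, and the authors' omnibus statistic itself (available in their R package omnibus) is not reproduced here.

```python
# Sketch contrasting a combination test (Fisher) with global tests based on
# individual statistics (Bonferroni, Simes). Illustrative p-values only; the
# proposed omnibus test is not reproduced here.
import numpy as np
from scipy import stats

pvals = np.array([0.30, 0.04, 0.62, 0.01, 0.18])  # assumed independent p-values
alpha = 0.05
m = len(pvals)

# Fisher combination test: powerful when many individual nulls are false.
_, fisher_p = stats.combine_pvalues(pvals, method="fisher")

# Bonferroni: reject the global null if any p-value is below alpha / m.
bonferroni_reject = pvals.min() < alpha / m

# Simes: reject if p_(i) <= i * alpha / m for some i (p-values sorted ascending).
sorted_p = np.sort(pvals)
simes_reject = bool(np.any(sorted_p <= np.arange(1, m + 1) * alpha / m))

print(f"Fisher combined p-value: {fisher_p:.4f}")
print(f"Bonferroni rejects global null: {bonferroni_reject}")
print(f"Simes rejects global null: {simes_reject}")
```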