Sequential Tests of Multiple Hypotheses Controlling Type I and II Familywise Error Rates
Bartroff, Jay; Song, Jinlin
2014-01-01
This paper addresses the following general scenario: A scientist wishes to perform a battery of experiments, each generating a sequential stream of data, to investigate some phenomenon. The scientist would like to control the overall error rate in order to draw statistically valid conclusions from each experiment, while being as efficient as possible. The between-stream data may differ in distribution and dimension but also may be highly correlated, even duplicated exactly in some cases. Treating each experiment as a hypothesis test and adopting the familywise error rate (FWER) metric, we give a procedure that sequentially tests each hypothesis while controlling both the type I and II FWERs regardless of the between-stream correlation, and only requires arbitrary sequential test statistics that control the error rates for a given stream in isolation. The proposed procedure, which we call the sequential Holm procedure because of its inspiration from Holm’s (1979) seminal fixed-sample procedure, shows simultaneous savings in expected sample size and less conservative error control relative to fixed-sample, sequential Bonferroni, and other recently proposed sequential procedures in a simulation study. PMID:25092948
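Holm's fixed-sample step-down procedure, which the paper above generalizes to sequential streams, is simple enough to sketch. The following is a minimal illustration of that fixed-sample version only, not the authors' sequential procedure:

```python
def holm(pvalues, alpha=0.05):
    """Holm (1979) step-down test: reject the hypothesis with the k-th
    smallest p-value while p_(k) <= alpha / (m - k + 1), then stop."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    reject = [False] * m
    for rank, i in enumerate(order):          # rank = 0, 1, ..., m - 1
        if pvalues[i] <= alpha / (m - rank):  # shrinking Bonferroni threshold
            reject[i] = True
        else:
            break                             # step-down: stop at first failure
    return reject
```

For example, with p-values (0.01, 0.04, 0.03) at alpha = 0.05 only the first hypothesis is rejected, since 0.03 exceeds alpha/2 at the second step.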
Distributed Immune Systems for Wireless Network Information Assurance
2010-04-26
We developed algorithms based on the sequential probability ratio test (SPRT), where the goal is to optimize a hypothesis testing problem given a trade-off between the probability of errors and the ... using cumulative sum (CUSUM) and Girshick-Rubin-Shiryaev (GRSh) statistics. In sequential versions of the problem the sequential probability ratio ... the more complicated problems, in particular those where no clear mean can be established.
Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Markley, F Landis
2014-01-01
This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
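The classical Wald SPRT underlying this approach can be sketched for the simplest case of a Bernoulli observation stream. This is an illustration of the generic test with its standard thresholds, not the paper's collision-probability formulation:

```python
import math

def sprt_bernoulli(xs, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT of H0: p = p0 vs H1: p = p1 (p1 > p0) for a stream of
    0/1 observations, with targeted false alarm rate alpha and missed
    detection rate beta."""
    upper = math.log((1 - beta) / alpha)  # crossing -> decide H1
    lower = math.log(beta / (1 - alpha))  # crossing -> decide H0
    llr = 0.0
    for n, x in enumerate(xs, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return ("reject H0", n)
        if llr <= lower:
            return ("accept H0", n)
    return ("continue", len(xs))  # no decision yet; keep sampling
```

The appeal for decision timelines is visible in the early stopping: strong evidence ends sampling after only a few observations.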
ERIC Educational Resources Information Center
Good, Roland H, III; And Others
1993-01-01
Tested hypothesis that achievement would be maximized by matching student's Kaufman Assessment Battery for Children-identified processing strength with sequential or simultaneous instruction. Findings from analyses of data from three students with strengths in sequential processing and three students with strengths in simultaneous processing…
Phase II design with sequential testing of hypotheses within each stage.
Poulopoulou, Stavroula; Karlis, Dimitris; Yiannoutsos, Constantin T; Dafni, Urania
2014-01-01
The main goal of a Phase II clinical trial is to decide whether a particular therapeutic regimen is effective enough to warrant further study. The hypothesis tested by Fleming's Phase II design (Fleming, 1982) is [Formula: see text] versus [Formula: see text], with level [Formula: see text] and with a power [Formula: see text] at [Formula: see text], where [Formula: see text] is chosen to represent the response probability achievable with standard treatment and [Formula: see text] is chosen such that the difference [Formula: see text] represents a targeted improvement with the new treatment. This hypothesis creates a misinterpretation mainly among clinicians that rejection of the null hypothesis is tantamount to accepting the alternative, and vice versa. As mentioned by Storer (1992), this introduces ambiguity in the evaluation of type I and II errors and the choice of the appropriate decision at the end of the study. Instead of testing this hypothesis, an alternative class of designs is proposed in which two hypotheses are tested sequentially. The hypothesis [Formula: see text] versus [Formula: see text] is tested first. If this null hypothesis is rejected, the hypothesis [Formula: see text] versus [Formula: see text] is tested next, in order to examine whether the therapy is effective enough to consider further testing in a Phase III study. For the derivation of the proposed design the exact binomial distribution is used to calculate the decision cut-points. The optimal design parameters are chosen so as to minimize the average sample number (ASN) under specific upper bounds for error levels. The optimal values for the design were found using a simulated annealing method.
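The exact binomial calculation behind such decision cut-points can be sketched as follows. The sample size and response probability below are hypothetical; the paper optimizes cut-points jointly over two sequentially tested hypotheses:

```python
from math import comb

def binom_sf(r, n, p):
    """Exact upper tail P(X >= r) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(r, n + 1))

def cut_point(n, p0, alpha=0.05):
    """Smallest r such that rejecting H0: p <= p0 when at least r of n
    patients respond keeps the exact type I error at or below alpha."""
    for r in range(n + 1):
        if binom_sf(r, n, p0) <= alpha:
            return r
    return n + 1  # H0 can never be rejected at this n
```

With n = 20 and p0 = 0.2, at least 8 responses are needed, since P(X >= 8) is about 0.032 while P(X >= 7) is about 0.087.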
NASA Technical Reports Server (NTRS)
Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.
2011-01-01
Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly elliptical orbit formation flying mission.
ERIC Educational Resources Information Center
Cavanagh, Martine Odile; Langevin, Rene
2010-01-01
The object of this exploratory study was to test two hypotheses. The first was that a student's preferential cognitive style, sequential or simultaneous, can negatively affect the imaginative fiction texts that he or she produces. The second hypothesis was that students possessing a sequential or simultaneous preferential cognitive style would…
Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.
Vetter, Thomas R; Mascha, Edward J
2018-01-01
Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including their so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal; rejecting it supports the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing independent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) intragroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction, and post hoc comparisons, as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate.
The Wilcoxon-Mann-Whitney test is instead preferred. When making paired comparisons on data that are ordinal, or continuous but nonnormally distributed, the Wilcoxon signed-rank test can be used. In analyzing their data, researchers should consider the continued merits of these simple yet equally valid unadjusted bivariate statistical tests. However, the appropriate use of an unadjusted bivariate test still requires a solid understanding of its utility, assumptions (requirements), and limitations. This understanding will mitigate the risk of misleading findings, interpretations, and conclusions.
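The Wilcoxon-Mann-Whitney statistic recommended above can be computed directly from its pairwise definition; a minimal sketch follows (in practice a vetted library routine would be used, and the normal approximation here ignores tie corrections):

```python
import math

def mann_whitney_u(x, y):
    """U statistic: the number of pairs (x_i, y_j) with x_i > y_j,
    counting each tied pair as 1/2."""
    return sum(1.0 if a > b else 0.5 if a == b else 0.0
               for a in x for b in y)

def u_to_z(u, n1, n2):
    """Large-sample normal approximation for U (no tie correction)."""
    mean = n1 * n2 / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (u - mean) / sd
```

Complete separation of the two groups gives U = 0 (or U = n1 * n2), the most extreme values the statistic can take.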
Sequential versus Organized Rehearsal
ERIC Educational Resources Information Center
Weist, Richard M.; Crawford, Charlotte
1973-01-01
The purpose of this research was to test the hypothesis that organization in rehearsal is a necessary condition for organization in recall; that is, if recall is organized, then rehearsal must have been organized. (Author)
Experiences with digital processing of images at INPE
NASA Technical Reports Server (NTRS)
Mascarenhas, N. D. A. (Principal Investigator)
1984-01-01
Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.
Wu, Mixia; Shu, Yu; Li, Zhaohai; Liu, Aiyi
2016-01-01
A sequential design is proposed to test whether the accuracy of a binary diagnostic biomarker meets the minimal level of acceptance. The accuracy of a binary diagnostic biomarker is a linear combination of the marker’s sensitivity and specificity. The objective of the sequential method is to minimize the maximum expected sample size under the null hypothesis that the marker’s accuracy is below the minimal level of acceptance. The exact results of two-stage designs based on Youden’s index and efficiency indicate that the maximum expected sample sizes are smaller than the sample sizes of the fixed designs. Exact methods are also developed for estimation, confidence interval and p-value concerning the proposed accuracy index upon termination of the sequential testing. PMID:26947768
Monitoring Items in Real Time to Enhance CAT Security
ERIC Educational Resources Information Center
Zhang, Jinming; Li, Jie
2016-01-01
An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…
Sequential parallel comparison design with binary and time-to-event outcomes.
Silverman, Rachel Kloss; Ivanova, Anastasia; Fine, Jason
2018-04-30
Sequential parallel comparison design (SPCD) has been proposed to increase the likelihood of success of clinical trials, especially trials with a possibly high placebo effect. Sequential parallel comparison design is conducted in 2 stages. Participants are randomized between active therapy and placebo in stage 1. Then, stage 1 placebo nonresponders are rerandomized between active therapy and placebo. Data from the 2 stages are pooled to yield a single P value. We consider SPCD with binary and with time-to-event outcomes. For time-to-event outcomes, response is defined as a favorable event prior to the end of follow-up for a given stage of SPCD. We show that for these cases, the usual test statistics from stages 1 and 2 are asymptotically normal and uncorrelated under the null hypothesis, leading to a straightforward combined testing procedure. In addition, we show that the estimators of the treatment effects from the 2 stages are asymptotically normal and uncorrelated under the null and alternative hypotheses, yielding confidence interval procedures with correct coverage. Simulations and real data analysis demonstrate the utility of the binary and time-to-event SPCD. Copyright © 2018 John Wiley & Sons, Ltd.
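Because the two stage statistics are asymptotically independent standard normals under the null, they can be combined with prespecified weights into a single standard-normal statistic. A sketch with hypothetical equal weights (not necessarily the authors' choice):

```python
import math

def spcd_combined_z(z1, z2, w1=0.5, w2=0.5):
    """Combine asymptotically independent stage statistics into one
    standard-normal test statistic using prespecified weights."""
    return (w1 * z1 + w2 * z2) / math.sqrt(w1**2 + w2**2)
```

The denominator rescales so the null variance stays exactly 1 whatever weights are chosen, which is what makes a single pooled P value legitimate.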
On resilience studies of system detection and recovery techniques against stealthy insider attacks
NASA Astrophysics Data System (ADS)
Wei, Sixiao; Zhang, Hanlin; Chen, Genshe; Shen, Dan; Yu, Wei; Pham, Khanh D.; Blasch, Erik P.; Cruz, Jose B.
2016-05-01
With the explosive growth of network technologies, insider attacks have become a major concern to business operations that largely rely on computer networks. To better detect insider attacks that marginally manipulate network traffic over time, and to recover the system from attacks, in this paper we implement a temporal-based detection scheme using the sequential hypothesis testing technique. Two hypothetical states are considered: the null hypothesis that the collected information is from benign historical traffic and the alternative hypothesis that the network is under attack. The objective of such a detection scheme is to recognize the change within the shortest time by comparing the two defined hypotheses. In addition, once the attack is detected, a server migration-based system recovery scheme can be triggered to recover the system to the state prior to the attack. To understand mitigation of insider attacks, a multi-functional web display of the detection analysis was developed for real-time analytics. Experiments using real-world traffic traces evaluate the effectiveness of the Detection System and Recovery (DeSyAR) scheme. The evaluation data validate that the detection scheme based on sequential hypothesis testing and the server migration-based system recovery scheme perform well in effectively detecting insider attacks and recovering the system under attack.
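The quickest-change-detection step can be illustrated with a one-sided CUSUM statistic, a standard sequential change detector; the means and threshold below are hypothetical, not taken from the paper:

```python
def cusum_alarm(xs, mu0, mu1, threshold):
    """One-sided CUSUM for a mean shift mu0 -> mu1 (unit-variance Gaussian
    model): returns the first index at which the statistic crosses the
    threshold, or None if it never does."""
    s = 0.0
    k = (mu0 + mu1) / 2  # reference value halfway between the two means
    for n, x in enumerate(xs, 1):
        s = max(0.0, s + x - k)  # accumulate evidence, resetting at zero
        if s > threshold:
            return n
    return None
```

The reset at zero is what lets the detector ignore benign history and react quickly once traffic starts drifting toward the attack regime.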
Organization principles in visual working memory: Evidence from sequential stimulus display.
Gao, Zaifeng; Gao, Qiyang; Tang, Ning; Shui, Rende; Shen, Mowei
2016-01-01
Although the mechanisms of visual working memory (VWM) have been studied extensively in recent years, the active property of VWM has received less attention. In the current study, we examined how VWM integrates sequentially presented stimuli by focusing on the role of Gestalt principles, which are important organizing principles in perceptual integration. We manipulated the level of Gestalt cues among three or four sequentially presented objects that were memorized. The Gestalt principle could not emerge unless all the objects appeared together. We distinguished two hypotheses: a perception-alike hypothesis and an encoding-specificity hypothesis. The former predicts that the Gestalt cue will play a role in information integration within VWM; the latter predicts that the Gestalt cue will not operate within VWM. In four experiments, we demonstrated that collinearity (Experiment 1) and closure (Experiment 2) cues significantly improved VWM performance, and this facilitation was not affected by the testing manner (Experiment 3) or by adding extra colors to the memorized objects (Experiment 4). Finally, we re-established the Gestalt cue benefit with similarity cues (Experiment 5). These findings together suggest that VWM realizes and uses potential Gestalt principles within the stored representations, supporting a perception-alike hypothesis. Copyright © 2015 Elsevier B.V. All rights reserved.
The PMHT: solutions for some of its problems
NASA Astrophysics Data System (ADS)
Wieneke, Monika; Koch, Wolfgang
2007-09-01
Tracking multiple targets in a cluttered environment is a challenging task. Probabilistic Multiple Hypothesis Tracking (PMHT) is an efficient approach for dealing with it. Essentially, PMHT is based on the method of Expectation-Maximization for handling association conflicts. Linearity in the number of targets and measurements is the main motivation for a further development and extension of this methodology. Unfortunately, compared with the Probabilistic Data Association Filter (PDAF), PMHT has not yet shown its superiority in terms of track-lost statistics. Furthermore, the problem of track extraction and deletion is apparently not yet satisfactorily solved within this framework. Four properties of PMHT are responsible for its problems in track maintenance: Non-Adaptivity, Hospitality, Narcissism and Local Maxima [1, 2]. In this work we present a solution for each of them and derive an improved PMHT by integrating the solutions into the PMHT formalism. The new PMHT is evaluated by Monte-Carlo simulations. A sequential Likelihood-Ratio (LR) test for track extraction has been developed and already integrated into the framework of traditional Bayesian Multiple Hypothesis Tracking [3]. As a multi-scan approach, the PMHT methodology also has the potential for track extraction. In this paper an analogous integration of a sequential LR test into the PMHT framework is proposed. We present an LR formula for track extraction and deletion using the PMHT update formulae. As PMHT provides all required ingredients for a sequential LR calculation, the LR is thus a by-product of the PMHT iteration process. Therefore the resulting update formula for the sequential LR test affords the development of Track-Before-Detect algorithms for PMHT. The approach is illustrated by a simple example.
Shoji, Kotaro; Bock, Judith; Cieslak, Roman; Zukowska, Katarzyna; Luszczynska, Aleksandra; Benight, Charles C
2014-09-01
This 2-study longitudinal investigation examined the indirect effects of secondary traumatic stress (STS) on secondary traumatic growth via two mediators: perceived social support and secondary trauma self-efficacy. In particular, we tested if the 2 hypothetical mediators operate sequentially, that is, with secondary trauma self-efficacy facilitating social support (i.e., cultivation hypothesis) and/or social support enhancing self-efficacy (i.e., enabling hypothesis). Participants in Study 1 (N = 293 at Time 1, N = 115 at Time 2) were behavioral healthcare providers working with U.S. military personnel suffering from trauma. Study 2 was conducted among Polish healthcare workers (N = 298 at Time 1, N = 189 at Time 2) providing services for civilian survivors of traumatic events. In both studies, multiple mediational analyses showed evidence for the cultivation hypothesis. The relationship between STS at Time 1 and secondary traumatic growth at Time 2 was mediated sequentially by secondary trauma self-efficacy at Time 1 and social support at Time 2. The enabling hypothesis was not supported. Education and development programs for healthcare workers may benefit from boosting self-efficacy with the intent to facilitate perceived social support. © 2014 Wiley Periodicals, Inc.
DOT National Transportation Integrated Search
1995-10-01
This investigation was completed as part of the ITS-IDEA program, which is one of three IDEA programs managed by the Transportation Research Board (TRB) to foster innovations in surface transportation. It focuses on products and results for the develop...
Sequential Analysis of Autonomic Arousal and Self-Injurious Behavior
ERIC Educational Resources Information Center
Hoch, John; Symons, Frank; Sng, Sylvia
2013-01-01
There have been limited direct tests of the hypothesis that self-injurious behavior (SIB) regulates arousal. In this study, two autonomic biomarkers for physiological arousal (heart rate [HR] and the high-frequency [HF] component of heart rate variability [HRV]) were investigated in relation to SIB for 3 participants with intellectual…
A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.
Yu, Qingzhao; Zhu, Lin; Zhu, Han
2017-11-01
Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently assign newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
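For two arms with known outcome standard deviations, the randomization rate minimizing the variance of the difference-in-means statistic is the classical Neyman allocation. A sketch of that simplest fixed-sample case (the paper's Bayesian updating of the rate is not shown):

```python
def neyman_allocation(sd_a, sd_b):
    """Fraction of patients to randomize to arm A that minimizes the
    variance of the difference-in-means statistic for a fixed total n
    (classical Neyman allocation: proportional to the arm's SD)."""
    return sd_a / (sd_a + sd_b)
```

The noisier arm receives more patients; with equal variances the rule reduces to 1:1 randomization.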
The sequential megafaunal collapse hypothesis: Testing with existing data
NASA Astrophysics Data System (ADS)
DeMaster, Douglas P.; Trites, Andrew W.; Clapham, Phillip; Mizroch, Sally; Wade, Paul; Small, Robert J.; Ver Hoef, Jay
2006-02-01
Springer et al. [Springer, A.M., Estes, J.A., van Vliet, G.B., Williams, T.M., Doak, D.F., Danner, E.M., Forney, K.A., Pfister, B., 2003. Sequential megafaunal collapse in the North Pacific Ocean: an ongoing legacy of industrial whaling? Proceedings of the National Academy of Sciences 100 (21), 12,223-12,228] hypothesized that great whales were an important prey resource for killer whales, and that the removal of fin and sperm whales by commercial whaling in the region of the Bering Sea/Aleutian Islands (BSAI) in the late 1960s and 1970s led to cascading trophic interactions that caused the sequential decline of populations of harbor seal, northern fur seal, Steller sea lion and northern sea otter. This hypothesis, referred to as the Sequential Megafaunal Collapse (SMC), has stirred considerable interest because of its implication for ecosystem-based management. The SMC has the following assumptions: (1) fin whales and sperm whales were important as prey species in the Bering Sea; (2) the biomass of all large whale species (i.e., North Pacific right, fin, humpback, gray, sperm, minke and bowhead whales) was in decline in the Bering Sea in the 1960s and early 1970s; and (3) pinniped declines in the 1970s and 1980s were sequential. We concluded that the available data are not consistent with the first two assumptions of the SMC. Statistical tests of the timing of the declines do not support the assumption that pinniped declines were sequential. We propose two alternative hypotheses for the declines that are more consistent with the available data. 
While it is plausible, from energetic arguments, for predation by killer whales to have been an important factor in the declines of one or more of the three populations of pinnipeds and the sea otter population in the BSAI region over the last 30 years, we hypothesize that the declines in pinniped populations in the BSAI can best be understood by invoking a multiple factor hypothesis that includes both bottom-up forcing (as indicated by evidence of nutritional stress in the western Steller sea lion population) and top-down forcing (e.g., predation by killer whales, mortality incidental to commercial fishing, directed harvests). Our second hypothesis is a modification of the top-down forcing mechanism (i.e., killer whale predation on one or more of the pinniped populations and the sea otter population is mediated via the recovery of the eastern North Pacific population of the gray whale). We remain skeptical about the proposed link between commercial whaling on fin and sperm whales, which ended in the mid-1960s, and the observed decline of populations of northern fur seal, harbor seal, and Steller sea lion some 15 years later.
Salter-Venzon, Dawna; Kazlova, Valentina; Izzy Ford, Samantha; Intra, Janjira; Klosner, Allison E; Gellenbeck, Kevin W
2017-05-01
Despite the notable health benefits of carotenoids, the majority of human diets worldwide are repeatedly shown to be inadequate in intake of carotenoid-rich fruits and vegetables, according to current health recommendations. To address this deficit, strategies designed to increase dietary intakes and subsequent plasma levels of carotenoids are warranted. When mixed carotenoids are delivered into the intestinal tract simultaneously, competition occurs for micelle formation and absorption, affecting carotenoid bioavailability. Previously, we tested the in vitro viability of a carotenoid mix designed to deliver individual carotenoids sequentially spaced from one another over the 6 hr transit time of the human upper gastrointestinal system. We hypothesized that temporally and spatially separating the individual carotenoids would reduce competition for micelle formation, improve uptake, and maximize efficacy. Here, we test this hypothesis in a double-blind, repeated-measure, cross-over human study with 12 subjects by comparing the change of plasma carotenoid levels for 8 hr after oral doses of a sequentially spaced carotenoid mix, to a matched mix without sequential spacing. We find the carotenoid change from baseline, measured as area under the curve, is increased following consumption of the sequentially spaced mix compared to concomitant carotenoid delivery. These results demonstrate reduced interaction and regulation between the sequentially spaced carotenoids, suggesting improved bioavailability from a novel sequentially spaced carotenoid mix.
Li, Fuhong; Cao, Bihua; Luo, Yuejia; Lei, Yi; Li, Hong
2013-02-01
Functional magnetic resonance imaging (fMRI) was used to examine differences in brain activation that occur when a person receives the different outcomes of hypothesis testing (HT). Participants were provided with a series of images of batteries and were asked to learn a rule governing what kinds of batteries were charged. Within each trial, the first two charged batteries were sequentially displayed, and participants would generate a preliminary hypothesis based on the perceptual comparison. Next, a third battery that served to strengthen, reject, or was irrelevant to the preliminary hypothesis was displayed. The fMRI results revealed that (1) no significant differences in brain activation were found between the 2 hypothesis-maintain conditions (i.e., strengthen and irrelevant conditions); and (2) compared with the hypothesis-maintain conditions, the hypothesis-reject condition activated the left medial frontal cortex, bilateral putamen, left parietal cortex, and right cerebellum. These findings are discussed in terms of the neural correlates of the subcomponents of HT and working memory manipulation. Copyright © 2012 Elsevier Inc. All rights reserved.
Surveillance system and method having an adaptive sequential probability fault detection test
NASA Technical Reports Server (NTRS)
Herzog, James P. (Inventor); Bickford, Randall L. (Inventor)
2005-01-01
System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.
Surveillance system and method having an adaptive sequential probability fault detection test
NASA Technical Reports Server (NTRS)
Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)
2006-01-01
System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.
Surveillance System and Method having an Adaptive Sequential Probability Fault Detection Test
NASA Technical Reports Server (NTRS)
Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)
2008-01-01
System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.
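The idea shared by these three patent records can be sketched with a Gaussian stand-in for the fitted probability density function: fit it to residuals observed during normal operation, then score incoming residuals with a log-likelihood ratio against a hypothetical fault distribution (the mean-shift fault model and its parameter are illustrative, not the patents' method):

```python
import statistics

def fit_residuals(residuals):
    """Fit a normal density to residuals observed during normal operation."""
    return statistics.fmean(residuals), statistics.stdev(residuals)

def fault_llr(x, mu, sigma, fault_shift=2.0):
    """Log-likelihood ratio of a hypothetical 'fault' density (mean shifted
    by fault_shift standard deviations) versus the fitted 'normal' density;
    positive values favor the fault hypothesis."""
    llr_normal = -((x - mu) ** 2) / (2 * sigma**2)
    llr_fault = -((x - mu - fault_shift * sigma) ** 2) / (2 * sigma**2)
    return llr_fault - llr_normal
```

Successive values of fault_llr can be accumulated and compared against Wald-style thresholds, which is what makes the surveillance test sequential.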
Multi-arm group sequential designs with a simultaneous stopping rule.
Urach, S; Posch, M
2016-12-30
Multi-arm group sequential clinical trials are efficient designs to compare multiple treatments to a control. They allow one to test for treatment effects in interim analyses and can have a lower average sample number than fixed sample designs. Their operating characteristics depend on the stopping rule: We consider simultaneous stopping, where the whole trial is stopped as soon as for any of the arms the null hypothesis of no treatment effect can be rejected, and separate stopping, where only recruitment to arms for which a significant treatment effect could be demonstrated is stopped, but the other arms are continued. For both stopping rules, the family-wise error rate can be controlled by the closed testing procedure applied to group sequential tests of intersection and elementary hypotheses. The group sequential boundaries for the separate stopping rule also control the family-wise error rate if the simultaneous stopping rule is applied. However, we show that for the simultaneous stopping rule, one can apply improved, less conservative stopping boundaries for local tests of elementary hypotheses. We derive corresponding improved Pocock and O'Brien type boundaries as well as optimized boundaries to maximize the power or average sample number and investigate the operating characteristics and small sample properties of the resulting designs. To control the power to reject at least one null hypothesis, the simultaneous stopping rule requires a lower average sample number than the separate stopping rule. This comes at the cost of a lower power to reject all null hypotheses. Some of this loss in power can be regained by applying the improved stopping boundaries for the simultaneous stopping rule. The procedures are illustrated with clinical trials in systemic sclerosis and narcolepsy. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
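The closed testing principle used above can be sketched for two elementary hypotheses with a Bonferroni test of their intersection; the paper replaces these fixed-sample tests with group sequential ones:

```python
def closed_test(p1, p2, alpha=0.05):
    """Closed testing for H1, H2 with a Bonferroni intersection test:
    H_i is rejected iff both the intersection H1 & H2 and the elementary
    test of H_i reject at level alpha; this controls the family-wise
    error rate in the strong sense."""
    p12 = min(1.0, 2 * min(p1, p2))  # Bonferroni p-value for the intersection
    reject_intersection = p12 <= alpha
    return (reject_intersection and p1 <= alpha,
            reject_intersection and p2 <= alpha)
```

Note that two p-values which each pass their elementary test can still fail jointly when the intersection test does not reject.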
Is it better to select or to receive? Learning via active and passive hypothesis testing.
Markant, Douglas B; Gureckis, Todd M
2014-02-01
People can test hypotheses through either selection or reception. In a selection task, the learner actively chooses observations to test his or her beliefs, whereas in reception tasks data are passively encountered. People routinely use both forms of testing in everyday life, but the critical psychological differences between selection and reception learning remain poorly understood. One hypothesis is that selection learning improves learning performance by enhancing generic cognitive processes related to motivation, attention, and engagement. Alternatively, we suggest that differences between these 2 learning modes derive from a hypothesis-dependent sampling bias that is introduced when a person collects data to test his or her own individual hypothesis. Drawing on influential models of sequential hypothesis-testing behavior, we show that such a bias (a) can lead to the collection of data that facilitates learning compared with reception learning and (b) can be more effective than observing the selections of another person. We then report a novel experiment based on a popular category learning paradigm that compares reception and selection learning. We additionally compare selection learners to a set of "yoked" participants who viewed the exact same sequence of observations under reception conditions. The results revealed systematic differences in performance that depended on the learner's role in collecting information and the abstract structure of the problem.
Non-Adjacent Dependency Learning in Infants at Familial Risk of Dyslexia
ERIC Educational Resources Information Center
Kerkhoff, Annemarie; de Bree, Elise; de Klerk, Maartje; Wijnen, Frank
2013-01-01
This study tests the hypothesis that developmental dyslexia is (partly) caused by a deficit in implicit sequential learning, by investigating whether infants at familial risk of dyslexia can track non-adjacent dependencies in an artificial language. An implicit learning deficit would hinder detection of such dependencies, which mark grammatical…
Diederich, Adele
2008-02-01
Recently, Diederich and Busemeyer (2006) evaluated three hypotheses formulated as particular versions of a sequential-sampling model to account for the effects of payoffs in a perceptual decision task with time constraints. The bound-change hypothesis states that payoffs affect the distance of the starting position of the decision process to each decision bound. The drift-rate-change hypothesis states that payoffs affect the drift rate of the decision process. The two-stage-processing hypothesis assumes two processes, one for processing payoffs and another for processing stimulus information, and that on a given trial, attention switches from one process to the other. The latter hypothesis gave the best account of their data. The present study investigated two questions: (1) Does the experimental setting influence decisions, and consequently affect the fits of the hypotheses? A task was conducted in two experimental settings--either the time limit or the payoff matrix was held constant within a given block of trials, using three different payoff matrices and four different time limits--in order to answer this question. (2) Could it be that participants neglect payoffs on some trials and stimulus information on others? To investigate this idea, a further hypothesis was considered, the mixture-of-processes hypothesis. Like the two-stage-processing hypothesis, it postulates two processes, one for payoffs and another for stimulus information. However, it differs from the previous hypothesis in assuming that on a given trial exactly one of the processes operates, never both. The present design had no effect on choice probability but may have affected choice response times (RTs). Overall, the two-stage-processing hypothesis gave the best account, with respect both to choice probabilities and to observed mean RTs and mean RT patterns within a choice pair.
ERIC Educational Resources Information Center
Mainela-Arnold, Elina; Evans, Julia L.
2014-01-01
This study tested the predictions of the procedural deficit hypothesis by investigating the relationship between sequential statistical learning and two aspects of lexical ability, lexical-phonological and lexical-semantic, in children with and without specific language impairment (SLI). Participants included forty children (ages 8;5-12;3), twenty…
ERIC Educational Resources Information Center
Meng, Yi; Tan, Jing; Li, Jing
2017-01-01
Drawing upon the componential theory of creativity, cognitive evaluation theory and social exchange theory, the study reported in this paper tested a mediating model based on the hypothesis that abusive supervision negatively influences creativity sequentially through leader-member exchange (LMX) and intrinsic motivation. The study employed…
ERIC Educational Resources Information Center
Pedersen, Sara; Vitaro, Frank; Barker, Edward D.; Borge, Anne I. H.
2007-01-01
This study used a sample of 551 children surveyed yearly from ages 6 to 13 to examine the longitudinal associations among early behavior, middle-childhood peer rejection and friendedness, and early-adolescent depressive symptoms, loneliness, and delinquency. The study tested a sequential mediation hypothesis in which (a) behavior problems in the…
Steblay, N; Dysart, J; Fulero, S; Lindsay, R C
2001-10-01
Most police lineups use a simultaneous presentation technique in which eyewitnesses view all lineup members at the same time. Lindsay and Wells (R. C. L. Lindsay & G. L. Wells, 1985) devised an alternative procedure, the sequential lineup, in which witnesses view one lineup member at a time and decide whether or not that person is the perpetrator prior to viewing the next lineup member. The present work uses the technique of meta-analysis to compare the accuracy rates of these presentation styles. Twenty-three papers were located (9 published and 14 unpublished), providing 30 tests of the hypothesis and including 4,145 participants. Results showed that identification of perpetrators from target-present lineups occurs at a higher rate from simultaneous than from sequential lineups. However, this difference largely disappears when moderator variables approximating real world conditions are considered. Also, correct rejection rates were significantly higher for sequential than simultaneous lineups and this difference is maintained or increased by greater approximation to real world conditions. Implications of these findings are discussed.
Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2013-01-01
A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.
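The decision rule of a Wald sequential probability ratio test of the kind referred to here can be sketched as follows. This is an illustrative implementation with the standard Wald thresholds only; the constrained Kalman filter bank described in the abstract is not modeled, and the per-observation log-likelihood-ratio increments are assumed to be supplied by the caller.

```python
import math

def sprt(log_lr_increments, alpha=0.01, beta=0.01):
    """Wald SPRT on a stream of per-observation log-likelihood-ratio
    increments, with false-alarm risk alpha and missed-detection risk beta.

    Accept H1 when the cumulative log-LR crosses log((1-beta)/alpha);
    accept H0 when it falls below log(beta/(1-alpha)).
    Returns ('H0' | 'H1' | 'continue', number of samples consumed).
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    s = 0.0
    n = 0
    for inc in log_lr_increments:
        n += 1
        s += inc
        if s >= upper:
            return 'H1', n
        if s <= lower:
            return 'H0', n
    return 'continue', n  # data exhausted before either boundary was hit
```

With alpha = beta = 0.01 the boundaries are ±log(99) ≈ ±4.6, so a stream of increments of +1 per observation accepts H1 after five samples.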
Sequential-Simultaneous Analysis of Japanese Children's Performance on the Japanese McCarthy.
ERIC Educational Resources Information Center
Ishikuma, Toshinori; And Others
This study explored the hypothesis that Japanese children perform significantly better on simultaneous processing than on sequential processing. The Kaufman Assessment Battery for Children (K-ABC) served as the criterion of the two types of mental processing. Regression equations to predict Sequential and Simultaneous processing from McCarthy…
ERIC Educational Resources Information Center
Osman, Magda; Wilkinson, Leonora; Beigi, Mazda; Castaneda, Cristina Sanchez; Jahanshahi, Marjan
2008-01-01
The striatum is considered to mediate some forms of procedural learning. Complex dynamic control (CDC) tasks involve an individual having to make a series of sequential decisions to achieve a specific outcome (e.g. learning to operate and control a car), and they involve procedural learning. The aim of this study was to test the hypothesis that…
Sequential Analysis: Hypothesis Testing and Changepoint Detection
2014-07-11
it is necessary to estimate in situ the geographical coordinates and other parameters of earthquakes. The standard sensor equipment of a three...components. When an earthquake arises, the sensors begin to record several types of seismic waves (body and surface waves), among which the more important...machines and to increased safety norms. Many structures to be monitored, e.g., civil engineering structures subject to wind and earthquakes, aircraft
Adaptive sequential Bayesian classification using Page's test
NASA Astrophysics Data System (ADS)
Lynch, Robert S., Jr.; Willett, Peter K.
2002-03-01
In this paper, the previously introduced Mean-Field Bayesian Data Reduction Algorithm is extended for adaptive sequential hypothesis testing utilizing Page's test. In general, Page's test is well understood as a method of detecting a permanent change in distribution associated with a sequence of observations. However, the relationship between detecting a change in distribution utilizing Page's test with that of classification and feature fusion is not well understood. Thus, the contribution of this work is based on developing a method of classifying an unlabeled vector of fused features (i.e., detect a change to an active statistical state) as quickly as possible given an acceptable mean time between false alerts. In this case, the developed classification test can be thought of as equivalent to performing a sequential probability ratio test repeatedly until a class is decided, with the lower log-threshold of each test being set to zero and the upper log-threshold being determined by the expected distance between false alerts. It is of interest to estimate the delay (or, related stopping time) to a classification decision (the number of time samples it takes to classify the target), and the mean time between false alerts, as a function of feature selection and fusion by the Mean-Field Bayesian Data Reduction Algorithm. Results are demonstrated by plotting the delay to declaring the target class versus the mean time between false alerts, and are shown using both different numbers of simulated training data and different numbers of relevant features for each class.
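The repeated-SPRT view of Page's test described above can be sketched directly: a CUSUM statistic that reflects at zero (the lower log-threshold of each embedded SPRT) and raises an alarm at an upper threshold set by the acceptable mean time between false alerts. The threshold value and the log-likelihood-ratio increments below are illustrative assumptions; the Mean-Field Bayesian Data Reduction Algorithm itself is not modeled.

```python
def page_cusum(log_lr_increments, threshold):
    """Page's test: a CUSUM that resets at zero, declaring a change in
    distribution (here, a change to the active statistical state) when
    the statistic first exceeds `threshold`.

    Returns the 1-based alarm time, or None if no alarm is raised.
    """
    g = 0.0
    for n, inc in enumerate(log_lr_increments, start=1):
        # Reflect at zero: equivalent to restarting an SPRT whose
        # lower log-threshold is 0.
        g = max(0.0, g + inc)
        if g >= threshold:
            return n
    return None
```

For a stream drawn from the null, the increments drift downward and the statistic hugs zero; once the active state begins, the increments turn positive and the statistic climbs to the threshold, with the threshold trading detection delay against mean time between false alerts.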
Identifying protein complexes in PPI network using non-cooperative sequential game.
Maulik, Ujjwal; Basu, Srinka; Ray, Sumanta
2017-08-21
Identifying protein complexes from protein-protein interaction (PPI) networks is an important and challenging task in computational biology, as it helps in better understanding of cellular mechanisms in various organisms. In this paper we propose a noncooperative sequential game based model for protein complex detection from a PPI network. The key hypothesis is that protein complex formation is driven by a mechanism that eventually optimizes the number of interactions within the complex, leading to a dense subgraph. The hypothesis is drawn from the observed network property named small world. The proposed multi-player game model translates the hypothesis into game strategies. The Nash equilibrium of the game corresponds to a network partition where each protein either belongs to a complex or forms a singleton cluster. We further propose an algorithm to find the Nash equilibrium of the sequential game. Exhaustive experiments on synthetic benchmark and real-life yeast networks evaluate the structural as well as biological significance of the network partitions.
Memory and other properties of multiple test procedures generated by entangled graphs.
Maurer, Willi; Bretz, Frank
2013-05-10
Methods for addressing multiplicity in clinical trials have attracted much attention during the past 20 years. They include the investigation of new classes of multiple test procedures, such as fixed sequence, fallback and gatekeeping procedures. More recently, sequentially rejective graphical test procedures have been introduced to construct and visualize complex multiple test strategies. These methods propagate the local significance level of a rejected null hypothesis to not-yet rejected hypotheses. In the graph defining the test procedure, hypotheses together with their local significance levels are represented by weighted vertices and the propagation rule by weighted directed edges. An algorithm provides the rules for updating the local significance levels and the transition weights after rejecting an individual hypothesis. These graphical procedures have no memory in the sense that the origin of the propagated significance level is ignored in subsequent iterations. However, in some clinical trial applications, memory is desirable to reflect the underlying dependence structure of the study objectives. In such cases, it would allow the further propagation of significance levels to be dependent on their origin and thus reflect the grouped parent-descendant structures of the hypotheses. We will give examples of such situations and show how to induce memory and other properties by convex combination of several individual graphs. The resulting entangled graphs provide an intuitive way to represent the underlying relative importance relationships between the hypotheses, are as easy to perform as the original individual graphs, remain sequentially rejective and control the familywise error rate in the strong sense. Copyright © 2012 John Wiley & Sons, Ltd.
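The update rules of such sequentially rejective graphical procedures (local significance levels on weighted vertices, propagation along weighted directed edges) can be sketched as follows. This is the memoryless base algorithm of Bretz-et-al.-style graphical procedures, not the entangled-graph extension the abstract introduces; the example weights in the usage note encode Holm's procedure for two hypotheses.

```python
def graphical_procedure(p, alpha_weights, G, alpha=0.05):
    """Sequentially rejective graphical multiple test procedure (sketch).

    p             : list of p-values, one per hypothesis
    alpha_weights : fractions of alpha initially assigned to each vertex
    G[i][j]       : transition weight from H_i to H_j (rows sum to <= 1)

    Returns the sorted indices of rejected hypotheses.
    """
    m = len(p)
    a = [w * alpha for w in alpha_weights]  # local significance levels
    G = [row[:] for row in G]               # work on a copy of the graph
    active = set(range(m))
    rejected = set()
    while True:
        candidates = [i for i in active if p[i] <= a[i]]
        if not candidates:
            return sorted(rejected)
        i = candidates[0]
        active.remove(i)
        rejected.add(i)
        # Propagate H_i's level along its outgoing edges, then rewire
        # the remaining graph (standard update rule).
        new_G = [[0.0] * m for _ in range(m)]
        for j in active:
            a[j] += a[i] * G[i][j]
            for k in active:
                if j == k:
                    continue
                denom = 1.0 - G[j][i] * G[i][j]
                new_G[j][k] = (G[j][k] + G[j][i] * G[i][k]) / denom if denom > 0 else 0.0
        for j in active:
            for k in active:
                if j != k:
                    G[j][k] = new_G[j][k]
```

With weights [0.5, 0.5] and edges G = [[0, 1], [1, 0]] this reproduces Holm's procedure: rejecting one hypothesis passes its entire local level to the other.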
[Examination of the hypothesis 'the factors and mechanisms of superiority'].
Sierra-Fitzgerald, O; Quevedo-Caicedo, J; López-Calderón, M G
INTRODUCTION. The hypothesis of Geschwind and Galaburda suggests that specific cognitive superiority arises as a result of an alteration in development of the nervous system. In this article we review the coexistence of superiority and inferiority. PATIENTS AND METHODS. A study was made of six children aged between 6 and 8 years old at the Instituto de Bellas Artes Antonio Maria Valencia in Cali, Colombia, with an educational level between second and third grade of primary school and of medium-low socioeconomic status. The children were considered to have superior musical ability by music experts, which is how the concept of superiority was tested. The concept of inferiority was tested by neuropsychological test scores at least 1.5 SD below normal for the same age. We estimated the perinatal neurological risk in each case. Subsequently the children's general intelligence and specific cognitive abilities were evaluated, in the first case using the WISC-R and MSCA. The neuropsychological profiles were obtained by broad evaluation using a verbal fluency test, a test using counters, the Boston vocabulary test, the Wechsler memory scale, a sequential verbal memory test, a superimposed figures test, the Piaget-Head battery, the Rey-Osterrieth complex figure and the Wisconsin card sorting test. RESULTS. The results showed slight/moderate deficits in practical construction ability and mild defects of memory and conceptual abilities. In general the results supported the hypothesis tested. The mechanisms of superiority proposed in the classical hypothesis mainly involve the contralateral hemisphere; in this study the ipsilateral mechanism was more important.
Effects of musical training on sound pattern processing in high-school students.
Wang, Wenjung; Staffaroni, Laura; Reid, Errold; Steinschneider, Mitchell; Sussman, Elyse
2009-05-01
Recognizing melody in music involves detection of both the pitch intervals and the silence between sequentially presented sounds. This study tested the hypothesis that active musical training in adolescents facilitates the ability to passively detect sequential sound patterns compared to musically non-trained age-matched peers. Twenty adolescents, aged 15-18 years, were divided into groups according to their musical training and current experience. A fixed order tone pattern was presented at various stimulus rates while the electroencephalogram was recorded. The influence of musical training on passive auditory processing of the sound patterns was assessed using components of event-related brain potentials (ERPs). The mismatch negativity (MMN) ERP component was elicited under different stimulus onset asynchrony (SOA) conditions in musicians than in non-musicians, indicating that musically active adolescents were able to detect sound patterns across longer time intervals than their age-matched peers. Musical training facilitates detection of auditory patterns, allowing sequential sound patterns to be recognized automatically over longer time periods than in non-musical counterparts.
Mainela-Arnold, Elina; Evans, Julia L.
2014-01-01
This study tested the predictions of the procedural deficit hypothesis by investigating the relationship between sequential statistical learning and two aspects of lexical ability, lexical-phonological and lexical-semantic, in children with and without specific language impairment (SLI). Participants included 40 children (ages 8;5–12;3), 20 children with SLI and 20 with typical development. Children completed Saffran’s statistical word segmentation task, a lexical-phonological access task (gating task), and a word definition task. Poor statistical learners were also poor at managing lexical-phonological competition during the gating task. However, statistical learning was not a significant predictor of semantic richness in word definitions. The ability to track statistical sequential regularities may be important for learning the inherently sequential structure of lexical-phonology, but not as important for learning lexical-semantic knowledge. Consistent with the procedural/declarative memory distinction, the brain networks associated with the two types of lexical learning are likely to have different learning properties. PMID:23425593
Analysis of membrane fusion as a two-state sequential process: evaluation of the stalk model.
Weinreb, Gabriel; Lentz, Barry R
2007-06-01
We propose a model that accounts for the time courses of PEG-induced fusion of membrane vesicles of varying lipid compositions and sizes. The model assumes that fusion proceeds from an initial, aggregated vesicle state ((A) membrane contact) through two sequential intermediate states (I(1) and I(2)) and then on to a fusion pore state (FP). Using this model, we interpreted data on the fusion of seven different vesicle systems. We found that the initial aggregated state involved no lipid or content mixing but did produce leakage. The final state (FP) was not leaky. Lipid mixing normally dominated the first intermediate state (I(1)), but content mixing signal was also observed in this state for most systems. The second intermediate state (I(2)) exhibited both lipid and content mixing signals and leakage, and was sometimes the only leaky state. In some systems, the first and second intermediates were indistinguishable and converted directly to the FP state. Having also tested a parallel, two-intermediate model subject to different assumptions about the nature of the intermediates, we conclude that a sequential, two-intermediate model is the simplest model sufficient to describe PEG-mediated fusion in all vesicle systems studied. We conclude as well that a fusion intermediate "state" should not be thought of as a fixed structure (e.g., "stalk" or "transmembrane contact") of uniform properties. Rather, a fusion "state" describes an ensemble of similar structures that can have different mechanical properties. Thus, a "state" can have varying probabilities of having a given functional property such as content mixing, lipid mixing, or leakage. Our data show that the content mixing signal may occur through two processes, one correlated and one not correlated with leakage. Finally, we consider the implications of our results in terms of the "modified stalk" hypothesis for the mechanism of lipid pore formation. 
We conclude that our results not only support this hypothesis but also provide a means of analyzing fusion time courses so as to test it and gauge the mechanism of action of fusion proteins in the context of the lipidic hypothesis of fusion.
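A generic sequential two-intermediate scheme A → I1 → I2 → FP of the kind evaluated above can be sketched with first-order mass-action kinetics. The first-order form, the rate constants, and the Euler integrator are illustrative assumptions, not the paper's fitted model, which also tracks lipid mixing, content mixing, and leakage signals per state.

```python
def sequential_kinetics(k1, k2, k3, t_end, dt=1e-3):
    """Euler integration of the sequential scheme A -> I1 -> I2 -> FP
    with first-order rate constants k1, k2, k3 (illustrative values).

    Starts with all vesicle pairs in the aggregated state A and returns
    the final fractions (A, I1, I2, FP).
    """
    A, I1, I2, FP = 1.0, 0.0, 0.0, 0.0
    steps = round(t_end / dt)
    for _ in range(steps):
        # Mass-action fluxes between consecutive states.
        dA = -k1 * A
        dI1 = k1 * A - k2 * I1
        dI2 = k2 * I1 - k3 * I2
        dFP = k3 * I2
        A += dA * dt
        I1 += dI1 * dt
        I2 += dI2 * dt
        FP += dFP * dt
    return A, I1, I2, FP
```

Because the fluxes sum to zero, total mass is conserved at every step; with all rates equal, the population flows through the two intermediates and accumulates in the fusion pore state at long times.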
ERIC Educational Resources Information Center
Shaughnessy, M.; And Others
Numerous cognitive psychologists have validated the hypothesis, originally advanced by the Russian physician, A. Luria, that different individuals process information in two distinctly different manners: simultaneously and sequentially. The importance of recognizing the existence of these two distinct styles of processing information and selecting…
HMM Sequential Hypothesis Tests for Intrusion Detection in MANETs Extended Abstract
2003-01-01
securing the routing protocols of mobile ad hoc wireless networks has been done in prevention. Intrusion detection systems play a complementary...hops of A would be unable to communicate with B and vice versa [1]. 1.2 The role of intrusion detection in security In order to provide reliable
2015-10-18
statistical assessment of geosynchronous satellite status based on non-resolved photometry data," AMOS Technical Conference, 2014 2. Wald, A...the seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data...work on the decision to move the time slider [1], which is required in order to update the baseline signature (brightness) data for a satellite
A comparative analysis of sex change in Labridae supports the size advantage hypothesis.
Kazancioğlu, Erem; Alonzo, Suzanne H
2010-08-01
The size advantage hypothesis (SAH) predicts that the rate of increase in male and female fitness with size (the size advantage) drives the evolution of sequential hermaphroditism or sex change. Despite qualitative agreement between empirical patterns and SAH, only one comparative study tested SAH quantitatively. Here, we perform the first comparative analysis of sex change in Labridae, a group of hermaphroditic and dioecious (non-sex changer) fish with several model sex-changing species. We also estimate, for the first time, rates of evolutionary transitions between sex change and dioecy. Our analyses support SAH and indicate that the evolution of hermaphroditism is correlated to the size advantage. Furthermore, we find that transitions from sex change to dioecy are less likely under stronger size advantage. We cannot determine, however, how the size advantage affects transitions from dioecy to sex change. Finally, contrary to what is generally expected, we find that transitions from dioecy to sex change are more likely than transitions from sex change to dioecy. The similarity of sexual differentiation in hermaphroditic and dioecious labrids might underlie this pattern. We suggest that elucidating the developmental basis of sex change is critical to predict and explain patterns of the evolutionary history of sequential hermaphroditism.
Bursts and heavy tails in temporal and sequential dynamics of foraging decisions.
Jung, Kanghoon; Jang, Hyeran; Kralik, Jerald D; Jeong, Jaeseung
2014-08-01
A fundamental understanding of behavior requires predicting when and what an individual will choose. However, the actual temporal and sequential dynamics of successive choices made among multiple alternatives remain unclear. In the current study, we tested the hypothesis that there is a general bursting property in both the timing and sequential patterns of foraging decisions. We conducted a foraging experiment in which rats chose among four different foods over a continuous two-week time period. Regarding when choices were made, we found bursts of rapidly occurring actions, separated by time-varying inactive periods, partially based on a circadian rhythm. Regarding what was chosen, we found sequential dynamics in affective choices characterized by two key features: (a) a highly biased choice distribution; and (b) preferential attachment, in which the animals were more likely to choose what they had previously chosen. To capture the temporal dynamics, we propose a dual-state model consisting of active and inactive states. We also introduce a satiation-attainment process for bursty activity, and a non-homogeneous Poisson process for longer inactivity between bursts. For the sequential dynamics, we propose a dual-control model consisting of goal-directed and habit systems, based on outcome valuation and choice history, respectively. This study provides insights into how the bursty nature of behavior emerges from the interaction of different underlying systems, leading to heavy tails in the distribution of behavior over time and choices.
2011-05-24
of community similarity (Legendre and Legendre 1998). Permutational Multivariate Analysis of Variance (PerMANOVA) (McArdle...null hypothesis can be rejected with a type I error rate of α. We used an implementation of PerMANOVA that involved sequential removal...TEXTURE, and HABITAT. The null distribution for PerMANOVA tests for site-scale effects was generated using a restricted
An investigation of the lag between the start of research and the development of new technology
NASA Technical Reports Server (NTRS)
Glass, S. E.
1982-01-01
The lag which occurs between the start of NASA-sponsored research and the development of new technology is addressed. A possible common gestation period is examined. The lags vary from zero to one year. The observed lag as it relates to patent applications is shorter than the lag as it relates to invention disclosures. The sequential hypothesis testing showed that invention disclosures correlated better with the measures of research effort used than did patent applications.
Rossitto, Giacomo; Battistel, Michele; Barbiero, Giulio; Bisogni, Valeria; Maiolino, Giuseppe; Diego, Miotto; Seccia, Teresa M; Rossi, Gian Paolo
2018-02-01
The pulsatile secretion of adrenocortical hormones and a stress reaction occurring when starting adrenal vein sampling (AVS) can affect the selectivity and also the assessment of lateralization when sequential blood sampling is used. We therefore tested the hypothesis that a simulated sequential blood sampling could decrease the diagnostic accuracy of lateralization index for identification of aldosterone-producing adenoma (APA), as compared with bilaterally simultaneous AVS. In 138 consecutive patients who underwent subtyping of primary aldosteronism, we compared the results obtained simultaneously bilaterally when starting AVS (t-15) and 15 min after (t0), with those gained with a simulated sequential right-to-left AVS technique (R ⇒ L) created by combining hormonal values obtained at t-15 and at t0. The concordance between simultaneously obtained values at t-15 and t0, and between simultaneously obtained values and values gained with a sequential R ⇒ L technique, was also assessed. We found a marked interindividual variability of lateralization index values in the patients with bilaterally selective AVS at both time points. However, overall the lateralization index simultaneously determined at t0 provided a more accurate identification of APA than the simulated sequential lateralization index R ⇒ L (P = 0.001). Moreover, regardless of which side was sampled first, the sequential AVS technique induced a sequence-dependent overestimation of lateralization index. While in APA patients the concordance between simultaneous AVS at t0 and t-15 and between simultaneous t0 and the sequential technique was moderate-to-good (K = 0.55 and 0.66, respectively), in non-APA patients it was poor (K = 0.12 and 0.13, respectively). Sequential AVS generates factitious between-sides gradients, which lower its diagnostic accuracy, likely because of the stress reaction arising upon starting AVS.
A field test of three LQAS designs to assess the prevalence of acute malnutrition.
Deitchler, Megan; Valadez, Joseph J; Egge, Kari; Fernandez, Soledad; Hennigan, Mary
2007-08-01
The conventional method for assessing the prevalence of Global Acute Malnutrition (GAM) in emergency settings is the 30 x 30 cluster-survey. This study describes alternative approaches: three Lot Quality Assurance Sampling (LQAS) designs to assess GAM. The LQAS designs were field-tested and their results compared with those from a 30 x 30 cluster-survey. Computer simulations confirmed that small clusters instead of a simple random sample could be used for LQAS assessments of GAM. Three LQAS designs were developed (33 x 6, 67 x 3, Sequential design) to assess GAM thresholds of 10, 15 and 20%. The designs were field-tested simultaneously with a 30 x 30 cluster-survey in Siraro, Ethiopia during June 2003. Using a nested study design, anthropometric, morbidity and vaccination data were collected on all children 6-59 months in sampled households. Hypothesis tests about GAM thresholds were conducted for each LQAS design. Point estimates were obtained for the 30 x 30 cluster-survey and the 33 x 6 and 67 x 3 LQAS designs. Hypothesis tests showed GAM as <10% for the 33 x 6 design and GAM as ≥10% for the 67 x 3 and Sequential designs. Point estimates for the 33 x 6 and 67 x 3 designs were similar to those of the 30 x 30 cluster-survey for GAM (6.7%, CI = 3.2-10.2%; 8.2%, CI = 4.3-12.1%; 7.4%, CI = 4.8-9.9%) and all other indicators. The CIs for the LQAS designs were only slightly wider than the CIs for the 30 x 30 cluster-survey; yet the LQAS designs required substantially less time to administer. The LQAS designs provide statistically appropriate alternatives to the more time-consuming 30 x 30 cluster-survey. However, additional field-testing is needed using independent samples rather than a nested study design.
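The core LQAS construction (for a sample of size n, choose the smallest decision rule d such that a population below the prevalence threshold is unlikely to produce d or more cases) can be sketched with a plain binomial model. This simple version ignores the clustering adjustments that the 33 x 6 and 67 x 3 designs require; the sample sizes in the usage note are illustrative, not the paper's.

```python
from math import comb

def binom_tail(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

def lqas_decision_rule(n, p_threshold, max_alpha=0.10):
    """Smallest decision rule d such that, if the true prevalence is at
    p_threshold, the probability of observing >= d cases is <= max_alpha.

    Observing >= d cases in the sample then supports classifying the
    area as at or above the prevalence threshold.
    """
    for d in range(n + 1):
        if binom_tail(n, d, p_threshold) <= max_alpha:
            return d
    return n + 1  # no rule achieves the target error at this sample size
```

For instance, with n = 20 and a 10% threshold, 5 or more malnourished children are needed before the tail probability under p = 0.10 drops to or below 0.10.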
Bursts and Heavy Tails in Temporal and Sequential Dynamics of Foraging Decisions
Jung, Kanghoon; Jang, Hyeran; Kralik, Jerald D.; Jeong, Jaeseung
2014-01-01
A fundamental understanding of behavior requires predicting when and what an individual will choose. However, the actual temporal and sequential dynamics of successive choices made among multiple alternatives remain unclear. In the current study, we tested the hypothesis that there is a general bursting property in both the timing and sequential patterns of foraging decisions. We conducted a foraging experiment in which rats chose among four different foods over a continuous two-week time period. Regarding when choices were made, we found bursts of rapidly occurring actions, separated by time-varying inactive periods, partially based on a circadian rhythm. Regarding what was chosen, we found sequential dynamics in affective choices characterized by two key features: (a) a highly biased choice distribution; and (b) preferential attachment, in which the animals were more likely to choose what they had previously chosen. To capture the temporal dynamics, we propose a dual-state model consisting of active and inactive states. We also introduce a satiation-attainment process for bursty activity, and a non-homogeneous Poisson process for longer inactivity between bursts. For the sequential dynamics, we propose a dual-control model consisting of goal-directed and habit systems, based on outcome valuation and choice history, respectively. This study provides insights into how the bursty nature of behavior emerges from the interaction of different underlying systems, leading to heavy tails in the distribution of behavior over time and choices. PMID:25122498
No place to hide: when shame causes proselfs to cooperate.
Declerck, Carolyn Henriette; Boone, Christophe; Kiyonari, Toko
2014-01-01
Shame is considered a social emotion with action tendencies that elicit socially beneficial behavior. Yet, unlike other social emotions, prior experimental studies do not indicate that incidental shame boosts prosocial behavior. Based on the affect as information theory, we hypothesize that incidental feelings of shame can increase cooperation, but only for self-interested individuals, and only in a context where shame is relevant with regards to its action tendency. To test this hypothesis, cooperation levels are compared between a simultaneous prisoner's dilemma (where "defect" may result from multiple motives) and a sequential prisoner's dilemma (where "second player defect" is the result of intentional greediness). As hypothesized, shame positively affected proselfs in a sequential prisoner's dilemma. Hence ashamed proselfs become inclined to cooperate when they believe they have no way to hide their greediness, and not necessarily because they want to make up for earlier wrong-doing.
Silva, Ivair R
2018-01-15
Type I error probability spending functions are commonly used for designing sequential analysis of binomial data in clinical trials, and they are also quickly emerging for near-continuous sequential analysis in post-market drug and vaccine safety surveillance. It is well known that, for clinical trials, it is important to minimize the sample size when the null hypothesis is not rejected. In post-market drug and vaccine safety surveillance, by contrast, that is not the priority: especially when the surveillance involves identification of potential signals, the meaningful statistical performance measure to be minimized is the expected sample size when the null hypothesis is rejected. The present paper shows that, instead of the convex Type I error spending shape conventionally used in clinical trials, a concave shape is more indicated for post-market drug and vaccine safety surveillance. This is shown for both continuous and group sequential analysis. Copyright © 2017 John Wiley & Sons, Ltd.
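The convex versus concave spending shapes contrasted above can be illustrated with two standard alpha-spending families, each mapping the information fraction t in [0, 1] to the cumulative Type I error spent. The Pocock-type logarithmic form is a standard choice; using a cubic power-family curve as a stand-in for an O'Brien-Fleming-like convex shape is an illustrative assumption.

```python
import math

def pocock_spending(t, alpha=0.05):
    """Pocock-type (concave) spending function: spends error early,
    which suits surveillance aiming at fast signal detection."""
    return alpha * math.log(1 + (math.e - 1) * t)

def power_family_spending(t, alpha=0.05, rho=3):
    """Convex power-family spending, alpha * t**rho, which spends error
    late (O'Brien-Fleming-like behaviour for rho around 3)."""
    return alpha * t ** rho
```

Both functions spend nothing at t = 0 and exactly alpha at t = 1; at any interim fraction the concave curve has spent more, e.g. at t = 0.5 the Pocock-type form has spent about 0.031 of a 0.05 budget versus about 0.006 for the convex curve.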
Wade, P.R.; Burkanov, V.N.; Dahlheim, M.E.; Friday, N.A.; Fritz, L.W.; Loughlin, Thomas R.; Mizroch, S.A.; Muto, M.M.; Rice, D.W.; Barrett-Lennard, L. G.; Black, N.A.; Burdin, A.M.; Calambokidis, J.; Cerchio, S.; Ford, J.K.B.; Jacobsen, J.K.; Matkin, C.O.; Matkin, D.R.; Mehta, A.V.; Small, R.J.; Straley, J.M.; McCluskey, S.M.; VanBlaricom, G.R.; Clapham, P.J.
2007-01-01
Springer et al. (2003) contend that sequential declines occurred in North Pacific populations of harbor and fur seals, Steller sea lions, and sea otters. They hypothesize that these were due to increased predation by killer whales, when industrial whaling's removal of large whales as a supposed primary food source precipitated a prey switch. Using a regional approach, we reexamined whale catch data, killer whale predation observations, and the current biomass and trends of potential prey, and found little support for the prey-switching hypothesis. Large whale biomass in the Bering Sea did not decline as much as suggested by Springer et al., and much of the reduction occurred 50-100 yr ago, well before the declines of pinnipeds and sea otters began; thus, the need to switch prey starting in the 1970s is doubtful. With the sole exception that the sea otter decline followed the decline of pinnipeds, the reported declines were not in fact sequential. Given this, it is unlikely that a sequential megafaunal collapse from whales to sea otters occurred. The spatial and temporal patterns of pinniped and sea otter population trends are more complex than Springer et al. suggest, and are often inconsistent with their hypothesis. Populations remained stable or increased in many areas, despite extensive historical whaling and high killer whale abundance. Furthermore, observed killer whale predation has largely involved pinnipeds and small cetaceans; there is little evidence that large whales were ever a major prey item in high latitudes. Small cetaceans (ignored by Springer et al.) were likely abundant throughout the period. Overall, we suggest that the Springer et al. hypothesis represents a misleading and simplistic view of events and trophic relationships within this complex marine ecosystem. © 2007 by the Society for Marine Mammalogy.
Fitts, Douglas A
2017-09-21
The variable criteria sequential stopping rule (vcSSR) is an efficient way to add sample size to planned ANOVA tests while holding the observed rate of Type I errors, α_o, constant. The only difference from regular null hypothesis testing is that criteria for stopping the experiment are obtained from a table based on the desired power, rate of Type I errors, and beginning sample size. The vcSSR was developed using between-subjects ANOVAs, but it should work with p values from any type of F test. In the present study, α_o remained constant at the nominal level when using the previously published table of criteria with repeated measures designs with various numbers of treatments per subject, Type I error rates, values of ρ, and four different sample size models. New power curves allow researchers to select the optimal sample size model for a repeated measures experiment. The criteria held α_o constant either when used with a multiple correlation that varied the sample size model and the number of predictor variables, or when used with MANOVA with multiple groups and two levels of a within-subject variable at various levels of ρ. Although not recommended for use with χ² tests such as the Friedman rank ANOVA test, the vcSSR produces predictable results based on the relation between F and χ². Together, the data confirm the view that the vcSSR can be used to control Type I errors during sequential sampling with any t- or F-statistic rather than being restricted to certain ANOVA designs.
Extended target recognition in cognitive radar networks.
Wei, Yimin; Meng, Huadong; Liu, Yimin; Wang, Xiqin
2010-01-01
We address the problem of adaptive waveform design for extended target recognition in cognitive radar networks. A closed-loop active target recognition radar system is extended to the case of a centralized cognitive radar network, in which a generalized likelihood ratio (GLR) based sequential hypothesis testing (SHT) framework is employed. Using Doppler velocities measured by multiple radars, the target aspect angle for each radar is calculated. The joint probability of each target hypothesis is then updated using observations from different radar lines of sight (LOS). Based on these probabilities, a minimum correlation algorithm is proposed to adaptively design the transmit waveform for each radar in an amplitude fluctuation situation. Simulation results demonstrate performance improvements due to the cognitive radar network and adaptive waveform design. Our minimum correlation algorithm outperforms the eigen-waveform solution and other non-cognitive waveform design approaches.
Delay test generation for synchronous sequential circuits
NASA Astrophysics Data System (ADS)
Devadas, Srinivas
1989-05-01
We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan, and synthesis algorithms are presented.
Sequential Testing: Basics and Benefits
1978-03-01
TARADCOM TECHNICAL REPORT NO. 12325, Sequential Testing: Basics and Benefits. Contents: I. Introduction and Summary; II. Sequential Analysis; III. Mathematics of Sequential Testing; IV. … The added benefit of reduced energy needs is inherent in this testing method. The text was originally released by the authors in 1972.
Shifflett, Benjamin; Huang, Rong; Edland, Steven D
2017-01-01
Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model-independent genotypic χ² test, the efficiency robust MAX statistic, which corrects for multiple comparisons but with some loss of power, or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but with some loss of power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ² and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
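As an illustrative aside, the Armitage test for multiplicative trend discussed above has a simple closed form. The sketch below implements the textbook Cochran-Armitage trend statistic with scores (0, 1, 2) for allele copies; it is not the authors' simulation code, and the genotype counts are invented for illustration.

```python
import math

def cochran_armitage_z(cases, controls, scores=(0, 1, 2)):
    """Textbook Cochran-Armitage trend statistic for a 2 x k table.
    cases/controls are per-genotype counts; returns the z statistic."""
    r1, r2 = sum(cases), sum(controls)
    n = r1 + r2
    cols = [a + b for a, b in zip(cases, controls)]
    # trend statistic: score-weighted contrast of case vs. control counts
    t = sum(s * (a * r2 - b * r1) for s, a, b in zip(scores, cases, controls))
    # null variance of the trend statistic
    var = (r1 * r2 / n) * (
        sum(s * s * c * (n - c) for s, c in zip(scores, cols))
        - 2 * sum(
            scores[i] * scores[j] * cols[i] * cols[j]
            for i in range(len(cols))
            for j in range(i + 1, len(cols))
        )
    )
    return t / math.sqrt(var)

# hypothetical counts with risk increasing per allele copy
z = cochran_armitage_z(cases=(10, 20, 30), controls=(30, 20, 10))
print(round(z, 3))  # → 4.472
```

A positive z indicates risk rising with allele dose; when cases and controls have identical genotype distributions the statistic is exactly zero, matching the null.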
Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie
2016-01-01
The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
Decision making and sequential sampling from memory
Shadlen, Michael N.; Shohamy, Daphna
2016-01-01
Decisions take time, and as a rule more difficult decisions take more time. But this only raises the question of what consumes the time. For decisions informed by a sequence of samples of evidence, the answer is straightforward: more samples are available with more time. Indeed the speed and accuracy of such decisions are explained by the accumulation of evidence to a threshold or bound. However, the same framework seems to apply to decisions that are not obviously informed by sequences of evidence samples. Here we proffer the hypothesis that the sequential character of such tasks involves retrieval of evidence from memory. We explore this hypothesis by focusing on value-based decisions and argue that mnemonic processes can account for regularities in choice and decision time. We speculate on the neural mechanisms that link sampling of evidence from memory to circuits that represent the accumulated evidence bearing on a choice. We propose that memory processes may contribute to a wider class of decisions that conform to the regularities of choice-reaction time predicted by the sequential sampling framework. PMID:27253447
Spatial memory and integration processes in congenital blindness.
Vecchi, Tomaso; Tinti, Carla; Cornoldi, Cesare
2004-12-22
The paper tests the hypothesis that difficulties met by the blind in spatial processing are due to the simultaneous treatment of independent spatial representations. Results showed that lack of vision does not impede the ability to process and transform mental images; however, blind people are significantly poorer in the recall of more than a single spatial pattern at a time than in the recall of the corresponding material integrated into a single pattern. It is concluded that the simultaneous maintenance of different spatial information is affected by congenital blindness, while cognitive processes that may involve sequential manipulation are not.
Dinning, Phil G; Wiklendt, Lukasz; Omari, Taher; Arkwright, John W; Spencer, Nick J; Brookes, Simon J H; Costa, Marcello
2014-01-01
Propulsive contractions of circular muscle are largely responsible for the movements of content along the digestive tract. Mechanical and electrophysiological recordings of isolated colonic circular muscle have demonstrated that localized distension activates ascending and descending interneuronal pathways, evoking contraction orally and relaxation anally. These polarized enteric reflex pathways can theoretically be sequentially activated by the mechanical stimulation of the advancing contents. Here, we test the hypothesis that initiation and propagation of peristaltic contractions involves a neuromechanical loop; that is, an initial gut distension activates local and oral reflex contraction and anal reflex relaxation, and the subsequent movement of content then acts as a new mechanical stimulus, sequentially triggering reflex contractions/relaxations at each point of the gut and resulting in a propulsive peristaltic contraction. In fluid-filled isolated rabbit distal colon, we combined spatiotemporal mapping of gut diameter and intraluminal pressure with a new analytical method, allowing us to identify when and where active (neurally driven) contraction or relaxation occurs. Our data indicate that gut dilation is associated with propagating peristaltic contractions, and that the associated level of dilation is greater than that preceding non-propagating contractions (2.7 ± 1.4 mm vs. 1.6 ± 1.2 mm; P < 0.0001). These propagating contractions lead to the formation of boluses that are propelled by oral active neurally driven contractions. The propelled boluses also activate neurally driven anal relaxations, in a diameter-dependent manner. These data support the hypothesis that neural peristalsis is the consequence of the activation of a functional loop involving mechanical dilation which activates polarized enteric circuits. These produce propulsion of the bolus, which by distension activates polarized enteric circuits further anally, thus closing the neuromechanical loop.
NASA Astrophysics Data System (ADS)
Yin, Kedong; Liu, Hao; Harrison, Paul J.
2017-05-01
We hypothesize that phytoplankton have a sequential nutrient uptake strategy to maintain nutrient stoichiometry and high primary productivity in the water column. According to this hypothesis, phytoplankton take up the most limiting nutrient first until depletion, continue to draw down non-limiting nutrients, and then take up the most limiting nutrient rapidly when it is available. These processes would result in the variation of ambient nutrient ratios in the water column around the Redfield ratio. We used high-resolution continuous vertical profiles of nutrients, nutrient ratios and on-board ship incubation experiments to test this hypothesis in the Strait of Georgia. At the surface in summer, ambient NO₃⁻ was depleted with excess PO₄³⁻ and SiO₄⁴⁻ remaining, and as a result, both N : P and N : Si ratios were low. The two ratios increased to about 10 : 1 and 0.45 : 1, respectively, at 20 m. Time series of vertical profiles showed that the leftover PO₄³⁻ continued to be removed, resulting in additional phosphorus storage by phytoplankton. The N : P ratios at the nutricline in vertical profiles responded differently to mixing events. Field incubation of seawater samples also demonstrated the sequential uptake of NO₃⁻ (the most limiting nutrient) and then PO₄³⁻ and SiO₄⁴⁻ (the non-limiting nutrients). This sequential uptake strategy allows phytoplankton to acquire additional cellular phosphorus and silicon when they are available and wait for nitrogen to become available through frequent mixing of NO₃⁻ (or pulsed regenerated NH₄⁺). Thus, phytoplankton are able to maintain high productivity and balance nutrient stoichiometry by taking advantage of vigorous mixing regimes with the capacity of stoichiometric plasticity.
To our knowledge, this is the first study to show the in situ dynamics of continuous vertical profiles of N : P and N : Si ratios, which can provide insight into the in situ dynamics of nutrient stoichiometry in the water column and the inference of the transient status of phytoplankton nutrient stoichiometry in the coastal ocean.
A Rejection Principle for Sequential Tests of Multiple Hypotheses Controlling Familywise Error Rates
BARTROFF, JAY; SONG, JINLIN
2015-01-01
We present a unifying approach to multiple testing procedures for sequential (or streaming) data by giving sufficient conditions for a sequential multiple testing procedure to control the familywise error rate (FWER). Together we call these conditions a “rejection principle for sequential tests,” which we then apply to some existing sequential multiple testing procedures to give simplified understanding of their FWER control. Next the principle is applied to derive two new sequential multiple testing procedures with provable FWER control, one for testing hypotheses in order and another for closed testing. Examples of these new procedures are given by applying them to a chromosome aberration data set and to finding the maximum safe dose of a treatment. PMID:26985125
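For readers unfamiliar with the fixed-sample baseline these sequential procedures generalize, Holm's (1979) step-down procedure can be sketched in a few lines. This is the classical fixed-sample procedure only, not the paper's sequential rejection principle, and the p-values below are invented for illustration.

```python
def holm_reject(p_values, alpha=0.05):
    """Holm's (1979) step-down procedure: compare the i-th smallest
    p-value to alpha/(m - i); stop at the first failure.
    Returns the indices (into the input) of rejected hypotheses."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    rejected = set()
    for rank, idx in enumerate(order):
        if p_values[idx] <= alpha / (m - rank):
            rejected.add(idx)
        else:
            break  # step-down: once one test fails, all larger p-values fail
    return rejected

# hypothetical p-values for four hypotheses
print(sorted(holm_reject([0.001, 0.2, 0.015, 0.04])))  # → [0, 2]
```

With these values Holm rejects two hypotheses, whereas plain Bonferroni (fixed threshold 0.05/4 = 0.0125) rejects only the first; this is the sense in which Holm is uniformly less conservative while still controlling the FWER.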
Clos, Mareike; Sommer, Tobias; Schneider, Signe L; Rose, Michael
2018-01-01
During incidental learning, statistical regularities are extracted from the environment without the intention to learn. Acquired implicit memory of these regularities can affect behavior in the absence of awareness. However, conscious insight into the underlying regularities can also develop during learning. Such emergence of explicit memory is an important learning mechanism that is assumed to involve prediction errors in the striatum and to be dopamine-dependent. Here we directly tested this hypothesis by manipulating dopamine levels during incidental learning in a modified serial reaction time task (SRTT) featuring a hidden regular sequence of motor responses in a placebo-controlled between-group study. Awareness of the sequential regularity was subsequently assessed using cued generation and additionally verified using free recall. The results demonstrated that dopaminergic modulation nearly doubled the amount of explicit sequence knowledge that emerged during learning in comparison to the placebo group. This strong effect clearly argues for a causal role of dopamine-dependent processing in the development of awareness of sequential regularities during learning.
A model of the human observer and decision maker
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1981-01-01
The decision process is described in terms of classical sequential decision theory by considering the hypothesis that an abnormal condition has occurred by means of a generalized likelihood ratio test. For this, a sufficient statistic is provided by the innovation sequence, which is the result of the perception and information-processing submodel of the human observer. On the basis of only two model parameters, the model predicts the decision speed/accuracy trade-off and various attentional characteristics. A preliminary test of the model for single-variable failure detection tasks resulted in a very good fit of the experimental data. In a formal validation program, a variety of multivariable failure detection tasks was investigated and the predictive capability of the model was demonstrated.
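The sequential decision step described above builds on Wald's sequential probability ratio test. As background, a minimal sketch of the classical SPRT for a known-variance Gaussian mean (not the paper's observer/innovation model) might look like this; the thresholds follow Wald's approximations log((1−β)/α) and log(β/(1−α)).

```python
import math

def sprt(stream, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Classical Wald SPRT for H0: mean mu0 vs. H1: mean mu1,
    Gaussian samples with known sigma. Returns (decision, n)."""
    upper = math.log((1 - beta) / alpha)   # crossing -> accept H1
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0
    llr = 0.0
    for n, x in enumerate(stream, start=1):
        # per-sample log-likelihood-ratio increment
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(stream)

# deterministic toy stream at the H1 mean: each sample adds 0.5 to the LLR,
# so the upper boundary log(19) ≈ 2.944 is crossed at the 6th sample
print(sprt([1.0] * 10, mu0=0.0, mu1=1.0, sigma=1.0))  # → ('H1', 6)
```

The stopping time adapts to the evidence: samples near the H1 mean drive the log-likelihood ratio up by a constant drift, so easy decisions terminate early, which is the expected-sample-size advantage sequential tests offer over fixed-sample designs.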
Sedek, G; Kofta, M
1990-04-01
This study tested a new information-processing explanation of learned helplessness that proposes that an uncontrollable situation produces helplessness symptoms because it is a source of inconsistent, self-contradictory task information during problem-solving attempts. The flow of such information makes hypothesis-testing activity futile. Prolonged and inefficient activity of this kind leads in turn to the emergence of a state of cognitive exhaustion, with accompanying performance deficits. In 3 experiments, Ss underwent informational helplessness training (IHT): They were sequentially exposed to inconsistent task information during discrimination problems. As predicted, IHT was associated with subjective symptoms of irreducible uncertainty and resulted in (a) performance deterioration on subsequent avoidance learning, (b) heightened negative mood, and (c) subjective symptoms of cognitive exhaustion.
Out with the old? The role of selective attention in retaining targets in partial report.
Lindsey, Dakota R B; Bundesen, Claus; Kyllingsbæk, Søren; Petersen, Anders; Logan, Gordon D
2017-01-01
In the partial-report task, subjects are asked to report only a portion of the items presented. Selective attention chooses which objects to represent in short-term memory (STM) on the basis of their relevance. Because STM is limited in capacity, one must sometimes choose which objects are removed from memory in light of new relevant information. We tested the hypothesis that the choices among newly presented information and old information in STM involve the same process-that both are acts of selective attention. We tested this hypothesis using a two-display partial-report procedure. In this procedure, subjects had to select and retain relevant letters (targets) from two sequentially presented displays. If selection in perception and retention in STM are the same process, then irrelevant letters (distractors) in the second display, which demanded attention because of their similarity to the targets, should have decreased target report from the first display. This effect was not obtained in any of four experiments. Thus, choosing objects to keep in STM is not the same process as choosing new objects to bring into STM.
Schott, Nadja; El-Rajab, Inaam; Klotzbier, Thomas
2016-10-01
While typically developing children produce relatively automatized postural control processes, children with DCD seem to exhibit an automatization deficit. Dual tasks with various cognitive loads seem to be an effective way to assess the automatization deficit hypothesis. The aims of the study were: (1) to examine the effect of a concurrent cognitive task on fine and gross motor tasks in children with DCD, and (2) to determine whether the effect varied with different difficulty levels of the concurrent task. We examined dual-task performance (Trail-Making-Test, Trail-Walking-Test) in 20 children with DCD and 39 typically developing children. Based on the idea of the Trail-Making-Test, participants walked along a fixed pathway, following a prescribed path delineated by target markers of (1) increasing sequential numbers, and (2) increasing sequential numbers and letters. The motor and cognitive dual-task effects (DTE) were calculated for each task. Regardless of the cognitive task, children with DCD performed equally well on fine and gross motor tasks, and were slower under dual-task conditions than under single-task conditions, compared with children without DCD. Increased cognitive task complexity resulted in slower trail walking as well as slower trail tracing. The motor interference for the gross motor tasks was least for the simplest conditions, greatest for the complex conditions, and more pronounced in children with DCD. Cognitive interference was low irrespective of the motor task. Children with DCD show a different approach to the allocation of cognitive resources, and have difficulties making motor skills automatic. The latter notion is consistent with impaired cerebellar function and the "automatization deficit hypothesis", suggesting that any deficit in the automatization process will appear if conscious monitoring of the motor skill is made more difficult by integrating another task requiring attentional resources. Copyright © 2016 Elsevier Ltd. All rights reserved.
Some controversial multiple testing problems in regulatory applications.
Hung, H M James; Wang, Sue-Jane
2009-01-01
Multiple testing problems in regulatory applications are often more challenging than the problems of handling a set of mathematical symbols representing multiple null hypotheses under testing. In the union-intersection setting, it is important to define a family of null hypotheses relevant to the clinical questions at issue. The distinction between primary and secondary endpoints needs to be considered properly in different clinical applications. Without proper consideration, the widely used sequential gatekeeping strategies often impose too many logical restrictions to make sense, particularly in the problem of testing multiple doses and multiple endpoints, the problem of testing a composite endpoint and its component endpoints, and the problem of testing superiority and noninferiority in the presence of multiple endpoints. Partitioning the null hypotheses involved in closed testing into clinically relevant orderings or sets can be a viable alternative for resolving these illogical problems, but it requires more attention from clinical trialists in defining the clinical hypotheses or clinical question(s) at the design stage. In the intersection-union setting there is little room for alleviating the stringency of the requirement that each endpoint must meet the same intended alpha level, unless the parameter space under the null hypothesis can be substantially restricted. Such restriction often requires insurmountable justification and usually cannot be supported by the internal data. Thus, a possible remedial approach to alleviate the possible conservatism resulting from this requirement is a group-sequential design strategy that starts with a conservative sample size planning and then utilizes an alpha spending function to possibly reach the conclusion early.
Characterization of pH-fractionated humic acids with respect to their dissociation behaviour.
Klučáková, Martina
2016-04-01
Humic acids were divided into several fractions using buffer solutions with different pH values as extraction agents. Two methods of fractionation were used. The first was subsequent dissolution of bulk humic acids in buffers adjusted to different pH; the second was sequential dissolution in buffers with increasing pH values. Experimental data were compared with the hypothesis of partial solubility of humic acids in aqueous solutions. The behaviour of the humic fractions obtained by sequential dissolution, of the original bulk sample, and of the residual fractions obtained by subsequent dissolution at pH 10 and 12 agrees with the hypothesis. The results demonstrate that, regardless of the common mechanism, the solubility and dissociation degree of the various humic fractions may be very different and can be estimated using parameters of the model based on the proposed mechanism. These results suggest that the dissolving of solid humic acids in a water environment is more complex than the conventional solubility behaviour of sparingly soluble solids.
NASA Astrophysics Data System (ADS)
Felder, Thomas; Gambogi, William; Stika, Katherine; Yu, Bao-Ling; Bradley, Alex; Hu, Hongjie; Garreau-Iles, Lucie; Trout, T. John
2016-09-01
DuPont has been working steadily to develop accelerated backsheet tests that correlate with solar panel observations in the field. This report updates efforts in sequential testing. Single exposure tests are more commonly used and can be completed more quickly, and certain tests provide helpful predictions of certain backsheet failure modes. DuPont recommendations for single exposure tests are based on 25-year exposure levels for UV and humidity/temperature, and form a good basis for sequential test development. We recommend a sequential exposure of damp heat followed by UV, then repetitions of thermal cycling and UVA. This sequence preserves 25-year exposure levels for humidity/temperature and UV, and correlates well with a large body of field observations. Measurements can be taken at intervals in the test, although the full test runs 10 months. A second, shorter sequential test based on damp heat and thermal cycling tests mechanical durability and correlates with loss of mechanical properties seen in the field. Ongoing work is directed toward shorter sequential tests that preserve good correlation to field data.
Busse, Sebastian; Schwarting, Rainer K. W.
2016-01-01
The present study is part of a series of experiments, where we analyze why and how damage of the rat’s dorsal hippocampus (dHC) can enhance performance in a sequential reaction time task (SRTT). In this task, sequences of distinct visual stimulus presentations are food-rewarded in a fixed-ratio-13-schedule. Our previous study (Busse and Schwarting, 2016) had shown that rats with lesions of the dHC show substantially shorter session times and post-reinforcement pauses (PRPs) than controls, which allows for more practice when daily training is kept constant. Since sequential behavior is based on instrumental performance, a sequential benefit might be secondary to that. In order to test this hypothesis in the present study, we performed two experiments, where pseudorandom rather than sequential stimulus presentation was used in rats with excitotoxic dorsal hippocampal lesions. Again, we found enhanced performance in the lesion-group in terms of shorter session times and PRPs. During the sessions we found that the lesion-group spent less time with non-instrumental behavior (i.e., grooming, sniffing, and rearing) after prolonged instrumental training. Also, such rats showed moderate evidence for an extinction impairment under devalued food reward conditions and significant deficits in a response-outcome (R-O)-discrimination task in comparison to a control-group. These findings suggest that facilitatory effects on instrumental performance after dorsal hippocampal lesions may be primarily a result of complex behavioral changes, i.e., reductions of behavioral flexibility and/or alterations in motivation, which then result in enhanced instrumental learning. PMID:27375453
A Bayesian sequential design using alpha spending function to control type I error.
Zhu, Han; Yu, Qingzhao
2017-10-01
We propose in this article a Bayesian sequential design using alpha spending functions to control the overall type I error in phase III clinical trials. We provide algorithms to calculate critical values, power, and sample sizes for the proposed design. Sensitivity analysis is implemented to check the effects from different prior distributions, and conservative priors are recommended. We compare the power and actual sample sizes of the proposed Bayesian sequential design with different alpha spending functions through simulations. We also compare the power of the proposed method with frequentist sequential design using the same alpha spending function. Simulations show that, at the same sample size, the proposed method provides larger power than the corresponding frequentist sequential design. It also has larger power than traditional Bayesian sequential design which sets equal critical values for all interim analyses. When compared with other alpha spending functions, O'Brien-Fleming alpha spending function has the largest power and is the most conservative in terms that at the same sample size, the null hypothesis is the least likely to be rejected at early stage of clinical trials. And finally, we show that adding a step of stop for futility in the Bayesian sequential design can reduce the overall type I error and reduce the actual sample sizes.
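As a brief aside, the spending functions compared above have simple closed forms in the Lan-DeMets framework. The sketch below shows the standard textbook O'Brien-Fleming-type and Pocock-type spending functions, not the authors' Bayesian algorithm; the bisection normal quantile is a crude stand-in for a proper inverse CDF.

```python
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def norm_ppf(p):
    # bisection inverse of the normal CDF; crude but adequate for a sketch
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def obrien_fleming_spend(t, alpha=0.05):
    """O'Brien-Fleming-type spending: alpha(t) = 2(1 - Phi(z_{alpha/2} / sqrt(t)))."""
    return 2.0 * (1.0 - norm_cdf(norm_ppf(1.0 - alpha / 2.0) / math.sqrt(t)))

def pocock_spend(t, alpha=0.05):
    """Pocock-type spending: alpha(t) = alpha * ln(1 + (e - 1) * t)."""
    return alpha * math.log(1.0 + (math.e - 1.0) * t)

# at information fraction 0.25, O'Brien-Fleming spends far less alpha than Pocock
print(obrien_fleming_spend(0.25), pocock_spend(0.25))
```

Evaluated at information fraction t = 0.25, the O'Brien-Fleming form spends only about 0.00009 of the 0.05 budget while the Pocock form spends about 0.018, which is why O'Brien-Fleming is the conservative early-stage choice noted above; both spend the full alpha at t = 1.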
To bee or not to bee, this is the question…
Gross, Hans J
2011-09-01
Human inborn numerical competence means our ability to recognize object numbers precisely under circumstances which do not allow sequential counting. This archaic process has been called “subitizing,” from the Latin “subito” = suddenly, immediately, indicating that the objects in question are presented to test persons only for a fraction of a second in order to prevent counting. In contrast, however, sequential counting, an outstanding cultural achievement of mankind, means to count “1, 2, 3, 4, 5, 6, 7, 8…” without a limit. The following essay will explain how the limit of numerical competence, i.e., the recognition of object numbers without counting, has been determined for humans and how this has been achieved for the first time in the case of an invertebrate, the honeybee. Finally, a hypothesis explaining the influence of our limited, inborn numerical competence on counting in our times, e.g., in the Russian language, will be presented. Subitizing versus counting by young Down syndrome infants and autistics and the Savant syndrome will be discussed. PMID:22046473
Sequential structural damage diagnosis algorithm using a change point detection method
NASA Astrophysics Data System (ADS)
Noh, H.; Rajagopal, R.; Kiremidjian, A. S.
2013-11-01
This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method. The general change point detection method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori, unless we are looking for a known, specific type of damage. Therefore, we introduce an additional algorithm that estimates and updates this distribution as data are collected, using maximum likelihood and Bayesian methods. We also applied an approximate method to reduce the computational load and memory requirement associated with the estimation. The algorithm is validated using a set of experimental data collected from a four-story steel special moment-resisting frame and multiple sets of simulated data. Various features of different dimensions were explored, and the algorithm was able to identify damage with lower false alarm rates, particularly when it used multidimensional damage-sensitive features and a known post-damage feature distribution. For cases with an unknown feature distribution, the post-damage distribution was consistently estimated, and the detection delays were only a few time steps longer than the delays from the general method that assumes the post-damage feature distribution is known. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirements, while the maximum likelihood method provides an insightful heuristic approach.
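The known-distributions sequential test that this algorithm builds on can be illustrated with a CUSUM-style detector. The sketch below handles the simplest illustrative case, a shift in a Gaussian mean with known variance (the paper's damage-sensitive features and distributions are more general):

```python
def cusum_detect(stream, mu0, mu1, sigma, threshold):
    """Page's CUSUM: accumulate log-likelihood ratios for a shift in a
    Gaussian mean from mu0 to mu1 (known sigma); declare a change point
    once the statistic exceeds `threshold`. Returns the detection index,
    or None if no change is declared."""
    s = 0.0
    slope = (mu1 - mu0) / sigma ** 2
    mid = 0.5 * (mu0 + mu1)
    for i, x in enumerate(stream):
        s = max(0.0, s + slope * (x - mid))  # reset at zero: one-sided test
        if s > threshold:
            return i
    return None

# Deterministic illustration: the feature mean shifts from 0 to 2 at sample 50.
stream = [0.0] * 50 + [2.0] * 50
print(cusum_detect(stream, mu0=0.0, mu1=2.0, sigma=1.0, threshold=5.0))  # -> 52
```

The two-sample detection delay here (declared at index 52 for a change at 50) reflects the same trade-off the abstract describes: a higher threshold lowers the false alarm rate at the cost of a longer delay.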
NASA Astrophysics Data System (ADS)
Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.
2012-04-01
This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
Koopmeiners, Joseph S.; Feng, Ziding
2015-01-01
Group sequential testing procedures have been proposed as an approach to conserving resources in biomarker validation studies. Previously, Koopmeiners and Feng (2011) derived the asymptotic properties of the sequential empirical positive predictive value (PPV) and negative predictive value (NPV) curves, which summarize the predictive accuracy of a continuous marker, under case-control sampling. A limitation of their approach is that the prevalence cannot be estimated from a case-control study and must be assumed known. In this manuscript, we consider group sequential testing of the predictive accuracy of a continuous biomarker with unknown prevalence. First, we develop asymptotic theory for the sequential empirical PPV and NPV curves when the prevalence must be estimated, rather than assumed known, in a case-control study. We then discuss how our results can be combined with standard group sequential methods to develop group sequential testing procedures and bias-adjusted estimators for the PPV and NPV curves. The small-sample properties of the proposed group sequential testing procedures and estimators are evaluated by simulation, and we illustrate our approach in the context of a study to validate a novel biomarker for prostate cancer. PMID:26537180
Wansard, Murielle; Bartolomeo, Paolo; Bastin, Christine; Segovia, Fermín; Gillet, Sophie; Duret, Christophe; Meulemans, Thierry
2015-01-01
Over the last decade, many studies have demonstrated that visuospatial working memory (VSWM) can be divided into separate subsystems dedicated to the retention of visual patterns and their serial order. Impaired VSWM has been suggested to exacerbate left visual neglect in right-brain-damaged individuals. The aim of this study was to investigate the segregation between spatial-sequential and spatial-simultaneous working memory in individuals with neglect. We demonstrated that patterns of results on these VSWM tasks can be dissociated. Spatial-simultaneous and sequential aspects of VSWM can be selectively impaired in unilateral neglect. Our results support the hypothesis of multiple VSWM subsystems, which should be taken into account to better understand neglect-related deficits.
Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design
ERIC Educational Resources Information Center
Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff
2016-01-01
Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strengths and weaknesses in recent large-scale implementations, there is no simple answer to the question of which design is better because different…
Nakashima, Kei; Aoshima, Masahiro; Ohfuji, Satoko; Yamawaki, Satoshi; Nemoto, Masahiro; Hasegawa, Shinya; Noma, Satoshi; Misawa, Masafumi; Hosokawa, Naoto; Yaegashi, Makito; Otsuka, Yoshihito
2018-03-21
It is unclear whether simultaneous administration of a 23-valent pneumococcal polysaccharide vaccine (PPSV23) and a quadrivalent influenza vaccine (QIV) produces immunogenicity in older individuals. This study tested the hypothesis that the pneumococcal antibody response elicited by simultaneous administration of PPSV23 and QIV in older individuals is not inferior to that elicited by sequential administration of PPSV23 and QIV. We performed a single-center, randomized, open-label, non-inferiority trial comprising 162 adults aged ≥65 years randomly assigned to either the simultaneous (simultaneous injections of PPSV23 and QIV) or sequential (control; PPSV23 injected 2 weeks after QIV vaccination) groups. Pneumococcal immunoglobulin G (IgG) titers of serotypes 23F, 3, 4, 6B, 14, and 19A were assessed. The primary endpoint was the serotype 23F response rate (a ≥2-fold increase in IgG concentrations 4-6 weeks after PPSV23 vaccination). With the non-inferiority margin set at 20% fewer patients, the response rate of serotype 23F in the simultaneous group (77.8%) was not inferior to that of the sequential group (77.6%; difference, 0.1%; 90% confidence interval, -10.8% to 11.1%). None of the pneumococcal IgG serotype titers were significantly different between the groups 4-6 weeks after vaccination. Simultaneous administration did not show a significant decrease in seroprotection odds ratios for H1N1, H3N2, or B/Phuket influenza strains other than B/Texas. Additionally, simultaneous administration did not increase adverse reactions. Hence, simultaneous administration of PPSV23 and QIV shows an acceptable immunogenicity that is comparable to sequential administration without an increase in adverse reactions. (This study was registered with ClinicalTrials.gov [NCT02592486]).
The Function and Organization of Lateral Prefrontal Cortex: A Test of Competing Hypotheses
Reynolds, Jeremy R.; O'Reilly, Randall C.; Cohen, Jonathan D.; Braver, Todd S.
2012-01-01
The present experiment tested three hypotheses regarding the function and organization of lateral prefrontal cortex (PFC). The first account (the information cascade hypothesis) suggests that the anterior-posterior organization of lateral PFC is based on the timing with which cue stimuli reduce uncertainty in the action selection process. The second account (the levels-of-abstraction hypothesis) suggests that the anterior-posterior organization of lateral PFC is based on the degree of abstraction of the task goals. The current study began by investigating these two hypotheses, and identified several areas of lateral PFC that were predicted to be active by both the information cascade and levels-of-abstraction accounts. However, the pattern of activation across experimental conditions was inconsistent with both theoretical accounts. Specifically, an anterior area of mid-dorsolateral PFC exhibited sensitivity to experimental conditions that, according to both accounts, should have selectively engaged only posterior areas of PFC. We therefore investigated a third possible account (the adaptive context maintenance hypothesis) that postulates that both posterior and anterior regions of PFC are reliably engaged in task conditions requiring active maintenance of contextual information, with the temporal dynamics of activity in these regions flexibly tracking the duration of maintenance demands. Activity patterns in lateral PFC were consistent with this third hypothesis: regions across lateral PFC exhibited transient activation when contextual information had to be updated and maintained in a trial-by-trial manner, but sustained activation when contextual information had to be maintained over a series of trials. These findings prompt a reconceptualization of current views regarding the anterior-posterior organization of lateral PFC, but do support other findings regarding the active maintenance role of lateral PFC in sequential working memory paradigms. PMID:22355309
Esser, Sarah; Haider, Hilde
2017-01-01
The Serial Reaction Time Task (SRTT) is an important paradigm for studying the properties of unconscious learning processes. One especially interesting and still controversially discussed topic is the conditions under which unconsciously acquired knowledge becomes conscious knowledge. The different assumptions about the underlying mechanisms can be contrasted as two accounts: single-system views, in which the strengthening of associative weights throughout training gradually turns implicit knowledge into explicit knowledge, and dual-system views, in which implicit knowledge itself does not become conscious; rather, a second process detects changes in performance and is able to acquire conscious knowledge. In a series of three experiments, we manipulated the arrangement of sequential and deviant trials. In SRTT training, participants either received mini-blocks of sequential trials followed by mini-blocks of deviant trials (22 trials each) or received sequential and deviant trials mixed randomly. Importantly, the number of correct and deviant transitions was the same for both conditions. Experiment 1 showed that both conditions acquired a comparable amount of implicit knowledge, expressed in different test tasks. Experiment 2 further demonstrated that the conditions differed in their subjectively experienced fluency of the task, with more fluency experienced when trained with mini-blocks. Lastly, Experiment 3 revealed that participants trained with longer mini-blocks of sequential and deviant material developed more explicit knowledge. Results are discussed regarding their compatibility with different assumptions about the emergence of explicit knowledge in an implicit learning situation, especially with respect to the role of metacognitive judgements and, more specifically, the Unexpected-Event Hypothesis. PMID:28421018
Avallone, Antonio; Pecori, Biagio; Bianco, Franco; Aloj, Luigi; Tatangelo, Fabiana; Romano, Carmela; Granata, Vincenza; Marone, Pietro; Leone, Alessandra; Botti, Gerardo; Petrillo, Antonella; Caracò, Corradina; Iaffaioli, Vincenzo R; Muto, Paolo; Romano, Giovanni; Comella, Pasquale; Budillon, Alfredo; Delrio, Paolo
2015-10-06
We have previously shown that an intensified preoperative regimen including oxaliplatin plus raltitrexed and 5-fluorouracil/folinic acid (OXATOM/FUFA) during preoperative pelvic radiotherapy produced promising results in locally advanced rectal cancer (LARC). Preclinical evidence suggests that the scheduling of bevacizumab may be crucial to optimizing its combination with chemo-radiotherapy. This non-randomized, non-comparative, phase II study was conducted in MRI-defined high-risk LARC. Patients received three biweekly cycles of OXATOM/FUFA during RT. Bevacizumab was given 2 weeks before the start of chemo-radiotherapy, and then either on the same day as chemotherapy for 3 cycles (concomitant schedule A) or 4 days prior to the first and second cycles of chemotherapy (sequential schedule B). The primary end point was the pathological complete tumor regression (TRG1) rate. Accrual for the concomitant schedule was terminated early because the number of TRG1 responses (2 of 16 patients) was statistically inconsistent with the hypothesized activity (30%) to be tested. Conversely, the endpoint was reached with the sequential schedule, and the final TRG1 rate among 46 enrolled patients was 50% (95% CI 35%-65%). Neutropenia was the most common grade ≥ 3 toxicity with both schedules, but it was less pronounced with the sequential than with the concomitant schedule (30% vs. 44%). Postoperative complications occurred in 8/15 (53%) and 13/46 (28%) patients in schedules A and B, respectively. At 5-year follow-up, the probability of PFS and OS was 80% (95% CI, 66%-89%) and 85% (95% CI, 69%-93%), respectively, for the sequential schedule. These results highlight the relevance of bevacizumab scheduling to optimizing its combination with preoperative chemo-radiotherapy in the management of LARC.
Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis
NASA Astrophysics Data System (ADS)
Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.
As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging. In practice, assumptions of underlying linearity and/or Gaussianity are often used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are at times detrimental to tracking accuracy and produce statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions about the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss the computational limitations that hinder proper analysis of large breakup events.
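The SMC machinery applied here can be illustrated, in miniature, with a bootstrap particle filter for a one-dimensional random-walk state. Everything in this sketch (the dynamics, noise levels, and particle count) is an illustrative assumption, not the paper's model:

```python
import math
import random

def bootstrap_particle_filter(ys, n=1000, q=0.5, r=1.0, seed=1):
    """Minimal bootstrap SMC filter for a 1-D random-walk state
    x_t = x_{t-1} + N(0, q^2), observed as y_t = x_t + N(0, r^2).
    Returns the posterior mean of x_t after each observation."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 5.0) for _ in range(n)]  # diffuse prior
    means = []
    for y in ys:
        # Propagate each particle through the motion model.
        xs = [x + rng.gauss(0.0, q) for x in xs]
        # Weight particles by the observation likelihood.
        ws = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in xs]
        total = sum(ws)
        ws = [w / total for w in ws]
        means.append(sum(w * x for w, x in zip(ws, xs)))
        # Multinomial resampling guards against weight degeneracy.
        xs = rng.choices(xs, weights=ws, k=n)
    return means

# Track a state that the observations consistently place near 5.0.
estimates = bootstrap_particle_filter([5.0] * 30)
```

No linearity or Gaussianity is required of the posterior itself: the weighted particle cloud is a free-form density approximation, which is exactly the property that makes SMC attractive for the nonlinear, multi-modal breakup-tracking problem described above.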
How hierarchical is language use?
Frank, Stefan L.; Bod, Rens; Christiansen, Morten H.
2012-01-01
It is generally assumed that hierarchical phrase structure plays a central role in human language. However, considerations of simplicity and evolutionary continuity suggest that hierarchical structure should not be invoked too hastily. Indeed, recent neurophysiological, behavioural and computational studies show that sequential sentence structure has considerable explanatory power and that hierarchical processing is often not involved. In this paper, we review evidence from the recent literature supporting the hypothesis that sequential structure may be fundamental to the comprehension, production and acquisition of human language. Moreover, we provide a preliminary sketch outlining a non-hierarchical model of language use and discuss its implications and testable predictions. If linguistic phenomena can be explained by sequential rather than hierarchical structure, this will have considerable impact in a wide range of fields, such as linguistics, ethology, cognitive neuroscience, psychology and computer science. PMID:22977157
Orthographic Structure and Reading Experience Affect the Transfer from Iconic to Short Term Memory
ERIC Educational Resources Information Center
Lefton, Lester A.; Spragins, Anne B.
1974-01-01
The basic hypothesis of these experiments was that the processing strategy for the transfer of alphabetic material from iconic storage to short-term memory involves a sequential left-to-right factor that develops with increases in experience with reading. (Author)
Reactivation, Replay, and Preplay: How It Might All Fit Together
Buhry, Laure; Azizi, Amir H.; Cheng, Sen
2011-01-01
Sequential activation of neurons that occurs during “offline” states, such as sleep or awake rest, is correlated with neural sequences recorded during preceding exploration phases. This so-called reactivation, or replay, has been observed in a number of different brain regions such as the striatum, prefrontal cortex, primary visual cortex and, most prominently, the hippocampus. Reactivation largely co-occurs with hippocampal sharp waves/ripples, brief high-frequency bursts in the local field potential. Here, we first review the mounting evidence for the hypothesis that reactivation is the neural mechanism for memory consolidation during sleep. We then discuss recent results suggesting that offline sequential activity in the waking state might not consist of simple repetitions of previously experienced sequences. Some offline sequential activity occurs before animals are exposed to a novel environment for the first time, and some sequences activated offline correspond to trajectories never experienced by the animal. We propose a conceptual framework for the dynamics of offline sequential activity that can parsimoniously describe a broad spectrum of experimental results. These results point to a potentially broader role of offline sequential activity in cognitive functions such as maintenance of spatial representations, learning, or planning. PMID:21918724
1984-06-01
Sequential Testing session (Bldg. A, Room C), including the contributed paper "A Truncated Sequential Probability Ratio Test." Subject keywords: suicide, optical data, operational testing, reliability, random numbers, bootstrap methods, missing data, sequential testing, fire support, complex computer models, carcinogenesis studies. The contributed papers can be ascertained from the titles of the…
Pharmacophore Based Virtual Screening Approach to Identify Selective PDE4B Inhibitors
Gaurav, Anand; Gautam, Vertika
2017-01-01
Phosphodiesterase 4 (PDE4) has been established as a promising target in asthma and chronic obstructive pulmonary disease. PDE4B subtype-selective inhibitors are known to reduce the dose-limiting adverse effects associated with non-selective PDE4 inhibitors, which makes the development of PDE4B subtype-selective inhibitors a desirable research goal. To achieve this goal, a ligand-based pharmacophore modeling approach was employed. Separate pharmacophore hypotheses for PDE4B and PDE4D inhibitors were generated using the HypoGen algorithm and 106 PDE4 inhibitors from the literature with thiopyrano[3,2-d]pyrimidine, 2-arylpyrimidine, and triazine skeletons. Suitable training and test sets were created from these molecules as per the guidelines for the HypoGen program. The training set was used for hypothesis development, while the test set was used for validation. Fisher validation was also used to test the significance of the developed hypotheses. The validated pharmacophore hypotheses for PDE4B and PDE4D inhibitors were used in sequential virtual screening of the ZINC database of drug-like molecules to identify selective PDE4B inhibitors. The hits were screened on the basis of their estimated activity and fit value. The top hit was docked into the active sites of PDE4B and PDE4D to confirm its selectivity for PDE4B. The hits are proposed for further evaluation using in vitro assays. PMID:29201082
One-sided truncated sequential t-test: application to natural resource sampling
Gary W. Fowler; William G. O' Regan
1974-01-01
A new procedure for constructing one-sided truncated sequential t-tests and its application to natural resource sampling are described. Monte Carlo procedures were used to develop a series of one-sided truncated sequential t-tests and the associated approximations to the operating characteristic and average sample number functions. Different truncation points and...
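A flavor of this construction can be given with the classical Wald SPRT, truncated at a maximum sample size. This is a known-variance stand-in for the sequential t-test (whose boundaries the authors approximate by Monte Carlo); the boundaries a and b below use Wald's standard approximations, and all numbers are illustrative:

```python
import math

def truncated_sprt(xs, mu0, mu1, sigma, alpha=0.05, beta=0.2, max_n=50):
    """Wald SPRT for H0: mean = mu0 vs H1: mean = mu1 with known sigma,
    truncated at max_n observations. Returns (decision, n)."""
    a = math.log(beta / (1.0 - alpha))   # lower boundary: accept H0
    b = math.log((1.0 - beta) / alpha)   # upper boundary: accept H1
    llr = 0.0
    for n, x in enumerate(xs, start=1):
        llr += (mu1 - mu0) / sigma ** 2 * (x - 0.5 * (mu0 + mu1))
        if llr <= a:
            return "accept H0", n
        if llr >= b:
            return "accept H1", n
        if n >= max_n:
            # Truncation: decide by the sign of the accumulated evidence.
            return ("accept H1" if llr > 0 else "accept H0"), n
    return "continue", len(xs)

print(truncated_sprt([1.0] * 20, mu0=0.0, mu1=1.0, sigma=1.0))  # -> ('accept H1', 6)
```

Truncation caps the worst-case sample size, which is what motivates the Monte Carlo study of operating characteristic and average sample number functions in the abstract: the closed-form error guarantees of the untruncated SPRT no longer hold exactly.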
Impaired sequential and partially compensated probabilistic skill learning in Parkinson's disease.
Kemény, Ferenc; Demeter, Gyula; Racsmány, Mihály; Valálik, István; Lukács, Ágnes
2018-06-08
The striatal dopaminergic dysfunction in Parkinson's disease (PD) has been associated with deficits in skill learning in numerous studies, but some of the findings remain controversial. Our aim was to explore the generality of the learning deficit using two widely reported skill learning tasks in the same group of Parkinson's patients. Thirty-four patients with PD (mean age: 62.83 years, SD: 7.67) were compared to age-matched healthy adults. Two tasks were employed: the Serial Reaction Time Task (SRT), testing the learning of motor sequences, and the Weather Prediction (WP) task, testing non-sequential probabilistic category learning. On the SRT task, patients with PD showed no significant evidence for sequence learning. These results support and also extend previous findings, suggesting that motor skill learning is vulnerable in PD. On the WP task, the PD group showed the same amount of learning as controls, but they exploited qualitatively different strategies in predicting the target categories. While controls typically combined probabilities from multiple predicting cues, patients with PD instead focused on individual cues. We also found moderate to high correlations between the different measures of skill learning. These findings support our hypothesis that skill learning is generally impaired in PD, and can in some cases be compensated by relying on alternative learning strategies. © 2018 The Authors. Journal of Neuropsychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
Dargaville, P A; South, M; McDougall, P N
1997-12-01
To test the hypothesis that conventional mechanical ventilation (CV) provides a greater stimulus to secretion of pulmonary surfactant than high-frequency oscillatory ventilation (HFO), we performed sequential examination of surfactant indices in lung lavage fluid in a group of six infants with severe lung disease (group 1), ventilated with HFO and then converted back to CV as their lung disease recovered. A similar group of 10 infants (group 2), ventilated conventionally throughout the course of their illness, was studied for comparison. In both groups, two sequential tracheal aspirate samples were taken, the first once lung disease was noted to be improving, and the second 48-72 h later. Group 1 infants had converted from HFO to CV during this time. A marked increase in the concentrations of total surfactant phospholipid (PL) and disaturated phosphatidylcholine (DSPC) was seen in group 1 after the transition from HFO to CV; the magnitude of this increase was significantly greater than that observed sequentially in group 2 (total PL: 9.4-fold increase in group 1 vs. 1.8-fold in group 2, P = 0.006; DSPC: 6.4-fold increase in group 1 vs. 1.7-fold in group 2, P = 0.02). These findings suggest that intermittent lung inflation during CV produces more secretion of surfactant phospholipid than continuous alveolar distension on HFO, and raise the possibility that conservation and additional maturation of surfactant elements may occur when the injured lung is ventilated with HFO.
Vocal Generalization Depends on Gesture Identity and Sequence
Sober, Samuel J.
2014-01-01
Generalization, the brain's ability to transfer motor learning from one context to another, occurs in a wide range of complex behaviors. However, the rules of generalization in vocal behavior are poorly understood, and it is unknown how vocal learning generalizes across an animal's entire repertoire of natural vocalizations and sequences. Here, we asked whether generalization occurs in a nonhuman vocal learner and quantified its properties. We hypothesized that adaptive error correction of a vocal gesture produced in one sequence would generalize to the same gesture produced in other sequences. To test our hypothesis, we manipulated the fundamental frequency (pitch) of auditory feedback in Bengalese finches (Lonchura striata var. domestica) to create sensory errors during vocal gestures (song syllables) produced in particular sequences. As hypothesized, error-corrective learning on pitch-shifted vocal gestures generalized to the same gestures produced in other sequential contexts. Surprisingly, generalization magnitude depended strongly on sequential distance from the pitch-shifted syllables, with greater adaptation for gestures produced near to the pitch-shifted syllable. A further unexpected result was that nonshifted syllables changed their pitch in the direction opposite from the shifted syllables. This apparently antiadaptive pattern of generalization could not be explained by correlations between generalization and the acoustic similarity to the pitch-shifted syllable. These findings therefore suggest that generalization depends on the type of vocal gesture and its sequential context relative to other gestures and may reflect an advantageous strategy for vocal learning and maintenance. PMID:24741046
Pannell, R; Li, S; Gurewich, V
2017-08-01
Thrombolysis with tissue plasminogen activator (tPA) has been a disappointment and has now been replaced by an endovascular procedure whenever possible. Nevertheless, thrombolysis remains the only means by which circulation in a thrombosed artery can be restored rapidly. In contrast to tPA monotherapy, endogenous fibrinolysis uses both tPA and urokinase plasminogen activator (uPA), whose native form is a proenzyme, prouPA. This combination is remarkably effective as evidenced by the fibrin degradation product, D-dimer, which is invariably present in plasma. The two activators have complementary mechanisms of plasminogen activation and are synergistic in combination. Since tPA initiates fibrinolysis when released from the vessel wall and prouPA is in the blood, they induce fibrinolysis sequentially. It was postulated that this may be more effective and fibrin-specific. The hypothesis was tested in a model of clot lysis in plasma in which a clot was first exposed to tPA for 5 min, washed and incubated with prouPA. Lysis was compared with that of clots incubated with both activators simultaneously. The sequential combination was almost twice as effective and caused less fibrinogenolysis than the simultaneous combination (p < 0.0001) despite having significantly less tPA, as a result of the wash. A mechanism is described by which this phenomenon can be explained. The findings are believed to have significant therapeutic implications.
2017-01-01
Objective: Anticipation of opponent actions, through the use of advanced (i.e., pre-event) kinematic information, can be trained using video-based temporal occlusion. Typically, this involves isolated opponent skills/shots presented as trials in a random order. However, two different areas of research, concerning representative task design and contextual (non-kinematic) information, suggest this structure of practice restricts expert performance. The aim of this study was to examine the effect of a sequential structure of practice during video-based training of anticipatory behavior in tennis, as well as the transfer of these skills to the performance environment. Methods: In a pre-practice-retention-transfer design, participants viewed life-sized video of tennis rallies across practice in either a sequential order (sequential group), in which participants were exposed to opponent skills/shots in the order they occur in the sport, or a non-sequential (non-sequential group) random order. Results: In the video-based retention test, the sequential group was significantly more accurate in their anticipatory judgments than the non-sequential group when the retention condition replicated the sequential structure. In the non-sequential retention condition, the non-sequential group was more accurate than the sequential group. In the field-based transfer test, overall decision time was significantly faster in the sequential group than in the non-sequential group. Conclusion: Findings highlight the benefits of a sequential structure of practice for the transfer of anticipatory behavior in tennis. We discuss the role of contextual information, and the importance of representative task design, for the testing and training of perceptual-cognitive skills in sport. PMID:28355263
The stratification of military service and combat exposure, 1934–1994
MacLean, Alair
2010-01-01
Previous research has suggested that men who were exposed to combat during wartime differed from those who were not. Yet little is known about how selection into combat has changed over time. This paper estimates sequential logistic models using data from the Panel Study of Income Dynamics to examine the stratification of military service and combat exposure in the US during the last six decades of the twentieth century. It tests potentially overlapping hypotheses drawn from two competing theories, class bias and dual selection. It also tests a hypothesis, drawn from the life course perspective, that the processes by which people came to see combat have changed historically. The findings show that human capital, institutional screening, and class bias all determined who saw combat. They also show that, net of historical change in the odds of service and combat, the impact of only one background characteristic, race, changed over time. PMID:21113325
Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J
2014-01-01
Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing parallel CUDA-based implementation for parameter synthesis in this model.
NASA Technical Reports Server (NTRS)
Grant, T. L.
1978-01-01
A hybrid receiver has been designed for the Galileo Project. The receiver, located on the Galileo Orbiter, will autonomously acquire and track signals from the first atmospheric probe of Jupiter as well as demodulate, bit-synchronize, and buffer the telemetry data. The receiver has a conventional RF and IF front end but performs multiple functions digitally under firmware control. It will be a self-acquiring receiver that operates under a large frequency uncertainty; it can accommodate different modulation types, bit rates, and other parameter changes via reprogramming. A breadboard receiver and test set demonstrate a preliminary version of the sequential detection process and verify the hypothesis that a fading channel does not reduce the probability of detection.
Casado, Pilar; Martín-Loeches, Manuel; León, Inmaculada; Hernández-Gutiérrez, David; Espuny, Javier; Muñoz, Francisco; Jiménez-Ortega, Laura; Fondevila, Sabela; de Vega, Manuel
2018-03-01
This study aims to extend the embodied cognition approach to syntactic processing. The hypothesis is that the brain resources to plan and perform motor sequences are also involved in syntactic processing. To test this hypothesis, Event-Related brain Potentials (ERPs) were recorded while participants read sentences with embedded relative clauses, judging for their acceptability (half of the sentences contained a subject-verb morphosyntactic disagreement). The sentences, previously divided into three segments, were self-administered segment-by-segment in two different sequential manners: linear or non-linear. Linear self-administration consisted of successively pressing three buttons with three consecutive fingers in the right hand, while non-linear self-administration implied the substitution of the finger in the middle position by the right foot. Our aim was to test whether syntactic processing could be affected by the manner the sentences were self-administered. Main results revealed that the ERPs LAN component vanished whereas the P600 component increased in response to incorrect verbs, for non-linear relative to linear self-administration. The LAN and P600 components reflect early and late syntactic processing, respectively. Our results convey evidence that language syntactic processing and performing non-linguistic motor sequences may share resources in the human brain. Copyright © 2017 Elsevier Ltd. All rights reserved.
Introducing a Model for Optimal Design of Sequential Objective Structured Clinical Examinations
ERIC Educational Resources Information Center
Mortaz Hejri, Sara; Yazdani, Kamran; Labaf, Ali; Norcini, John J.; Jalili, Mohammad
2016-01-01
In a sequential OSCE, which has been suggested as a way to reduce testing costs, candidates take a short screening test, and those who fail it are asked to take the full OSCE. In order to introduce an effective and accurate sequential design, we developed a model for designing and evaluating screening OSCEs. Based on two datasets from a 10-station…
Context-dependent decision-making: a simple Bayesian model
Lloyd, Kevin; Leslie, David S.
2013-01-01
Many phenomena in animal learning can be explained by a context-learning process whereby an animal learns about different patterns of relationship between environmental variables. Differentiating between such environmental regimes or ‘contexts’ allows an animal to rapidly adapt its behaviour when context changes occur. The current work views animals as making sequential inferences about current context identity in a world assumed to be relatively stable but also capable of rapid switches to previously observed or entirely new contexts. We describe a novel decision-making model in which contexts are assumed to follow a Chinese restaurant process with inertia and full Bayesian inference is approximated by a sequential-sampling scheme in which only a single hypothesis about current context is maintained. Actions are selected via Thompson sampling, allowing uncertainty in parameters to drive exploration in a straightforward manner. The model is tested on simple two-alternative choice problems with switching reinforcement schedules and the results compared with rat behavioural data from a number of T-maze studies. The model successfully replicates a number of important behavioural effects: spontaneous recovery, the effect of partial reinforcement on extinction and reversal, the overtraining reversal effect, and serial reversal-learning effects. PMID:23427101
Inhibition during response preparation is sensitive to response complexity
Saks, Dylan; Hoang, Timothy; Ivry, Richard B.
2015-01-01
Motor system excitability is transiently suppressed during the preparation of movement. This preparatory inhibition is hypothesized to facilitate response selection and initiation. Given that demands on selection and initiation processes increase with movement complexity, we hypothesized that complexity would influence preparatory inhibition. To test this hypothesis, we probed corticospinal excitability during a delayed-response task in which participants were cued to prepare right- or left-hand movements of varying complexity. Single-pulse transcranial magnetic stimulation was applied over right primary motor cortex to elicit motor evoked potentials (MEPs) from the first dorsal interosseous (FDI) of the left hand. MEP suppression was greater during the preparation of responses involving coordination of the FDI and adductor digiti minimi relative to easier responses involving only the FDI, independent of which hand was cued to respond. In contrast, this increased inhibition was absent when the complex responses required sequential movements of the two muscles. Moreover, complexity did not influence the level of inhibition when the response hand was fixed for the trial block, regardless of whether the complex responses were performed simultaneously or sequentially. These results suggest that preparatory inhibition contributes to response selection, possibly by suppressing extraneous movements when responses involve the simultaneous coordination of multiple effectors. PMID:25717168
Exploiting Complexity Information for Brain Activation Detection
Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui
2016-01-01
We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of voxel complexity. Under this hypothesis the voxel complexity could be modulated in pertinent cognitive tasks, and it changes across experimental paradigms. We calculate the complexity of sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general linear model based Statistical Parametric Mapping package (SPM12), where a marked difference was observed. This is because the SampEn method detects brain complexity changes between the two experimental conditions, and as a data-driven method it evaluates only the complexity of the specific sequential fMRI data. Also, larger and smaller SampEn values carry different meanings, and the neutral-blank design produces higher predictability than the threat-neutral design. Complexity information can be considered a complementary method to existing fMRI analysis strategies, and it may help improve the understanding of human brain function from a different perspective. PMID:27045838
Jim On, Shelbi C; Haddican, Madelaine; Yaroshinsky, Alex; Singer, Giselle; Lebwohl, Mark
2015-01-01
Ingenol mebutate gel is a topical field treatment of actinic keratosis (AK). One of several proposed mechanisms of action for ingenol mebutate is induction of cell death in proliferating keratinocytes, suggesting a preferential action on AKs rather than healthy skin. Local skin reactions (LSRs) during 2 sequential 4-week cycles of AK treatment with ingenol mebutate gel 0.015% on the face or scalp were evaluated to test the hypothesis that reapplication of the study product would produce lower LSR scores than during the first treatment cycle. In this unblinded study, 20 participants with AKs on the face or scalp were treated with ingenol mebutate gel 0.015% once daily for 3 days in 2 sequential 4-week cycles. Composite LSR scores were evaluated during both cycles. The composite LSR score during the second cycle was found to be significantly lower than the first cycle (P=.0002). The proportion of participants who experienced LSRs in the second treatment cycle was less than the first cycle. Ingenol mebutate gel 0.015% may cumulatively reduce the burden of sun-damaged skin over 2 treatment cycles by targeting and removing transformed keratinocytes.
Doros, Gheorghe; Pencina, Michael; Rybin, Denis; Meisner, Allison; Fava, Maurizio
2013-07-20
Previous authors have proposed the sequential parallel comparison design (SPCD) to address the issue of high placebo response rate in clinical trials. The original use of SPCD focused on binary outcomes, but recent use has since been extended to continuous outcomes that arise more naturally in many fields, including psychiatry. Analytic methods proposed to date for analysis of SPCD trial continuous data included methods based on seemingly unrelated regression and ordinary least squares. Here, we propose a repeated measures linear model that uses all outcome data collected in the trial and accounts for data that are missing at random. An appropriate contrast formulated after the model has been fit can be used to test the primary hypothesis of no difference in treatment effects between study arms. Our extensive simulations show that when compared with the other methods, our approach preserves the type I error even for small sample sizes and offers adequate power and the smallest mean squared error under a wide variety of assumptions. We recommend consideration of our approach for analysis of data coming from SPCD trials. Copyright © 2013 John Wiley & Sons, Ltd.
Memon, Amina; Gabbert, Fiona
2003-04-01
Eyewitness research has identified sequential lineup testing as a way of reducing false lineup choices while maintaining accurate identifications. The authors examined the usefulness of this procedure for reducing false choices in older adults. Young and senior witnesses viewed a crime video and were later presented with target-present or target-absent lineups in a simultaneous or sequential format. In addition, some participants received prelineup questions about their memory for a perpetrator's face and about their confidence in their ability to identify the culprit or to correctly reject the lineup. The sequential lineup reduced false choosing rates among young and older adults in target-absent conditions. In target-present conditions, sequential testing significantly reduced the correct identification rate in both age groups.
Button, Le; Peter, Beate; Stoel-Gammon, Carol; Raskind, Wendy H
2013-03-01
The purpose of this study was to address the hypothesis that childhood apraxia of speech (CAS) is influenced by an underlying deficit in sequential processing that is also expressed in other modalities. In a sample of 21 adults from five multigenerational families, 11 with histories of various familial speech sound disorders, 3 biologically related adults from a family with familial CAS showed motor sequencing deficits in an alternating motor speech task. Compared with the other adults, these three participants showed deficits in tasks requiring high loads of sequential processing, including nonword imitation, nonword reading and spelling. Qualitative error analyses in real word and nonword imitations revealed group differences in phoneme sequencing errors. Motor sequencing ability was correlated with phoneme sequencing errors during real word and nonword imitation, reading and spelling. Correlations were characterized by extremely high scores in one family and extremely low scores in another. Results are consistent with a central deficit in sequential processing in CAS of familial origin. PMID:23339292
The Relevance of Visual Sequential Memory to Reading.
ERIC Educational Resources Information Center
Crispin, Lisa; And Others
1984-01-01
Results of three visual sequential memory tests and a group reading test given to 19 elementary students are discussed in terms of task analysis and structuralist approaches to analysis of reading skills. Relation of visual sequential memory to other reading subskills is considered in light of current research. (CMG)
Orphan therapies: making best use of postmarket data.
Maro, Judith C; Brown, Jeffrey S; Dal Pan, Gerald J; Li, Lingling
2014-08-01
Postmarket surveillance of the comparative safety and efficacy of orphan therapeutics is challenging, particularly when multiple therapeutics are licensed for the same orphan indication. To make best use of product-specific registry data collected to fulfill regulatory requirements, we propose the creation of a distributed electronic health data network among registries. Such a network could support sequential statistical analyses designed to detect early warnings of excess risks. We use a simulated example to explore the circumstances under which a distributed network may prove advantageous. We perform sample size calculations for sequential and non-sequential statistical studies aimed at comparing the incidence of hepatotoxicity following initiation of two newly licensed therapies for homozygous familial hypercholesterolemia. We calculate the sample size savings ratio, or the proportion of sample size saved if one conducted a sequential study as compared to a non-sequential study. Then, using models to describe the adoption and utilization of these therapies, we simulate when these sample sizes are attainable in calendar years. We then calculate the analytic calendar time savings ratio, analogous to the sample size savings ratio. We repeat these analyses for numerous scenarios. Sequential analyses detect effect sizes earlier or at the same time as non-sequential analyses. The most substantial potential savings occur when the market share is more imbalanced (i.e., 90% for therapy A) and the effect size is closest to the null hypothesis. However, due to low exposure prevalence, these savings are difficult to realize within the 30-year time frame of this simulation for scenarios in which the outcome of interest occurs at or more frequently than one event/100 person-years. We illustrate a process to assess whether sequential statistical analyses of registry data performed via distributed networks may prove a worthwhile infrastructure investment for pharmacovigilance.
Test pattern generation for ILA sequential circuits
NASA Technical Reports Server (NTRS)
Feng, Yu; Frenzel, James F.; Maki, Gary K.
1993-01-01
An efficient method of generating test patterns for sequential machines implemented using one-dimensional, unilateral, iterative logic arrays (ILA's) of BTS pass transistor networks is presented. Based on a transistor level fault model, the method affords a unique opportunity for real-time fault detection with improved fault coverage. The resulting test sets are shown to be equivalent to those obtained using conventional gate level models, thus eliminating the need for additional test patterns. The proposed method advances the simplicity and ease of the test pattern generation for a special class of sequential circuitry.
Category transfer in sequential causal learning: the unbroken mechanism hypothesis.
Hagmayer, York; Meder, Björn; von Sydow, Momme; Waldmann, Michael R
2011-07-01
The goal of the present set of studies is to explore the boundary conditions of category transfer in causal learning. Previous research has shown that people are capable of inducing categories based on causal learning input, and they often transfer these categories to new causal learning tasks. However, occasionally learners abandon the learned categories and induce new ones. Whereas previously it has been argued that transfer is only observed with essentialist categories in which the hidden properties are causally relevant for the target effect in the transfer relation, we here propose an alternative explanation, the unbroken mechanism hypothesis. This hypothesis claims that categories are transferred from a previously learned causal relation to a new causal relation when learners assume a causal mechanism linking the two relations that is continuous and unbroken. The findings of two causal learning experiments support the unbroken mechanism hypothesis. Copyright © 2011 Cognitive Science Society, Inc.
Computerized Classification Testing with the Rasch Model
ERIC Educational Resources Information Center
Eggen, Theo J. H. M.
2011-01-01
If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
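The CCT decision rule referenced above can be sketched in a few lines. The following is a minimal, hypothetical implementation of Wald's SPRT for Bernoulli (correct/incorrect) item responses; the mastery/non-mastery proportions p0 and p1, the error rates, and the function name are illustrative choices, not values taken from the cited work:

```python
import math

def sprt(responses, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for Bernoulli data: H0: p = p0 vs H1: p = p1 (p1 > p0).

    Walks through the responses, accumulating the log-likelihood ratio,
    and stops as soon as a Wald boundary is crossed.
    Returns ('accept_H1' | 'accept_H0' | 'continue', items_used).
    """
    upper = math.log((1 - beta) / alpha)   # cross upward -> accept H1 (mastery)
    lower = math.log(beta / (1 - alpha))   # cross downward -> accept H0
    llr = 0.0
    for n, correct in enumerate(responses, start=1):
        if correct:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept_H1", n
        if llr <= lower:
            return "accept_H0", n
    return "continue", len(responses)
```

A run of all-correct responses crosses the upper boundary after only a handful of items, which is the efficiency argument for SPRT-based classification over estimation-based adaptive testing.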
Monte Carlo Simulation of Sudden Death Bearing Testing
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2003-01-01
Monte Carlo simulations combined with sudden death testing were used to compare resultant bearing lives to the calculated bearing life and the cumulative test time and calendar time relative to sequential and censored sequential testing. A total of 30 960 virtual 50-mm bore deep-groove ball bearings were evaluated in 33 different sudden death test configurations comprising 36, 72, and 144 bearings each. Variations in both life and Weibull slope were a function of the number of bearings failed, independent of the test method used, and not of the total number of bearings tested. Variations in L10 life as a function of the number of bearings failed were similar to variations in life obtained from sequentially failed real bearings and from Monte Carlo (virtual) testing of entire populations. Reductions of up to 40 percent in bearing test time and calendar time can be achieved by testing to failure or the L50 life and terminating all testing when the last of the predetermined bearing failures has occurred. Sudden death testing is not a more efficient method to reduce bearing test time or calendar time when compared to censored sequential testing.
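The sudden-death scheme described above is easy to simulate. The following is a minimal sketch, assuming two-parameter Weibull bearing lives (the scale eta, slope beta, group sizes, and seed below are illustrative, not the study's values): each group runs until its first failure, and only that first-failure time is recorded.

```python
import math
import random

def weibull_life(eta, beta, rng):
    """One bearing life drawn from a two-parameter Weibull(eta, beta)
    by inverse-transform sampling."""
    return eta * (-math.log(1.0 - rng.random())) ** (1.0 / beta)

def sudden_death_first_failures(n_groups, group_size, eta, beta, seed=0):
    """Sudden-death testing: each group of bearings runs only until its
    first failure, then the whole group is suspended; return the
    first-failure time of every group."""
    rng = random.Random(seed)
    return [min(weibull_life(eta, beta, rng) for _ in range(group_size))
            for _ in range(n_groups)]
```

A useful property behind this design: the minimum of m i.i.d. Weibull(eta, beta) lives is again Weibull with scale eta * m**(-1/beta) and the same slope beta, so the recorded first failures still estimate the population's Weibull slope.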
Arend, Carlos Frederico; Arend, Ana Amalia; da Silva, Tiago Rodrigues
2014-06-01
The aim of our study was to systematically compare different methodologies to establish an evidence-based approach, based on tendon thickness and structure, for sonographic diagnosis of supraspinatus tendinopathy when compared to MRI. US was obtained from 164 symptomatic patients with supraspinatus tendinopathy detected at MRI and 42 asymptomatic controls with normal MRI. Diagnostic yield was calculated for maximal supraspinatus tendon thickness (MSTT) and tendon structure as isolated criteria and using different combinations of parallel and sequential testing at US. Chi-squared tests were performed to assess sensitivity, specificity, and accuracy of the different diagnostic approaches. Mean MSTT was 6.68 mm in symptomatic patients and 5.61 mm in asymptomatic controls (P<.05). When used as an isolated criterion, MSTT>6.0 mm provided the best accuracy (93.7%) when compared to other measurements of tendon thickness. Also as an isolated criterion, abnormal tendon structure (ATS) yielded 93.2% accuracy for diagnosis. The best overall yield was obtained by both parallel and sequential testing using either MSTT>6.0 mm or ATS as diagnostic criteria in no particular order, which provided 99.0% accuracy, 100% sensitivity, and 95.2% specificity. Among these parallel and sequential tests that provided the best overall yield, additional analysis revealed that sequential testing first evaluating tendon structure required assessment of 258 criteria (vs. 261 for sequential testing first evaluating tendon thickness and 412 for parallel testing) and demanded a mean of 16.1 s to assess diagnostic criteria and reach the diagnosis (vs. 43.3 s for sequential testing first evaluating tendon thickness and 47.4 s for parallel testing). We found that using either MSTT>6.0 mm or ATS as diagnostic criteria for both parallel and sequential testing provides the best overall yield for sonographic diagnosis of supraspinatus tendinopathy when compared to MRI.
Among these strategies, a two-step sequential approach first assessing tendon structure was advantageous because it required a lower number of criteria to be assessed and demanded less time to assess diagnostic criteria and reach the diagnosis. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
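The "either MSTT or ATS" rule above is a believe-the-positive (OR) combination of two diagnostic criteria. A minimal sketch of how such a rule trades specificity for sensitivity, under the simplifying (and here hypothetical) assumption that the two criteria are conditionally independent given disease status; the numbers in the test are illustrative, not the study's:

```python
def or_rule(sens_a, spec_a, sens_b, spec_b):
    """Net sensitivity/specificity of a 'positive if either test is
    positive' combination, assuming conditional independence.

    The combined rule misses a case only if BOTH tests miss it, and
    raises a false alarm if EITHER test fires on a healthy subject.
    """
    net_sens = 1.0 - (1.0 - sens_a) * (1.0 - sens_b)
    net_spec = spec_a * spec_b
    return net_sens, net_spec
```

The same arithmetic explains the pattern reported above: combining criteria with an OR rule pushes sensitivity up (toward the study's 100%) at some cost in specificity, while an AND rule would do the reverse.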
Optimal Sequential Rules for Computer-Based Instruction.
ERIC Educational Resources Information Center
Vos, Hans J.
1998-01-01
Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…
Cole, A J; Griffiths, D; Lavender, S; Summers, P; Rich, K
2006-05-01
To test the hypothesis that artefact caused by postmortem off-gassing is at least partly responsible for the presence of gas within the vascular system and tissues of the cadaver following death associated with compressed air diving. Controlled experiment sacrificing sheep after a period of simulated diving in a hyperbaric chamber and carrying out sequential postmortem computed tomography (CT) on the cadavers. All the subject sheep developed significant quantities of gas in the vascular system within 24 hours, as demonstrated by CT and necropsy, while the control animals did not. The presence of gas in the vascular system of human cadavers following diving associated fatalities is to be expected, and is not necessarily connected with gas embolism following pulmonary barotrauma, as has previously been claimed.
Biological Motion Primes the Animate/Inanimate Distinction in Infancy
Poulin-Dubois, Diane; Crivello, Cristina; Wright, Kristyn
2015-01-01
Given that biological motion is both detected and preferred early in life, we tested the hypothesis that biological motion might be instrumental to infants’ differentiation of animate and inanimate categories. Infants were primed with either point-light displays of realistic biological motion, random motion, or schematic biological motion of an unfamiliar shape. After being habituated to these displays, 12-month-old infants categorized animals and vehicles as well as furniture and vehicles with the sequential touching task. The findings indicated that infants primed with point-light displays of realistic biological motion showed better categorization of animates than those exposed to random or schematic biological motion. These results suggest that human biological motion might be one of the motion cues that provide the building blocks for infants’ concept of animacy. PMID:25659077
Pilot and Repeat Trials as Development Tools Associated with Demonstration of Bioequivalence.
Fuglsang, Anders
2015-05-01
The purpose of this work is to use simulated trials to study how pilot trials can be implemented in relation to bioequivalence testing, and how the use of the information obtained at the pilot stage can influence the overall chance of showing bioequivalence (power) or the chance of approving a truly bioinequivalent product (type I error). The work also covers the use of repeat pivotal trials, since the difference between a pilot trial followed by a pivotal trial and a pivotal trial followed by a repeat trial is mainly a question of whether a conclusion of bioequivalence can be allowed after the first trial. Repeating a pivotal trial after a failed trial involves dual or serial testing of the bioequivalence null hypothesis, and the paper illustrates how this may inflate the type I error up to almost 10%. Hence, it is questioned whether such practice is in the interest of patients. Tables for power, type I error, and sample sizes are provided for a total of six different decision trees, which allow the developer to use either the observed geometric mean ratio (GMR) from the first trial or to assume that the GMR is 0.95. In cases where the true GMR can be controlled so as not to deviate more from unity than 0.95, sequential design methods ad modum Potvin may be superior to pilot trials. The tables provide a quantitative basis for choosing between sequential designs and pivotal trials preceded by pilot trials.
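The "almost 10%" inflation quoted above has a simple worst-case form: if a failed trial may be rerun and either trial can declare bioequivalence, a truly bioinequivalent product gets multiple chances at the nominal alpha. A minimal sketch, assuming independent trials each tested at the unadjusted level (an upper bound, not the paper's exact simulation):

```python
def repeat_trial_type1(alpha, max_trials):
    """Worst-case overall type I error when up to `max_trials`
    independent trials are run and any single 'pass' is accepted,
    i.e. serial testing of the same null without alpha adjustment."""
    return 1.0 - (1.0 - alpha) ** max_trials
```

With alpha = 0.05 and one permitted repeat, this gives 1 - 0.95**2 = 0.0975, consistent with the "up to almost 10%" figure in the abstract.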
The Cerebellar Deficit Hypothesis and Dyslexic Tendencies in a Non-Clinical Sample
ERIC Educational Resources Information Center
Brookes, Rebecca L.; Stirling, John
2005-01-01
In order to assess the relationship between cerebellar deficits and dyslexic tendencies in a non-clinical sample, 27 primary school children aged 8-9 completed a cerebellar soft signs battery and were additionally assessed for reading age, sequential memory, picture arrangement and knowledge of common sequences. An average measure of the soft…
Avallone, Antonio; Pecori, Biagio; Bianco, Franco; Aloj, Luigi; Tatangelo, Fabiana; Romano, Carmela; Granata, Vincenza; Marone, Pietro; Leone, Alessandra; Botti, Gerardo; Petrillo, Antonella; Caracò, Corradina; Iaffaioli, Vincenzo R.; Muto, Paolo; Romano, Giovanni; Comella, Pasquale; Budillon, Alfredo; Delrio, Paolo
2015-01-01
Background: We have previously shown that an intensified preoperative regimen including oxaliplatin plus raltitrexed and 5-fluorouracil/folinic acid (OXATOM/FUFA) during preoperative pelvic radiotherapy produced promising results in locally advanced rectal cancer (LARC). Preclinical evidence suggests that the scheduling of bevacizumab may be crucial to optimize its combination with chemo-radiotherapy. Patients and methods: This non-randomized, non-comparative, phase II study was conducted in MRI-defined high-risk LARC. Patients received three biweekly cycles of OXATOM/FUFA during RT. Bevacizumab was given 2 weeks before the start of chemo-radiotherapy, and on the same day as chemotherapy for 3 cycles (concomitant schedule A) or 4 days prior to the first and second cycles of chemotherapy (sequential schedule B). The primary endpoint was the pathological complete tumor regression (TRG1) rate. Results: Accrual for the concomitant schedule was terminated early because the number of TRG1 responses (2 of 16 patients) was statistically inconsistent with the hypothesis of activity (30%) to be tested. Conversely, the endpoint was reached with the sequential schedule, and the final TRG1 rate among 46 enrolled patients was 50% (95% CI 35%–65%). Neutropenia was the most common grade ≥3 toxicity with both schedules, but it was less pronounced with the sequential than the concomitant schedule (30% vs. 44%). Postoperative complications occurred in 8/15 (53%) and 13/46 (28%) patients in schedules A and B, respectively. At 5-year follow-up the probability of PFS and OS was 80% (95% CI, 66%–89%) and 85% (95% CI, 69%–93%), respectively, for the sequential schedule. Conclusions: These results highlight the relevance of bevacizumab scheduling to optimize its combination with preoperative chemo-radiotherapy in the management of LARC. PMID:26320185
ROC and Loss Function Analysis in Sequential Testing
ERIC Educational Resources Information Center
Muijtjens, Arno M. M.; Van Luijk, Scheltus J.; Van Der Vleuten, Cees P. M.
2006-01-01
Sequential testing is applied to reduce costs in SP-based tests (OSCEs). Initially, all candidates take a screening test consisting of a part of the OSCE. Candidates who fail the screen sit the complete test, whereas those who pass the screen are credited with a pass on the complete test. The procedure may result in a reduction of testing…
Endogenous Sequential Cortical Activity Evoked by Visual Stimuli
Miller, Jae-eun Kang; Hamm, Jordan P.; Jackson, Jesse; Yuste, Rafael
2015-01-01
Although the functional properties of individual neurons in primary visual cortex have been studied intensely, little is known about how neuronal groups could encode changing visual stimuli using temporal activity patterns. To explore this, we used in vivo two-photon calcium imaging to record the activity of neuronal populations in primary visual cortex of awake mice in the presence and absence of visual stimulation. Multidimensional analysis of the network activity allowed us to identify neuronal ensembles defined as groups of cells firing in synchrony. These synchronous groups of neurons were themselves activated in sequential temporal patterns, which repeated at much higher proportions than chance and were triggered by specific visual stimuli such as natural visual scenes. Interestingly, sequential patterns were also present in recordings of spontaneous activity without any sensory stimulation and were accompanied by precise firing sequences at the single-cell level. Moreover, intrinsic dynamics could be used to predict the occurrence of future neuronal ensembles. Our data demonstrate that visual stimuli recruit similar sequential patterns to the ones observed spontaneously, consistent with the hypothesis that already existing Hebbian cell assemblies firing in predefined temporal sequences could be the microcircuit substrate that encodes visual percepts changing in time. PMID:26063915
Choe, Yu-Kyong; Foster, Tammie; Asselin, Abigail; LeVander, Meagan; Baird, Jennifer
2017-04-01
Approximately 24% of stroke survivors experience co-occurring aphasia and hemiparesis. These individuals typically attend back-to-back therapy sessions. However, sequentially scheduled therapy may trigger physical and mental fatigue and have an adverse impact on treatment outcomes. The current study tested a hypothesis that exerting less effort during a therapy session would reduce overall fatigue and enhance functional recovery. Two stroke survivors chronically challenged by non-fluent aphasia and right hemiparesis sequentially completed verbal naming and upper-limb tasks on their home computers. The level of cognitive-linguistic effort in speech/language practice was manipulated by presenting verbal naming tasks in two conditions: Decreasing cues (i.e., most-to-least support for word retrieval), and Increasing cues (i.e., least-to-most support). The participants completed the same upper-limb exercises throughout the study periods. Both individuals showed a statistically significant advantage of decreasing cues over increasing cues in word retrieval during the practice period, but not at the end of the practice period or thereafter. The participant with moderate aphasia and hemiparesis achieved clinically meaningful gains in upper-limb functions following the decreasing cues condition, but not after the increasing cues condition. Preliminary findings from the current study suggest a positive impact of decreasing cues in the context of multidisciplinary stroke rehabilitation.
Peter, Beate
2018-01-01
In a companion study, adults with dyslexia and adults with a probable history of childhood apraxia of speech showed evidence of difficulty with processing sequential information during nonword repetition, multisyllabic real word repetition and nonword decoding. Results suggested that some errors arose in visual encoding during nonword reading, all levels of processing but especially short-term memory storage/retrieval during nonword repetition, and motor planning and programming during complex real word repetition. To further investigate the role of short-term memory, a participant with short-term memory impairment (MI) was recruited. MI was confirmed with poor performance during a sentence repetition and three nonword repetition tasks, all of which have a high short-term memory load, whereas typical performance was observed during tests of reading, spelling, and static verbal knowledge, all with low short-term memory loads. Experimental results show error-free performance during multisyllabic real word repetition but high counts of sequence errors, especially migrations and assimilations, during nonword repetition, supporting short-term memory as a locus of sequential processing deficit during nonword repetition. Results are also consistent with the hypothesis that during complex real word repetition, short-term memory is bypassed as the word is recognized and retrieved from long-term memory prior to producing the word.
Lee, Mei-Hua; Bodfish, James W; Lewis, Mark H; Newell, Karl M
2010-01-01
This study investigated the mean rate and time-dependent sequential organization of spontaneous eye blinks in adults with intellectual and developmental disability (IDD) and individuals from this group who were additionally categorized with stereotypic movement disorder (IDD+SMD). The mean blink rate was lower in the IDD+SMD group than the IDD group and both of these groups had a lower blink rate than a contrast group of healthy adults. In the IDD group the n to n+1 sequential organization over time of the eye-blink durations showed a stronger compensatory organization than the contrast group suggesting decreased complexity/dimensionality of eye-blink behavior. Very low blink rate (and thus insufficient time series data) precluded analysis of time-dependent sequential properties in the IDD+SMD group. These findings support the hypothesis that both IDD and SMD are associated with a reduction in the dimension and adaptability of movement behavior and that this may serve as a risk factor for the expression of abnormal movements.
Wells, Gary L; Steblay, Nancy K; Dysart, Jennifer E
2015-02-01
Eyewitnesses (494) to actual crimes in 4 police jurisdictions were randomly assigned to view simultaneous or sequential photo lineups using laptop computers and double-blind administration. The sequential procedure used in the field experiment mimicked how it is conducted in actual practice (e.g., using a continuation rule, the witness does not know how many photos are to be viewed, and witnesses resolve any multiple identifications), which is not how most lab experiments have tested the sequential lineup. No significant differences emerged in rates of identifying lineup suspects (25% overall), but the sequential procedure produced a significantly lower rate (11%) of identifying known-innocent lineup fillers than did the simultaneous procedure (18%). The simultaneous/sequential pattern did not significantly interact with estimator variables, and no lineup-position effects were observed for either the simultaneous or sequential procedure. Rates of nonidentification were not significantly different for the simultaneous and sequential procedures, but nonidentifiers from the sequential procedure were more likely to use the "not sure" response option than were nonidentifiers from the simultaneous procedure. Among witnesses who made an identification, 36% (41% of simultaneous and 32% of sequential) identified a known-innocent filler rather than a suspect, indicating that eyewitness performance overall was very poor. The results suggest that the sequential procedure as used in the field reduces the identification of known-innocent fillers, but the differences are relatively small.
Mitochondrial genomes of two Australian fishflies with an evolutionary timescale of Chauliodinae.
Yang, Fan; Jiang, Yunlan; Yang, Ding; Liu, Xingyue
2017-06-30
Fishflies (Corydalidae: Chauliodinae), with a total of ca. 130 extant species, are one of the major groups of the holometabolous insect order Megaloptera. As a group that originated during the Mesozoic, the phylogeny and historical biogeography of fishflies are of high interest. The previous hypothesis on the evolutionary history of fishflies was based primarily on morphological data. To further test the existing phylogenetic relationships and to understand the divergence pattern of fishflies, we conducted a molecular study. We determined the complete mitochondrial (mt) genomes of two Australian fishfly species, Archichauliodes deceptor Kimmins, 1954 and Protochauliodes biconicus Kimmins, 1954, both members of a major subgroup of Chauliodinae with high phylogenetic significance. A phylogenomic analysis was carried out based on 13 mt protein-coding genes (PCGs) and two rRNA genes from the megalopteran species with determined mt genomes. Both maximum likelihood and Bayesian inference analyses recovered the Dysmicohermes clade as the sister group of the Archichauliodes clade + the Protochauliodes clade, which is consistent with the previous morphology-based hypothesis. The divergence time estimation suggested that the divergence among the three major subgroups of fishflies occurred during the Late Jurassic and Early Cretaceous, when the supercontinent Pangaea was undergoing sequential breakup.
Cognitive Fatigue Facilitates Procedural Sequence Learning.
Borragán, Guillermo; Slama, Hichem; Destrebecqz, Arnaud; Peigneux, Philippe
2016-01-01
Enhanced procedural learning has been evidenced in conditions where cognitive control is diminished, including hypnosis, disruption of prefrontal activity and non-optimal time of the day. Another condition depleting the availability of controlled resources is cognitive fatigue (CF). We tested the hypothesis that CF, eventually leading to diminished cognitive control, facilitates procedural sequence learning. In a two-day experiment, 23 young healthy adults were administered a serial reaction time task (SRTT) following the induction of high or low levels of CF, in a counterbalanced order. CF was induced using the Time load Dual-back (TloadDback) paradigm, a dual working memory task that allows tailoring cognitive load levels to the individual's optimal performance capacity. In line with our hypothesis, reaction times (RT) in the SRTT were faster in the high- than in the low-level fatigue condition, and performance improvement was higher for the sequential than the motor components. Altogether, our results suggest a paradoxical, facilitating impact of CF on procedural motor sequence learning. We propose that facilitated learning in the high-level fatigue condition stems from a reduction in the cognitive resources devoted to cognitive control processes that normally oppose automatic procedural acquisition mechanisms.
Alternatives to the sequential lineup: the importance of controlling the pictures.
Lindsay, R C; Bellinger, K
1999-06-01
Because sequential lineups reduce false-positive choices, their use has been recommended (R. C. L. Lindsay, 1999; R. C. L. Lindsay & G. L. Wells, 1985). Blind testing is included in the recommended procedures. Police, concerned about blind testing, devised alternative procedures, including self-administered sequential lineups, to reduce use of relative judgments (G. L. Wells, 1984) while permitting the investigating officer to conduct the procedure. Identification data from undergraduates exposed to a staged crime (N = 165) demonstrated that 4 alternative identification procedures tested were less effective than the original sequential lineup. Allowing witnesses to control the photographs resulted in higher rates of false-positive identification. Self-reports of using relative judgments were shown to be postdictive of decision accuracy.
NASA Astrophysics Data System (ADS)
Selvaraj, A.; Nambi, I. M.
2014-12-01
In this study, an innovative technique of ZVI-mediated coupling of Fenton-like oxidation of phenol with Cr(VI) reduction was attempted. The hypothesis is that Fe3+ generated from the Cr(VI) reduction process acts as electron acceptor and catalyst for the Fenton phenol oxidation process. The Fe2+ formed in the Fenton reactions can be reused for Cr(VI) reduction. Iron can thus be made to cycle between the two reactions, changing back and forth between the Fe2+ and Fe3+ forms, which makes the treatment sustainable (Fig 1). This approach advances the current Fenton-like oxidation process by (i) removing heavy metal and organic matter in a single system, (ii) recycling iron species, so no additional iron is required, (iii) achieving a higher contaminant-removal-to-ZVI ratio, and (iv) eliminating sludge-related issues. Preliminary batch studies were conducted in two modes: (i) concurrent removal and (ii) sequential removal. Sequential removal was found better for in-situ PRB applications. The PRB was designed on the basis of the kinetic rate slope and half-life time obtained from a primary column study. This PRB has two segments: (i) a ZVI segment [Cr(VI)] and (ii) an iron-species segment [phenol]. This makes the treatment sustainable by (i) leaving no iron ions in the outlet stream and (ii) satisfying the hypothesis and extending the life span of the PRB. Sequential removal of contaminants was tested in a pilot-scale PRB (Fig 2) and its life span was calculated based on the exhaustion of the filling material. Aqueous, sand, and iron aliquots were collected at various segments of the PRB and analyzed thoroughly for precipitation and chemical speciation (UV spectrometer, XRD, FTIR, electron microscope). The chemical speciation profile eliminates the uncertainties over the in-situ PRB's long-term performance. Based on the pilot-scale PRB study, a field-level PRB wall construction was suggested to remove heavy metal and organic compounds from the Pallikaranai marshland (Fig 3), which is contaminated with leachate from the nearby Perungudi dumpsite.
This research provides (i) deeper insight into an environmentally friendly, accelerated, sustainable technique for the combined removal of organic matter and heavy metal, (ii) evaluation of the novel technique in a PRB, which resulted in the PRB's increased life span, and (iii) a PRB design to remediate the marshland and its ecosystem, thus saving the habitats related to it.
The Sequential Probability Ratio Test and Binary Item Response Models
ERIC Educational Resources Information Center
Nydick, Steven W.
2014-01-01
The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…
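The stopping rule described above — accumulating a log-likelihood ratio and comparing it to prespecified critical values — can be sketched in a few lines. This is a generic Wald SPRT under the usual boundary approximations log((1−β)/α) and log(β/(1−α)); the function name and defaults are illustrative assumptions, not the paper's implementation:

```python
import math

def sprt(log_lrs, alpha=0.05, beta=0.05):
    """Wald SPRT: accumulate per-observation log-likelihood ratios and
    stop as soon as the running sum crosses a decision boundary."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(log_lrs, start=1):
        llr += x
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(log_lrs)       # data exhausted, no decision yet
```

For an adaptive classification test, each `log_lrs` entry would be the log-likelihood ratio of the examinee's item response under the two ability values separated by the classification bound.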
A spatial scan statistic for multiple clusters.
Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie
2011-10-01
Spatial scan statistics are commonly used for geographical disease surveillance and cluster detection. When multiple clusters coexist in the study area, they become difficult to detect because of their shadowing effect on one another. The recently proposed sequential method showed better power for detecting the second, weaker cluster, but did not improve detection of the first, stronger cluster, which is more important than the second. We propose a new extension of the spatial scan statistic that can be used to detect multiple clusters. By constructing two or more clusters in the alternative hypothesis, our proposed method accounts for other coexisting clusters in the detection and evaluation process. The performance of the proposed method is compared to that of the sequential method in an intensive simulation study, in which our method shows better power both in rejecting the null hypothesis and in accurately detecting the coexisting clusters. In a real study of hand-foot-mouth disease data in Pingdu city, a true cluster town was successfully detected by our proposed method; it could not be evaluated as statistically significant by the standard method because of another cluster's shadowing effect. Copyright © 2011 Elsevier Inc. All rights reserved.
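The core of a spatial scan is a likelihood-ratio score maximized over candidate zones. A minimal one-dimensional sketch, assuming a Kulldorff-style Poisson score and contiguous windows only (function names and the window scheme are illustrative, not the paper's multi-cluster method):

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff-style Poisson log-likelihood ratio for a candidate zone:
    c observed cases in zone, e expected under the null, C total cases."""
    if c <= e or c == 0:
        return 0.0
    inside = c * math.log(c / e)
    outside = (C - c) * math.log((C - c) / (C - e)) if C > c else 0.0
    return inside + outside

def scan_1d(cases, pops, max_len=3):
    """Scan all contiguous windows of up to max_len regions and return
    the best-scoring zone as (llr, start_index, window_length)."""
    C, P = sum(cases), sum(pops)
    best = (0.0, None, None)
    for i in range(len(cases)):
        for L in range(1, max_len + 1):
            if i + L > len(cases):
                break
            c = sum(cases[i:i + L])
            e = C * sum(pops[i:i + L]) / P   # expected cases under the null
            llr = poisson_llr(c, e, C)
            if llr > best[0]:
                best = (llr, i, L)
    return best
```

Significance of the best zone would then be assessed by Monte Carlo replication under the null, which is where the shadowing problem the abstract describes arises when a second cluster inflates the null replicates.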
Sequential Cotard and Capgras delusions.
Wright, S; Young, A W; Hellawell, D J
1993-09-01
We report sequential Cotard and Capgras delusions in the same patient, KH, and offer a simple hypothesis to account for this link. The Cotard delusion occurred when KH was depressed and the Capgras delusion arose in the context of persecutory delusions. We suggest that the Cotard and Capgras delusions reflect different interpretations of similar anomalous experiences, and that the persecutory delusions and suspiciousness that are often noted in Capgras cases contribute to the patients' mistaking a change in themselves for a change in others ('they are impostors'), whereas people who are depressed exaggerate the negative effects of the same change whilst correctly attributing it to themselves ('I am dead'). This explains why there might be an underlying similarity between delusions which are phenomenally distinct.
Unraveling the Tangles of Language Evolution
NASA Astrophysics Data System (ADS)
Petroni, F.; Serva, M.; Volchenkov, D.
2012-07-01
The relationships between languages, molded by extremely complex social, cultural and political factors, are assessed by an automated method in which the distance between languages is estimated by the average normalized Levenshtein distance between words from a list of 200 meanings maximally resistant to change. A sequential process of language classification, described by random walks on the matrix of lexical distances, allows complex relationships between languages to be represented geometrically, in terms of distances and angles. We have tested the method on a sample of 50 Indo-European and 50 Austronesian languages. The geometric representation of language taxonomy allows accurate inferences to be made about the most significant events of human history by tracing changes in language families through time. The Anatolian and Kurgan hypotheses of the Indo-European origin and the "express train" model of the Polynesian origin are thoroughly discussed.
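The word-level distance used above is easy to state in code. This sketch assumes the common convention of normalizing by the longer word's length; the function names are illustrative:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance with a rolling row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def normalized_levenshtein(a, b):
    """Edit distance divided by the length of the longer word."""
    return levenshtein(a, b) / max(len(a), len(b)) if (a or b) else 0.0
```

The language-to-language distance would then be the average of `normalized_levenshtein` over the 200 meaning pairs in the Swadesh-style list.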
Cole, A J; Griffiths, D; Lavender, S; Summers, P; Rich, K
2006-01-01
Aims To test the hypothesis that artefact caused by postmortem off‐gassing is at least partly responsible for the presence of gas within the vascular system and tissues of the cadaver following death associated with compressed air diving. Methods Controlled experiment sacrificing sheep after a period of simulated diving in a hyperbaric chamber and carrying out sequential postmortem computed tomography (CT) on the cadavers. Results All the subject sheep developed significant quantities of gas in the vascular system within 24 hours, as demonstrated by CT and necropsy, while the control animals did not. Conclusions The presence of gas in the vascular system of human cadavers following diving associated fatalities is to be expected, and is not necessarily connected with gas embolism following pulmonary barotrauma, as has previously been claimed. PMID:16489175
NASA Astrophysics Data System (ADS)
Tan, Maxine; Leader, Joseph K.; Liu, Hong; Zheng, Bin
2015-03-01
We recently investigated a new mammographic image feature based risk factor to predict near-term breast cancer risk after a woman has a negative mammographic screening. We hypothesized that, unlike the conventional epidemiology-based long-term (or lifetime) risk factors, the mammographic image feature based risk factor value will increase as the time lag between the negative and positive mammography screenings decreases. The purpose of this study is to test this hypothesis. From a large and diverse full-field digital mammography (FFDM) image database with 1278 cases, we collected all available sequential FFDM examinations for each case, including the "current" and 1 to 3 most recent "prior" examinations. All "prior" examinations were interpreted as negative, and "current" ones were either malignant or recalled negative/benign. We computed 92 global mammographic texture and density based features, and included three clinical risk factors (woman's age, family history, and subjective breast density BIRADS rating). On this initial feature set, we applied a fast and accurate Sequential Forward Floating Selection (SFFS) feature selection algorithm to reduce feature dimensionality. The features computed on the two mammographic views were trained separately using two artificial neural network (ANN) classifiers, and the classification scores of the two ANNs were then merged with a sequential ANN. The results show that the maximum adjusted odds ratios were 5.59, 7.98, and 15.77 for the 3rd, 2nd, and 1st "prior" FFDM examinations, respectively, which demonstrates a stronger association between mammographic image feature change and the risk of developing breast cancer in the near term after a negative screening.
ERIC Educational Resources Information Center
Boekkooi-Timminga, Ellen
Nine methods for automated test construction are described. All are based on the concepts of information from item response theory. Two general kinds of methods for the construction of parallel tests are presented: (1) sequential test design; and (2) simultaneous test design. Sequential design implies that the tests are constructed one after the…
Acquiring Procedural Skills from Lesson Sequences.
1985-08-13
Teachers of Mathematics. Washington, DC: NCTM. Brueckner, L.J. (1930) Diagnostic and remedial teaching in arithmetic. Philadelphia, PA: Winston. Burton...arithmetic and algebra, from multi-lesson curricula. The central hypothesis is that students and teachers obey conventions that cause the goal hierarchy of the acquired procedure to be a particular structural function of the sequential
NASA Astrophysics Data System (ADS)
Masson, F.; Mouyen, M.; Hwang, C.; Wu, Y.-M.; Ponton, F.; Lehujeur, M.; Dorbath, C.
2012-11-01
Using a Bouguer anomaly map and a dense seismic data set, we have performed two studies to improve our knowledge of the deep structure of Taiwan. First, we model the Bouguer anomaly along a profile crossing the island using simple forward modelling. The modelling is 2D, under the assumption of cylindrical symmetry. Second, we present a joint analysis of gravity anomaly and seismic arrival time data recorded in Taiwan. An initial velocity model was obtained by local earthquake tomography (LET) of the seismological data. The LET velocity model was used to construct an initial 3D gravity model, using a linear velocity-density relationship (Birch's law). The synthetic Bouguer anomaly calculated for this model has the same shape and wavelength as the observed anomaly. However, some characteristics of the anomaly map are not retrieved. To derive a crustal velocity/density model that accounts for both types of observations, we performed a sequential inversion of the seismological and gravity data. The variance reduction of the arrival time data for the final sequential model was comparable to that obtained by simple LET. Moreover, the sequential model explained about 80% of the observed gravity anomaly. A new 3D model of the Taiwan lithosphere is presented.
Simple and flexible SAS and SPSS programs for analyzing lag-sequential categorical data.
O'Connor, B P
1999-11-01
This paper describes simple and flexible programs for analyzing lag-sequential categorical data, using SAS and SPSS. The programs read a stream of codes and produce a variety of lag-sequential statistics, including transitional frequencies, expected transitional frequencies, transitional probabilities, adjusted residuals, z values, Yule's Q values, likelihood ratio tests of stationarity across time and homogeneity across groups or segments, transformed kappas for unidirectional dependence, bidirectional dependence, parallel and nonparallel dominance, and significance levels based on both parametric and randomization tests.
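A minimal Python analogue of a few of the statistics these SAS/SPSS programs produce (lag-1 transitional frequencies, expected frequencies, and adjusted residuals) might look like this; the function name and the Allison-Liker form of the adjusted residual are assumptions for illustration, not the programs' own code:

```python
import math
from collections import Counter

def lag1_stats(codes):
    """Lag-1 sequential statistics for a stream of categorical codes:
    observed transitional frequencies, expected frequencies under
    independence, and adjusted residuals (z values)."""
    pairs = list(zip(codes, codes[1:]))
    n = len(pairs)
    freq = Counter(pairs)
    giv = Counter(a for a, _ in pairs)   # totals for the "given" behavior
    tgt = Counter(b for _, b in pairs)   # totals for the "target" behavior
    out = {}
    for (a, b), f in freq.items():
        exp = giv[a] * tgt[b] / n
        # adjusted residual in the Allison-Liker style
        var = exp * (1 - giv[a] / n) * (1 - tgt[b] / n)
        z = (f - exp) / math.sqrt(var) if var > 0 else float("nan")
        out[(a, b)] = {"obs": f, "exp": exp, "z": z}
    return out
```

Transitional probabilities, Yule's Q, and the stationarity/homogeneity tests the abstract lists would be built from the same observed and expected counts.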
[Dilemma of the null hypothesis in experimental tests of ecological hypotheses].
Li, Ji
2016-06-01
Experimental testing is one of the major methods for testing ecological hypotheses, though it draws many arguments because of the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis-deduction model of Platt (1964) and concluded that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent the statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and the alternative hypothesis H1' (α'=1, β'=0) in ecological processes differ from those of classical physics, the ecological null hypothesis cannot be strictly tested experimentally either. These dilemmas of the null hypothesis can be relieved by reducing the P value, carefully selecting the null hypothesis, non-centralizing the non-null hypothesis, and using two-tailed tests. However, statistical null hypothesis significance testing (NHST) should not be equated with the logical test of causality in an ecological hypothesis. Hence, findings and conclusions of methodological studies and experimental tests based on NHST are not always logically reliable.
2012-01-01
Background This paper explores smoking cessation participants’ perceptions of attempting weight management alongside smoking cessation within the context of a health improvement intervention implemented in Glasgow, Scotland. Methods One hundred and thirty-eight participants were recruited from smoking cessation classes in areas of multiple deprivation in Glasgow and randomised to intervention, receiving dietary advice, or to control groups. The primary outcome of the study was to determine the % change in body weight. Semi-structured interviews were conducted with a purposive sample of 15 intervention and 15 control participants at weeks 6 (during the intervention) and 24 (at the end of the intervention). The current paper, though predominantly qualitative, links perceptions of behaviour modification to % weight change and cessation rates at week 24 thereby enabling a better understanding of the mediators influencing multiple behaviour change. Results Our findings suggest that participants who perceive separate behaviour changes as part of a broader approach to a healthier lifestyle, and hence attempt behaviour changes concurrently, may be at comparative advantage in positively achieving dual outcomes. Conclusions These findings highlight the need to assess participants’ preference for attempting multiple behaviour changes sequentially or simultaneously in addition to assessing their readiness to change. Further testing of this hypothesis is warranted. Trial Registration ISRCTN94961361 PMID:22759785
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulver, A.E.; Wolyniec, P.S.; Lasseter, V.K.
To identify genes responsible for the susceptibility for schizophrenia, and to test the hypothesis that schizophrenia is etiologically heterogeneous, we have studied 39 multiplex families from a systematic sample of schizophrenic patients. Using a complex autosomal dominant model, which considers only those with a diagnosis of schizophrenia or schizoaffective disorder as affected, a random search of the genome for detection of linkage was undertaken. Pairwise linkage analyses suggest a potential linkage (LRH = 34.7 or maximum lod score = 1.54) for one region (22q12-q13.1). Reanalyses, varying parameters in the dominant model, maximized the LRH at 660.7 (maximum lod score 2.82). This finding is of sufficient interest to warrant further investigation through collaborative studies. 72 refs., 5 tabs.
Asset surveillance system: apparatus and method
NASA Technical Reports Server (NTRS)
Bickford, Randall L. (Inventor)
2007-01-01
System and method for providing surveillance of an asset, comprising: numerically fitting at least one mathematical model to obtained residual data correlative to asset operation; storing at least one mathematical model in a memory; obtaining a current set of signal data from the asset; retrieving at least one mathematical model from the memory; using the retrieved mathematical model in a sequential hypothesis test for determining if the current set of signal data is indicative of a fault condition; determining an asset fault cause correlative to a determined indication of a fault condition; providing an indication correlative to a determined fault cause; and taking an action when warranted. The residual data can be mode partitioned, a current mode of operation can be determined from the asset, and at least one mathematical model can be retrieved from the memory as a function of the determined mode of operation.
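A sequential hypothesis test on model residuals of the kind this record describes can be sketched as a Wald SPRT for a mean shift in approximately Gaussian residuals; the parameter values, labels, and function name below are illustrative assumptions, not the patented method:

```python
import math

def residual_sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0,
                  alpha=0.01, beta=0.01):
    """SPRT on model residuals for a mean-shift fault.
    H0: residual mean mu0 (healthy) vs H1: residual mean mu1 (fault)."""
    upper = math.log((1 - beta) / alpha)   # declare fault
    lower = math.log(beta / (1 - alpha))   # declare healthy
    llr = 0.0
    for n, r in enumerate(residuals, start=1):
        # Gaussian log-likelihood-ratio increment for one residual sample
        llr += (mu1 - mu0) * (r - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= upper:
            return "fault", n
        if llr <= lower:
            return "healthy", n
    return "undecided", len(residuals)
```

In a mode-partitioned scheme, a separate fitted model (and hence a separate residual stream) would be selected per operating mode before running the test.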
Configural and component processing in simultaneous and sequential lineup procedures.
Flowe, Heather D; Smith, Harriet M J; Karoğlu, Nilda; Onwuegbusi, Tochukwu O; Rai, Lovedeep
2016-01-01
Configural processing supports accurate face recognition, yet it has never been examined within the context of criminal identification lineups. We tested, using the inversion paradigm, the role of configural processing in lineups. Recent research has found that face discrimination accuracy in lineups is better in a simultaneous compared to a sequential lineup procedure. Therefore, we compared configural processing in simultaneous and sequential lineups to examine whether there are differences. We had participants view a crime video, and then they attempted to identify the perpetrator from a simultaneous or sequential lineup. The test faces were presented either upright or inverted, as previous research has shown that inverting test faces disrupts configural processing. The size of the inversion effect for faces was the same across lineup procedures, indicating that configural processing underlies face recognition in both procedures. Discrimination accuracy was comparable across lineup procedures in both the upright and inversion condition. Theoretical implications of the results are discussed.
[Bilateral cochlear implants in children: acquisition of binaural hearing].
Ramos-Macías, Angel; Deive-Maggiolo, Leopoldo; Artiles-Cabrera, Ovidio; González-Aguado, Rocío; Borkoski-Barreiro, Silvia A; Masgoret-Palau, Elizabeth; Falcón-González, Juan C; Bueno-Yanes, Jorge
2013-01-01
Several studies have indicated the benefit of bilateral cochlear implants in the acquisition of binaural hearing and bilateralism. In children with cochlear implants, is it possible to achieve binaurality after a second implant? When is the ideal time to implant? The objective of this study was to analyse the binaural effect in children with bilateral implants and the differences between subjects with simultaneous implants and sequential implants at both short and long intervals. There were 90 patients between 1 and 2 years of age at the first surgery, implanted between 2000 and 2008. Of these, 25 were unilateral users and 65 bilateral; 17 patients had received simultaneous implants, 29 received the second implant within 12 months of the first (short interimplant interval) and 19 after 12 months (long interval). All were tested for verbal perception in quiet and in noise, and tonal threshold audiometry was performed. On the perception-in-quiet test, the simultaneous and short-interval sequential implant patients (mean: 84.67%) differed significantly from the unilateral and long-interval sequential implant patients (mean: 79.66%) (P=0.23). Likewise, the perception-in-noise test showed a statistically significant difference (P=0.22) between the simultaneous and short-interval sequential implant patients (mean: 77.17%) and the unilateral and long-interval sequential ones (mean: 69.32%). Patients with simultaneous and short-interval sequential implants acquired the advantages of binaural hearing. Copyright © 2012 Elsevier España, S.L. All rights reserved.
Comparison of Performance of Eight-Year-Old Children on Three Auditory Sequential Memory Tests.
ERIC Educational Resources Information Center
Chermak, Gail D.; O'Connell, Vickie I.
1981-01-01
Twenty normal children were administered three tests of auditory sequential memory. A Pearson product-moment correlation of .50 and coefficients of determination showed all but one relationship to be nonsignificant and predictability between pairs of scores to be poor. (Author)
Sequential Computerized Mastery Tests--Three Simulation Studies
ERIC Educational Resources Information Center
Wiberg, Marie
2006-01-01
A simulation study of a sequential computerized mastery test is carried out with items modeled with the 3 parameter logistic item response theory model. The examinees' responses are either identically distributed, not identically distributed, or not identically distributed together with estimation errors in the item characteristics. The…
Biomechanical effects of hydration in vocal fold tissues.
Chan, Roger W; Tayama, Niro
2002-05-01
It has often been hypothesized, with little empirical support, that vocal fold hydration affects voice production by mediating changes in vocal fold tissue rheology. To test this hypothesis, we attempted in this study to quantify the effects of hydration on the viscoelastic shear properties of vocal fold tissues in vitro. Osmotic changes in hydration (dehydration and rehydration) of 5 excised canine larynges were induced by sequential incubation of the tissues in isotonic, hypertonic, and hypotonic solutions. Elastic shear modulus (G'), dynamic viscosity (eta'), and damping ratio (zeta) of the vocal fold mucosa (lamina propria) were measured as a function of frequency (0.01 to 15 Hz) with a torsional rheometer. Vocal fold tissue stiffness (G') and viscosity (eta') increased significantly (by 4 to 7 times) with the osmotically induced dehydration, whereas they decreased by 22% to 38% on the induced rehydration. Damping ratio (zeta) also increased with dehydration and decreased with rehydration, but the detected differences were not statistically significant at all frequencies. These findings support the long-standing hypothesis that hydration affects vocal fold vibration by altering tissue rheologic (or viscoelastic) properties. Our results demonstrated the biomechanical importance of hydration in vocal fold tissues and suggested that hydration approaches may potentially improve the biomechanics of phonation in vocal fold lesions involving disordered fluid balance.
Iwasaki, Miho; Noguchi, Yasuki; Kakigi, Ryusuke
2018-06-07
Some researchers in aesthetics assume that visual features related to aesthetic perception (e.g. the golden ratio and symmetry) are commonly embedded in masterpieces. If this is true, an intriguing hypothesis is that the human brain has neural circuitry specialized for the processing of visual beauty. Here we tested this hypothesis by combining a neuroimaging technique with the repetition suppression (RS) paradigm. Subjects (non-experts in art) viewed two images of sculptures presented sequentially. Some sculptures obeyed the golden ratio (canonical images), while the golden proportion was distorted in others (deformed images). We found that the occipito-temporal cortex in the right hemisphere showed the RS when a canonical sculpture (e.g. Venus de Milo) was repeatedly presented, but not when its deformed version was repeated. Furthermore, the right parietal cortex showed the RS to the canonical proportion even when two sculptures had different identities (e.g. Venus de Milo as the first stimulus and David di Michelangelo as the second), indicating that this region encodes the golden ratio as an abstract rule shared by different sculptures. These results suggest two separate stages of neural processing for aesthetic information (one in the occipito-temporal and another in the parietal regions) that are hierarchically arranged in the human brain.
A maximally selected test of symmetry about zero.
Laska, Eugene; Meisner, Morris; Wanderling, Joseph
2012-11-20
The problem of testing symmetry about zero has a long and rich history in the statistical literature. We introduce a new test that sequentially discards observations whose absolute value is below increasing thresholds defined by the data. McNemar's statistic is obtained at each threshold and the largest is used as the test statistic. We obtain the exact distribution of this maximally selected McNemar and provide tables of critical values and a program for computing p-values. Power is compared with the t-test, the Wilcoxon Signed Rank Test and the Sign Test. The new test, MM, is slightly less powerful than the t-test and Wilcoxon Signed Rank Test for symmetric normal distributions with nonzero medians and substantially more powerful than all three tests for asymmetric mixtures of normal random variables with or without zero medians. The motivation for this test derives from the need to appraise the safety profile of new medications. If pre and post safety measures are obtained, then under the null hypothesis, the variables are exchangeable and the distribution of their difference is symmetric about a zero median. Large pre-post differences are the major concern of a safety assessment. The discarded small observations are not particularly relevant to safety and can reduce power to detect important asymmetry. The new test was utilized on data from an on-road driving study performed to determine if a hypnotic, a drug used to promote sleep, has next day residual effects. Copyright © 2012 John Wiley & Sons, Ltd.
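The construction described in this abstract (sequentially discard observations whose absolute value falls below data-driven thresholds, compute McNemar's statistic on the signs of the survivors at each threshold, and take the largest value) can be sketched as follows. This is an illustrative implementation, not the authors' code: the function name is hypothetical, the per-threshold statistic is the familiar chi-square form of McNemar's test, and in practice the p-value of the maximum comes from the exact distribution and tables the paper provides.

```python
import numpy as np

def maximally_selected_mcnemar(diffs):
    """Sketch of the maximally selected McNemar (MM) statistic:
    at each threshold defined by the data, observations with smaller
    absolute values are discarded, McNemar's chi-square is computed
    on the signs of the remaining differences, and the largest value
    over all thresholds is returned."""
    d = np.asarray(diffs, dtype=float)
    d = d[d != 0]                          # zeros carry no sign information
    thresholds = np.sort(np.unique(np.abs(d)))
    best = 0.0
    for c in thresholds:
        kept = d[np.abs(d) >= c]           # discard |difference| below threshold
        n_pos = int(np.sum(kept > 0))
        n_neg = int(np.sum(kept < 0))
        if n_pos + n_neg == 0:
            continue
        stat = (n_pos - n_neg) ** 2 / (n_pos + n_neg)  # McNemar chi-square
        best = max(best, stat)
    return best
```

A strongly asymmetric set of pre-post differences yields a large maximum, while a sign-balanced set yields zero, matching the motivation of detecting large one-sided safety signals.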
Effects of neostriatal 6-OHDA lesion on performance in a rat sequential reaction time task.
Domenger, D; Schwarting, R K W
2008-10-31
Work in humans and monkeys has provided evidence that the basal ganglia, and the neurotransmitter dopamine therein, play an important role in sequential learning and performance. Compared to primates, experimental work in rodents is rather sparse, largely because tasks comparable to the human ones, especially serial reaction time tasks (SRTT), had been lacking until recently. We have developed a rat model of the SRTT, which allows the study of neural correlates of sequential performance and motor sequence execution. Here, we report the effects of dopaminergic neostriatal lesions, performed using bilateral 6-hydroxydopamine injections, on performance of well-trained rats tested in our SRTT. Sequential behavior was measured in two ways: first, the effects of small violations of otherwise well-trained sequences were examined as a measure of attention and automation; second, sequential versus random performance was compared as a measure of sequential learning. Neurochemically, the lesions led to sub-total dopamine depletions in the neostriatum, which ranged around 60% in the lateral, and around 40% in the medial, neostriatum. These lesions led to a general instrumental impairment in terms of reduced speed (response latencies) and response rate, and these deficits were correlated with the degree of striatal dopamine loss. Furthermore, the violation test indicated that the lesion group conducted less automated responses. The comparison of random versus sequential responding showed that the lesion group did not retain its superior sequential performance in terms of speed, whereas it did in terms of accuracy. Also, rats with lesions did not improve further in overall performance relative to pre-lesion values, whereas controls did. These results support previous findings that neostriatal dopamine is involved in instrumental behaviour in general.
Also, these lesions are not sufficient to completely abolish sequential performance, at least when acquired before lesion as tested here.
Goodness of fit of probability distributions for sightings as species approach extinction.
Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael
2009-04-01
Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
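As a sketch of the PPCC idea used in this abstract: for a hypothesized uniform model, the quantile function is linear in the plotting position, so the PPCC reduces to the correlation between the ordered sighting times and the positions i/(n+1), with values near 1 indicating good fit. The function below is a minimal illustration (its name is hypothetical); it omits the L-moment diagrams and the critical-value tables on which the formal hypothesis test relies.

```python
import numpy as np

def ppcc_uniform(sightings):
    """Probability plot correlation coefficient (PPCC) against a
    uniform model: correlate the ordered sightings with the Weibull
    plotting positions i/(n+1). Because uniform quantiles are linear
    in the plotting position, no parameter estimates are needed for
    the correlation itself."""
    x = np.sort(np.asarray(sightings, dtype=float))
    p = np.arange(1, len(x) + 1) / (len(x) + 1)  # plotting positions
    return float(np.corrcoef(x, p)[0, 1])
```

Evenly spread sighting years correlate almost perfectly with the uniform plotting positions, while a clustered record (many early sightings, one late one) produces a visibly lower coefficient.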
Mirror neurons, birdsong, and human language: a hypothesis.
Levy, Florence
2011-01-01
The mirror system hypothesis and investigations of birdsong are reviewed in relation to the significance for the development of human symbolic and language capacity, in terms of three fundamental forms of cognitive reference: iconic, indexical, and symbolic. Mirror systems are initially iconic but can progress to indexical reference when produced without the need for concurrent stimuli. Developmental stages in birdsong are also explored with reference to juvenile subsong vs complex stereotyped adult syllables, as an analogy with human language development. While birdsong remains at an indexical reference stage, human language benefits from the capacity for symbolic reference. During a pre-linguistic "babbling" stage, recognition of native phonemic categories is established, allowing further development of subsequent prefrontal and linguistic circuits for sequential language capacity.
Hejnol, Andreas; Lowe, Christopher J
2015-12-19
Molecular biology has provided a rich dataset to develop hypotheses of nervous system evolution. The startling patterning similarities between distantly related animals during the development of their central nervous system (CNS) have resulted in the hypothesis that a CNS with a single centralized medullary cord and a partitioned brain is homologous across bilaterians. However, the ability to precisely reconstruct ancestral neural architectures from molecular genetic information requires that these gene networks specifically map with particular neural anatomies. A growing body of literature representing the development of a wider range of metazoan neural architectures demonstrates that patterning gene network complexity is maintained in animals with more modest levels of neural complexity. Furthermore, a robust phylogenetic framework that provides the basis for testing the congruence of these homology hypotheses has been lacking since the advent of the field of 'evo-devo'. Recent progress in molecular phylogenetics is refining the necessary framework to test previous homology statements that span large evolutionary distances. In this review, we describe recent advances in animal phylogeny and exemplify for two neural characters-the partitioned brain of arthropods and the ventral centralized nerve cords of annelids-a test for congruence using this framework. The sequential sister taxa at the base of Ecdysozoa and Spiralia comprise small, interstitial groups. This topology is not consistent with the hypothesis of homology of tripartitioned brain of arthropods and vertebrates as well as the ventral arthropod and rope-like ladder nervous system of annelids. There can be exquisite conservation of gene regulatory networks between distantly related groups with contrasting levels of nervous system centralization and complexity. Consequently, the utility of molecular characters to reconstruct ancestral neural organization in deep time is limited. © 2015 The Authors.
Comparison of futility monitoring guidelines using completed phase III oncology trials.
Zhang, Qiang; Freidlin, Boris; Korn, Edward L; Halabi, Susan; Mandrekar, Sumithra; Dignam, James J
2017-02-01
Futility (inefficacy) interim monitoring is an important component in the conduct of phase III clinical trials, especially in life-threatening diseases. Desirable futility monitoring guidelines allow timely stopping if the new therapy is harmful or if it is unlikely to be shown sufficiently effective were the trial to continue to its final analysis. There are a number of analytical approaches that are used to construct futility monitoring boundaries. The most common approaches are based on conditional power, sequential testing of the alternative hypothesis, or sequential confidence intervals. The resulting futility boundaries vary considerably with respect to the level of evidence required for recommending stopping the study. We evaluate the performance of commonly used methods using event histories from completed phase III clinical trials of the Radiation Therapy Oncology Group, Cancer and Leukemia Group B, and North Central Cancer Treatment Group. We considered published superiority phase III trials with survival endpoints initiated after 1990. There are 52 studies available for this analysis from different disease sites. Total sample size and maximum number of events (statistical information) for each study were calculated using protocol-specified effect size, type I and type II error rates. In addition to the common futility approaches, we considered a recently proposed linear inefficacy boundary approach with an early harm look followed by several lack-of-efficacy analyses. For each futility approach, interim test statistics were generated for three schedules with different analysis frequency, and early stopping was recommended if the interim result crossed a futility stopping boundary. For trials not demonstrating superiority, the impact of each rule is summarized as savings on sample size, study duration, and information time scales. 
For negative studies, our results show that the futility approaches based on testing the alternative hypothesis and on repeated confidence interval rules yielded smaller savings than the other two rules. These boundaries are too conservative, especially during the first half of the study (<50% of information). The conditional power rules are too aggressive during the second half of the study (>50% of information) and may stop a trial even when there is a clinically meaningful treatment effect. The linear inefficacy boundary with three or more interim analyses provided the best results. For positive studies, we demonstrated that none of the futility rules would have stopped the trials. The linear inefficacy boundary futility approach is attractive from statistical, clinical, and logistical standpoints in clinical trials evaluating new anti-cancer agents.
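For orientation, the conditional-power quantity that one family of these futility rules thresholds has a closed form under the usual Brownian-motion approximation to group-sequential monitoring. The sketch below is illustrative only (the function name and the one-sided alpha = 0.025 default are assumptions, not the paper's implementation): it returns the probability that the final z-statistic exceeds its critical value, given the interim z at information fraction t and an assumed drift theta, the expected final z under the assumed treatment effect. A conditional-power futility rule stops when this value falls below some cutoff such as 0.1.

```python
from math import sqrt
from statistics import NormalDist

def conditional_power(z_interim, t, theta, alpha=0.025):
    """Conditional power under the Brownian-motion approximation.
    Given the interim z-score at information fraction t, the final
    z-statistic is normal with mean z_interim*sqrt(t) + theta*(1 - t)
    and variance 1 - t; return the probability it exceeds the
    one-sided critical value."""
    nd = NormalDist()
    z_crit = nd.inv_cdf(1 - alpha)
    mean_final = z_interim * sqrt(t) + theta * (1 - t)
    return 1 - nd.cdf((z_crit - mean_final) / sqrt(1 - t))
```

A strongly positive interim result at half information gives conditional power near 1 under a favorable drift, while a negative interim trend with no assumed effect gives a value near 0, which is the situation in which such rules recommend stopping for futility.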
Bayes factor design analysis: Planning for compelling evidence.
Schönbrodt, Felix D; Wagenmakers, Eric-Jan
2018-02-01
A sizeable literature exists on the use of frequentist power analysis in the null-hypothesis significance testing (NHST) paradigm to facilitate the design of informative experiments. In contrast, there is almost no literature that discusses the design of experiments when Bayes factors (BFs) are used as a measure of evidence. Here we explore Bayes Factor Design Analysis (BFDA) as a useful tool to design studies for maximum efficiency and informativeness. We elaborate on three possible BF designs, (a) a fixed-n design, (b) an open-ended Sequential Bayes Factor (SBF) design, where researchers can test after each participant and can stop data collection whenever there is strong evidence for either H1 or H0, and (c) a modified SBF design that defines a maximal sample size where data collection is stopped regardless of the current state of evidence. We demonstrate how the properties of each design (i.e., expected strength of evidence, expected sample size, expected probability of misleading evidence, expected probability of weak evidence) can be evaluated using Monte Carlo simulations and equip researchers with the necessary information to compute their own Bayesian design analyses.
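The Monte Carlo evaluation of an open-ended SBF design can be sketched as follows. This is not the authors' code: the function name is hypothetical, data are drawn from a normal model with unit variance, and a BIC approximation stands in for the default-prior Bayes factor used in the paper. Design properties such as expected sample size and the probability of misleading evidence would be estimated by tallying many such runs.

```python
import math
import random

def sbf_trial(effect, bf_bound=10.0, n_min=10, n_max=500, rng=random):
    """One simulated open-ended Sequential Bayes Factor run: add one
    observation at a time and stop once the (BIC-approximated) Bayes
    factor exceeds bf_bound in favor of H1 (free mean) or falls below
    1/bf_bound in favor of H0 (mean = 0). Returns (decision, n)."""
    data = []
    while len(data) < n_max:
        data.append(rng.gauss(effect, 1.0))
        n = len(data)
        if n < n_min:
            continue                       # minimum sample before testing
        mean = sum(data) / n
        var1 = sum((x - mean) ** 2 for x in data) / n  # H1: free mean (MLE)
        var0 = sum(x ** 2 for x in data) / n           # H0: mean fixed at 0
        bic1 = n * math.log(var1) + math.log(n)        # one extra parameter
        bic0 = n * math.log(var0)
        bf10 = math.exp((bic0 - bic1) / 2)             # BIC approximation
        if bf10 >= bf_bound:
            return "H1", n
        if bf10 <= 1 / bf_bound:
            return "H0", n
    return "inconclusive", n_max
```

With a large true effect, nearly every run stops early for H1, illustrating the sample-size savings of sequential designs relative to a fixed-n plan.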
Sequence-specific procedural learning deficits in children with specific language impairment.
Hsu, Hsinjen Julie; Bishop, Dorothy V M
2014-05-01
This study tested the procedural deficit hypothesis of specific language impairment (SLI) by comparing children's performance in two motor procedural learning tasks and an implicit verbal sequence learning task. Participants were 7- to 11-year-old children with SLI (n = 48), typically developing age-matched children (n = 20) and younger typically developing children matched for receptive grammar (n = 28). In a serial reaction time task, the children with SLI performed at the same level as the grammar-matched children, but poorer than age-matched controls in learning motor sequences. When tested with a motor procedural learning task that did not involve learning sequential relationships between discrete elements (i.e. pursuit rotor), the children with SLI performed comparably with age-matched children and better than younger grammar-matched controls. In addition, poor implicit learning of word sequences in a verbal memory task (the Hebb effect) was found in the children with SLI. Together, these findings suggest that SLI might be characterized by deficits in learning sequence-specific information, rather than generally weak procedural learning. © 2014 The Authors. Developmental Science Published by John Wiley & Sons Ltd.
Dissociable contributions of motor-execution and action-observation to intramanual transfer.
Hayes, Spencer J; Elliott, Digby; Andrew, Matthew; Roberts, James W; Bennett, Simon J
2012-09-01
We examined the hypothesis that different processes and representations are associated with the learning of a movement sequence through motor-execution and action-observation. Following a pre-test in which participants attempted to achieve an absolute, and relative, time goal in a sequential goal-directed aiming movement, participants received either physical or observational practice with feedback. Post-test performance indicated that motor-execution and action-observation participants learned equally well. Participants then transferred to conditions where the gain between the limb movements and their visual consequences were manipulated. Under both bigger and smaller transfer conditions, motor-execution and action-observation participants exhibited similar intramanual transfer of absolute timing. However, participants in the action-observation group exhibited superior transfer of relative timing than the motor-execution group. These findings suggest that learning via action-observation is underpinned by a visual-spatial representation, while learning via motor-execution depends more on specific force-time planning (feed forward) and afferent processing associated with sensorimotor feedback. These behavioural effects are discussed with reference to neural processes associated with striatum, cerebellum and motor cortical regions (pre-motor cortex; SMA; pre-SMA).
Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test
NASA Technical Reports Server (NTRS)
Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara
2016-01-01
The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. Using a modified version of Wald's sequential probability ratio test, we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the concept of sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.
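The core of Wald's SPRT, which the virtual-bumper approach above modifies, can be sketched generically: accumulate the log-likelihood ratio of each observation and stop at the first crossing of boundaries set by the desired error rates. The function below is an illustrative sketch, not the paper's implementation; the names and default error rates are placeholders.

```python
import math

def sprt(log_lrs, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test: accumulate
    log-likelihood ratios and stop at the first crossing of
    the boundaries log((1-beta)/alpha) and log(beta/(1-alpha)).
    Returns (decision, number of observations used)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 above this
    lower = math.log(beta / (1 - alpha))   # accept H0 below this
    s = 0.0
    for n, llr in enumerate(log_lrs, start=1):
        s += llr
        if s >= upper:
            return "accept H1", n
        if s <= lower:
            return "accept H0", n
    return "continue", len(log_lrs)
```

Evidence consistently favoring one hypothesis crosses a boundary after only a few observations, which is the efficiency property the rover application exploits to cut false positives without long waits.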
Energetics, kinetics, and pathway of SNARE folding and assembly revealed by optical tweezers.
Zhang, Yongli
2017-07-01
Soluble N-ethylmaleimide-sensitive factor attachment protein receptors (SNAREs) are universal molecular engines that drive membrane fusion. Particularly, synaptic SNAREs mediate fast calcium-triggered fusion of neurotransmitter-containing vesicles with plasma membranes for synaptic transmission, the basis of all thought and action. During membrane fusion, complementary SNAREs located on two apposed membranes (often called t- and v-SNAREs) join together to assemble into a parallel four-helix bundle, releasing the energy to overcome the energy barrier for fusion. A long-standing hypothesis suggests that SNAREs act like a zipper to draw the two membranes into proximity and thereby force them to fuse. However, a quantitative test of this SNARE zippering hypothesis was hindered by difficulties to determine the energetics and kinetics of SNARE assembly and to identify the relevant folding intermediates. Here, we first review different approaches that have been applied to study SNARE assembly and then focus on high-resolution optical tweezers. We summarize the folding energies, kinetics, and pathways of both wild-type and mutant SNARE complexes derived from this new approach. These results show that synaptic SNAREs assemble in four distinct stages with different functions: slow N-terminal domain association initiates SNARE assembly; a middle domain suspends and controls SNARE assembly; and rapid sequential zippering of the C-terminal domain and the linker domain directly drive membrane fusion. In addition, the kinetics and pathway of the stagewise assembly are shared by other SNARE complexes. These measurements prove the SNARE zippering hypothesis and suggest new mechanisms for SNARE assembly regulated by other proteins. © 2017 The Protein Society.
Current-State Constrained Filter Bank for Wald Testing of Spacecraft Conjunctions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2012-01-01
We propose a filter bank consisting of an ordinary current-state extended Kalman filter, and two similar but constrained filters: one is constrained by a null hypothesis that the miss distance between two conjuncting spacecraft is inside their combined hard body radius at the predicted time of closest approach, and one is constrained by an alternative complementary hypothesis. The unconstrained filter is the basis of an initial screening for close approaches of interest. Once the initial screening detects a possibly risky conjunction, the unconstrained filter also governs measurement editing for all three filters, and predicts the time of closest approach. The constrained filters operate only when conjunctions of interest occur. The computed likelihoods of the innovations of the two constrained filters form a ratio for a Wald sequential probability ratio test. The Wald test guides risk mitigation maneuver decisions based on explicit false alarm and missed detection criteria. Since only current-state Kalman filtering is required to compute the innovations for the likelihood ratio, the present approach does not require the mapping of probability density forward to the time of closest approach. Instead, the hard-body constraint manifold is mapped to the filter update time by applying a sigma-point transformation to a projection function. Although many projectors are available, we choose one based on Lambert-style differential correction of the current-state velocity. We have tested our method using a scenario based on the Magnetospheric Multi-Scale mission, scheduled for launch in late 2014. This mission involves formation flight in highly elliptical orbits of four spinning spacecraft equipped with antennas extending 120 meters tip-to-tip. Eccentricities range from 0.82 to 0.91, and close approaches generally occur in the vicinity of perigee, where rapid changes in geometry may occur. 
Testing the method using two 12,000-case Monte Carlo simulations, we found the method achieved a missed detection rate of 0.1%, and a false alarm rate of 2%.
Discovering Visual Scanning Patterns in a Computerized Cancellation Test
ERIC Educational Resources Information Center
Huang, Ho-Chuan; Wang, Tsui-Ying
2013-01-01
The purpose of this study was to develop an attention sequential mining mechanism for investigating the sequential patterns of children's visual scanning process in a computerized cancellation test. Participants had to locate and cancel the target amongst other non-targets in a structured form, and a random form with Chinese stimuli. Twenty-three…
Jones, Sarah E.
2016-01-01
Degeneracy of respiratory network function would imply that anatomically discrete aspects of the brain stem are capable of producing respiratory rhythm. To test this theory we a priori transected brain stem preparations before reperfusion and reoxygenation at 4 rostrocaudal levels: 1.5 mm caudal to obex (n = 5), at obex (n = 5), and 1.5 (n = 7) and 3 mm (n = 6) rostral to obex. The respiratory activity of these preparations was assessed via recordings of phrenic and vagal nerves and lumbar spinal expiratory motor output. Preparations with a priori transection at level of the caudal brain stem did not produce stable rhythmic respiratory bursting, even when the arterial chemoreceptors were stimulated with sodium cyanide (NaCN). Reperfusion of brain stems that preserved the pre-Bötzinger complex (pre-BötC) showed spontaneous and sustained rhythmic respiratory bursting at low phrenic nerve activity (PNA) amplitude that occurred simultaneously in all respiratory motor outputs. We refer to this rhythm as the pre-BötC burstlet-type rhythm. Conserving circuitry up to the pontomedullary junction consistently produced robust high-amplitude PNA at lower burst rates, whereas sequential motor patterning across the respiratory motor outputs remained absent. Some of the rostrally transected preparations expressed both burstlet-type and regular PNA amplitude rhythms. Further analysis showed that the burstlet-type rhythm and high-amplitude PNA had 1:2 quantal relation, with burstlets appearing to trigger high-amplitude bursts. We conclude that no degenerate rhythmogenic circuits are located in the caudal medulla oblongata and confirm the pre-BötC as the primary rhythmogenic kernel. The absence of sequential motor patterning in a priori transected preparations suggests that pontine circuits govern respiratory pattern formation. PMID:26888109
Robustness of the sequential lineup advantage.
Gronlund, Scott D; Carlson, Curt A; Dailey, Sarah B; Goodsell, Charles A
2009-06-01
A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup advantages and 3 significant simultaneous advantages. Both sequential advantages occurred when the good photograph of the guilty suspect or either innocent suspect was in the fifth position in the sequential lineup; all 3 simultaneous advantages occurred when the poorer quality photograph of the guilty suspect or either innocent suspect was in the second position. Adjusting the statistical criterion to control for the multiple tests (.05/24) revealed no significant sequential advantages. Moreover, despite finding more conservative overall choosing for the sequential lineup, no support was found for the proposal that a sequential advantage was due to that conservative criterion shift. Unless lineups with particular characteristics predominate in the real world, there appears to be no strong preference for conducting lineups in either a sequential or a simultaneous manner. (PsycINFO Database Record (c) 2009 APA, all rights reserved).
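The adjustment described in this abstract (.05/24) is a plain Bonferroni correction; Holm's (1979) step-down procedure, which much of the familywise-error literature builds on, controls the same error rate while being uniformly at least as powerful. A minimal sketch of both, for illustration (function names are ours):

```python
def holm_rejections(p_values, alpha=0.05):
    """Holm's step-down procedure: walk the p-values in ascending
    order, comparing the k-th smallest against alpha / (m - k)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for k, i in enumerate(order):
        if p_values[i] <= alpha / (m - k):
            reject[i] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return reject

def bonferroni_rejections(p_values, alpha=0.05):
    """Plain Bonferroni: every p-value is compared against alpha / m."""
    m = len(p_values)
    return [p <= alpha / m for p in p_values]
```

With three p-values of 0.001, 0.01, and 0.04 at alpha = 0.05, Bonferroni rejects only the first two, while Holm rejects all three.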
40 CFR 53.34 - Test procedure for methods for PM10 and Class I methods for PM2.5.
Code of Federal Regulations, 2010 CFR
2010-07-01
... simultaneous PM10 or PM2.5 measurements as necessary (see table C-4 of this subpart), each set consisting of...) in appendix A to this subpart). (f) Sequential samplers. For sequential samplers, the sampler shall be configured for the maximum number of sequential samples and shall be set for automatic collection...
ERIC Educational Resources Information Center
Passig, David
2009-01-01
Children with mental retardation have pronounced difficulties in using cognitive strategies and comprehending abstract concepts--among them, the concept of sequential time (Van-Handel, Swaab, De-Vries, & Jongmans, 2007). The perception of sequential time is generally tested by using scenarios presenting a continuum of actions. The goal of this…
A detailed description of the sequential probability ratio test for 2-IMU FDI
NASA Technical Reports Server (NTRS)
Rich, T. M.
1976-01-01
The sequential probability ratio test (SPRT) for 2-IMU FDI (inertial measuring unit failure detection/isolation) is described. The SPRT is a statistical technique for detecting and isolating soft IMU failures originally developed for the strapdown inertial reference unit. The flowchart of a subroutine incorporating the 2-IMU SPRT is included.
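The abstract only names the technique; for readers unfamiliar with Wald's SPRT, a generic single-stream sketch for Gaussian observations follows. The hypotheses, noise level, and error targets here are illustrative defaults, not the 2-IMU FDI parameters from the report:

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Minimal Wald SPRT for H0: mean = mu0 vs H1: mean = mu1 on
    Gaussian data with known sigma. Returns ('H0' | 'H1' | 'continue',
    number of samples consumed before the decision)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for one Gaussian observation
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(samples)
```

The appeal for failure detection is visible directly: grossly failed data crosses a threshold after a handful of samples, while ambiguous data keeps the test running rather than forcing a premature call.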
Coy Males and Seductive Females in the Sexually Cannibalistic Colonial Spider, Cyrtophora citricola.
Yip, Eric C; Berner-Aharon, Na'ama; Smith, Deborah R; Lubin, Yael
2016-01-01
The abundance of sperm relative to eggs selects for males that maximize their number of mates and for females that choose high quality males. However, in many species, males exercise mate choice, even when they invest little in their offspring. Sexual cannibalism may promote male choosiness by limiting the number of females a male can inseminate and by biasing the sex ratio toward females because, while females can reenter the mating pool, cannibalized males cannot. These effects may be insufficient for male choosiness to evolve, however, if males face low sequential encounter rates with females. We hypothesized that sexual cannibalism should facilitate the evolution of male choosiness in group living species because a male is likely to encounter multiple receptive females simultaneously. We tested this hypothesis in a colonial orb-weaving spider, Cyrtophora citricola, with a high rate of sexual cannibalism. We tested whether mated females would mate with multiple males, and thereby shift the operational sex ratio toward females. We also investigated whether either sex chooses mates based on nutritional state and age, and whether males choose females based on reproductive state. We found that females are readily polyandrous and exhibit no mate choice related to male feeding or age. Males courted more often when the male was older and the female was younger, and males copulated more often with well-fed females. The data show that males are choosier than females for the traits we measured, supporting our hypothesis that group living and sexual cannibalism may together promote the evolution of male mate choice.
Context-dependent preferences in starlings: linking ecology, foraging and choice.
Vasconcelos, Marco; Monteiro, Tiago; Kacelnik, Alex
2013-01-01
Foraging animals typically encounter opportunities that they either pursue or skip, but occasionally meet several alternatives simultaneously. Behavioural ecologists predict preferences using absolute properties of each option, while decision theorists focus on relative evaluations at the time of choice. We use European starlings (Sturnus vulgaris) to integrate ecological reasoning with decision models, linking and testing hypotheses for value acquisition and choice mechanism. We hypothesise that options' values depend jointly on absolute attributes, learning context, and subject's state. In simultaneous choices, preference could result either from comparing subjective values using deliberation time, or from processing each alternative independently, without relative comparisons. The combination of the value acquisition hypothesis and independent processing at choice time has been called the Sequential Choice Model. We test this model with options equated in absolute properties to exclude the possibility of preference being built at the time of choice. Starlings learned to obtain food by responding to four stimuli in two contexts. In context [AB], they encountered options A5 or B10 in random alternation; in context [CD], they met C10 or D20. Delay to food is denoted, in seconds, by the suffixes. Observed latency to respond (Li) to each option alone (our measure of value) ranked thus: LA≈LC
Simultaneous control of microorganisms and disinfection by-products by sequential chlorination.
Chen, Chao; Zhang, Xiao-Jian; He, Wen-Jie; Han, Hong-Da
2007-04-01
This study introduces a new sequential chlorination disinfection process in which free chlorine is applied for a short period and chloramine is then added. Pilot tests of this sequential chlorination were carried out in a drinking water plant. The sequential chlorination disinfection process had the same or better efficiency on microbe (including virus) inactivation compared with the free chlorine disinfection process. There seemed to be some synergetic disinfection effect between free chlorine and monochloramine because they attacked different targets. The sequential chlorination disinfection process resulted in 35.7%-77.0% less TTHM formation and 36.6%-54.8% less THAA5 formation than the free chlorination process. The poorer the water quality was, the more advantage the sequential chlorination disinfection had over the free chlorination. This process takes advantage of free chlorine's quick inactivation of microorganisms and chloramine's low disinfection by-product (DBP) yield and long-term residual effect, allowing simultaneous control of microbes and DBPs in an effective and economic way.
NASA Astrophysics Data System (ADS)
Rammage, Robert L.
1990-10-01
A device for sequentially testing the plurality of connectors in a wiring harness is disclosed. The harness is attached to the tester by means of adapter cables and a rotary switch is used to sequentially, individually test the connectors by passing a current through the connector. If the connector is unbroken, a light will flash to show it is electrically sound. The adapters allow a large number of cable configurations to be tested using a single tester configuration.
Multiple model cardinalized probability hypothesis density filter
NASA Astrophysics Data System (ADS)
Georgescu, Ramona; Willett, Peter
2011-09-01
The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.
Deaths from bacterial pneumonia during 1918-19 influenza pandemic.
Brundage, John F; Shanks, G Dennis
2008-08-01
Deaths during the 1918-19 influenza pandemic have been attributed to a hypervirulent influenza strain. Hence, preparations for the next pandemic focus almost exclusively on vaccine prevention and antiviral treatment for infections with a novel influenza strain. However, we hypothesize that infections with the pandemic strain generally caused self-limited (rarely fatal) illnesses that enabled colonizing strains of bacteria to produce highly lethal pneumonias. This sequential-infection hypothesis is consistent with characteristics of the 1918-19 pandemic, contemporaneous expert opinion, and current knowledge regarding the pathophysiologic effects of influenza viruses and their interactions with respiratory bacteria. This hypothesis suggests opportunities for prevention and treatment during the next pandemic (e.g., with bacterial vaccines and antimicrobial drugs), particularly if a pandemic strain-specific vaccine is unavailable or inaccessible to isolated, crowded, or medically underserved populations.
Human Inferences about Sequences: A Minimal Transition Probability Model
2016-01-01
The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge. PMID:28030543
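A leaky-counting sketch conveys the flavor of the transition-probability inference described above. The forgetting factor here stands in for the model's single free parameter; this is an illustration of the idea, not the authors' exact model:

```python
def transition_estimates(seq, leak=0.9):
    """Online estimate of first-order transition probabilities from a
    binary sequence, with exponential forgetting (`leak` plays the role
    of a single free parameter: 1.0 = perfect integration, < 1.0 =
    leaky counts). Counts start at 1, a flat Laplace-style prior."""
    counts = {(a, b): 1.0 for a in (0, 1) for b in (0, 1)}
    probs = []
    for prev, nxt in zip(seq, seq[1:]):
        # predicted probability of the observed transition, before update
        row = counts[(prev, 0)] + counts[(prev, 1)]
        probs.append(counts[(prev, nxt)] / row)
        for k in counts:             # decay all counts...
            counts[k] *= leak
        counts[(prev, nxt)] += 1.0   # ...then count the new transition
    return probs
```

Low predicted probability for the stimulus that actually occurs corresponds to surprise; the leak makes recent transitions weigh more, which is one way such a model can produce sequential effects even for fully random input.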
Control of Task Sequences: What is the Role of Language?
Mayr, Ulrich; Kleffner, Killian; Kikumoto, Atsushi; Redford, Melissa A.
2015-01-01
It is almost a truism that language aids serial-order control through self-cuing of upcoming sequential elements. We measured speech onset latencies as subjects performed hierarchically organized task sequences while "thinking aloud" each task label. Surprisingly, speech onset latencies and response times (RTs) were highly synchronized, a pattern that is not consistent with the hypothesis that speaking aids proactive retrieval of upcoming sequential elements during serial-order control. We also found that when instructed to do so, participants were able to speak task labels prior to presentation of response-relevant stimuli and that this substantially reduced RT signatures of retrieval—however at the cost of more sequencing errors. Thus, while proactive retrieval is possible in principle, in natural situations it seems to be prevented through a strong, "gestalt-like" tendency to synchronize speech and action. We suggest that this tendency may support context updating rather than proactive control. PMID:24274386
Explorations in statistics: hypothesis tests and P values.
Curran-Everett, Douglas
2009-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of Explorations in Statistics delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what we observe in the experiment to what we expect to see if the null hypothesis is true. The P value associated with the magnitude of that test statistic answers this question: if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got? Although statisticians continue to stress the limitations of hypothesis tests, there are two realities we must acknowledge: hypothesis tests are ingrained within science, and the simple test of a null hypothesis can be useful. As a result, it behooves us to explore the notions of hypothesis tests, test statistics, and P values.
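As a concrete companion to the abstract's question ("if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got?"), here is a minimal two-sided one-sample z test, assuming a known population standard deviation:

```python
import math

def one_sample_z_test(xs, mu0, sigma):
    """Two-sided one-sample z test with known sigma: how extreme is
    the observed mean if the null (true mean = mu0) is actually true?"""
    n = len(xs)
    mean = sum(xs) / n
    z = (mean - mu0) / (sigma / math.sqrt(n))  # the test statistic
    # P(|Z| >= |z|) under H0, from the standard normal CDF via erf
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```

For a sample of four observations averaging 0.98 with sigma = 1 and mu0 = 0, z comes out to 1.96 and the two-sided P value is about 0.05, the textbook borderline case.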
Off-line simulation inspires insight: A neurodynamics approach to efficient robot task learning.
Sousa, Emanuel; Erlhagen, Wolfram; Ferreira, Flora; Bicho, Estela
2015-12-01
There is currently an increasing demand for robots able to acquire the sequential organization of tasks from social learning interactions with ordinary people. Interactive learning-by-demonstration and communication is a promising research topic in current robotics research. However, the efficient acquisition of generalized task representations that allow the robot to adapt to different users and contexts is a major challenge. In this paper, we present a dynamic neural field (DNF) model that is inspired by the hypothesis that the nervous system uses the off-line re-activation of initial memory traces to incrementally incorporate new information into structured knowledge. To achieve this, the model combines fast activation-based learning to robustly represent sequential information from single task demonstrations with slower, weight-based learning during internal simulations to establish longer-term associations between neural populations representing individual subtasks. The efficiency of the learning process is tested in an assembly paradigm in which the humanoid robot ARoS learns to construct a toy vehicle from its parts. User demonstrations with different serial orders together with the correction of initial prediction errors allow the robot to acquire generalized task knowledge about possible serial orders and the longer term dependencies between subgoals in very few social learning interactions. This success is shown in a joint action scenario in which ARoS uses the newly acquired assembly plan to construct the toy together with a human partner. Copyright © 2015 Elsevier Ltd. All rights reserved.
Mixed Emotions and Coping: The Benefits of Secondary Emotions
Braniecka, Anna; Trzebińska, Ewa; Dowgiert, Aneta; Wytykowska, Agata
2014-01-01
The existing empirical literature suggests that during difficult situations, the concurrent experience of positive and negative affects may be ideal for ensuring successful adaptation and well-being. However, different patterns of mixed emotions may have different adaptive consequences. The present research tested the proposition that experiencing a pattern of secondary mixed emotion (i.e., secondary emotion that embrace both positive and negative affects) more greatly promotes adaptive coping than experiencing two other patterns of mixed emotional experiences: simultaneous (i.e., two emotions of opposing affects taking place at the same time) and sequential (i.e., two emotions of opposing affects switching back and forth). Support for this hypothesis was obtained from two experiments (Studies 1 and 2) and a longitudinal survey (Study 3). The results revealed that secondary mixed emotions predominate over sequential and simultaneous mixed emotional experiences in promoting adaptive coping through fostering the motivational and informative functions of emotions; this is done by providing solution-oriented actions rather than avoidance, faster decisions regarding coping strategies (Study 1), easier access to self-knowledge, and better narrative organization (Study 2). Furthermore, individuals characterized as being prone to feeling secondary mixed emotions were more resilient to stress caused by transitions than those who were characterized as being prone to feeling opposing emotions separately (Study 3). Taken together, the preliminary results indicate that the pattern of secondary mixed emotion provides individuals with a higher capacity to handle adversity than the other two patterns of mixed emotional experience. PMID:25084461
Bayesian randomized clinical trials: From fixed to adaptive design.
Yin, Guosheng; Lam, Chi Kin; Shi, Haolun
2017-08-01
Randomized controlled studies are the gold standard for phase III clinical trials. Using α-spending functions to control the overall type I error rate, group sequential methods are well established and have been dominating phase III studies. Bayesian randomized design, on the other hand, can be viewed as a complement instead of competitive approach to the frequentist methods. For the fixed Bayesian design, the hypothesis testing can be cast in the posterior probability or Bayes factor framework, which has a direct link to the frequentist type I error rate. Bayesian group sequential design relies upon Bayesian decision-theoretic approaches based on backward induction, which is often computationally intensive. Compared with the frequentist approaches, Bayesian methods have several advantages. The posterior predictive probability serves as a useful and convenient tool for trial monitoring, and can be updated at any time as the data accrue during the trial. The Bayesian decision-theoretic framework possesses a direct link to the decision making in the practical setting, and can be modeled more realistically to reflect the actual cost-benefit analysis during the drug development process. Other merits include the possibility of hierarchical modeling and the use of informative priors, which would lead to a more comprehensive utilization of information from both historical and longitudinal data. From fixed to adaptive design, we focus on Bayesian randomized controlled clinical trials and make extensive comparisons with frequentist counterparts through numerical studies. Copyright © 2017 Elsevier Inc. All rights reserved.
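The posterior-probability framing of a fixed Bayesian design can be sketched for binary outcomes as follows. The Beta(1, 1) priors and the Monte Carlo approach are our illustrative assumptions, not details from the paper:

```python
import random

def posterior_prob_superiority(x_t, n_t, x_c, n_c, draws=50_000, seed=1):
    """Posterior P(p_treatment > p_control) for binary outcomes under
    independent uniform Beta(1, 1) priors, estimated by sampling the
    two Beta posteriors. Illustrative sketch only."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(draws):
        pt = rng.betavariate(1 + x_t, 1 + n_t - x_t)  # treatment arm
        pc = rng.betavariate(1 + x_c, 1 + n_c - x_c)  # control arm
        hits += pt > pc
    return hits / draws
```

Because the posterior can be recomputed at any time as data accrue, the same quantity also serves as the kind of monitoring tool the abstract highlights, e.g. stopping when it crosses a prespecified threshold.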
Servant, Mathieu; White, Corey; Montagnini, Anna; Burle, Borís
2015-07-15
Most decisions that we make build upon multiple streams of sensory evidence and control mechanisms are needed to filter out irrelevant information. Sequential sampling models of perceptual decision making have recently been enriched by attentional mechanisms that weight sensory evidence in a dynamic and goal-directed way. However, the framework retains the longstanding hypothesis that motor activity is engaged only once a decision threshold is reached. To probe latent assumptions of these models, neurophysiological indices are needed. Therefore, we collected behavioral and EMG data in the flanker task, a standard paradigm to investigate decisions about relevance. Although the models captured response time distributions and accuracy data, EMG analyses of response agonist muscles challenged the assumption of independence between decision and motor processes. Those analyses revealed covert incorrect EMG activity ("partial error") in a fraction of trials in which the correct response was finally given, providing intermediate states of evidence accumulation and response activation at the single-trial level. We extended the models by allowing motor activity to occur before a commitment to a choice and demonstrated that the proposed framework captured the rate, latency, and EMG surface of partial errors, along with the speed of the correction process. In return, EMG data provided strong constraints to discriminate between competing models that made similar behavioral predictions. Our study opens new theoretical and methodological avenues for understanding the links among decision making, cognitive control, and motor execution in humans. Sequential sampling models of perceptual decision making assume that sensory information is accumulated until a criterion quantity of evidence is obtained, from where the decision terminates in a choice and motor activity is engaged. 
The very existence of covert incorrect EMG activity ("partial error") during the evidence accumulation process challenges this longstanding assumption. In the present work, we use partial errors to better constrain sequential sampling models at the single-trial level. Copyright © 2015 the authors.
Two-IMU FDI performance of the sequential probability ratio test during shuttle entry
NASA Technical Reports Server (NTRS)
Rich, T. M.
1976-01-01
Performance data for the sequential probability ratio test (SPRT) during shuttle entry are presented. Current modeling constants and failure thresholds are included for the full mission 3B from entry through landing trajectory. Minimum 100 percent detection/isolation failure levels and a discussion of the effects of failure direction are presented. Finally, a limited comparison of failures introduced at trajectory initiation shows that the SPRT algorithm performs slightly worse than the data tracking test.
A sampling and classification item selection approach with content balancing.
Chen, Pei-Hua
2015-03-01
Existing automated test assembly methods typically employ constrained combinatorial optimization. Constructing forms sequentially based on an optimization approach usually results in unparallel forms and requires heuristic modifications. Methods based on a random search approach have the major advantage of producing parallel forms sequentially without further adjustment. This study incorporated a flexible content-balancing element into the statistical perspective item selection method of the cell-only method (Chen et al. in Educational and Psychological Measurement, 72(6), 933-953, 2012). The new method was compared with a sequential interitem distance weighted deviation model (IID WDM) (Swanson & Stocking in Applied Psychological Measurement, 17(2), 151-166, 1993), a simultaneous IID WDM, and a big-shadow-test mixed integer programming (BST MIP) method to construct multiple parallel forms based on matching a reference form item-by-item. The results showed that the cell-only method with content balancing and the sequential and simultaneous versions of IID WDM yielded results comparable to those obtained using the BST MIP method. The cell-only method with content balancing is computationally less intensive than the sequential and simultaneous versions of IID WDM.
Numerical study on the sequential Bayesian approach for radioactive materials detection
NASA Astrophysics Data System (ADS)
Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng
2013-01-01
A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for the research of radioactive detection. Compared with the commonly adopted detection methods incorporated with statistical theory, the sequential Bayesian approach offers the advantages of shorter verification time during the analysis of spectra that contain low total counts, especially in complex radionuclide components. In this paper, a simulation experiment platform implanted with the methodology of sequential Bayesian approach was developed. Events sequences of γ-rays associating with the true parameters of a LaBr3(Ce) detector were obtained based on an events sequence generator using Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, respectively represented by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve an optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
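A toy version of a sequential detector on event interarrival times gives the gist of why a Bayesian/sequential approach shortens verification time at low count rates. The rates and thresholds below are illustrative assumptions; the Candy-style processor studied in the paper is considerably more elaborate:

```python
import math

def sequential_rate_test(interarrivals, bg=1.0, src=2.0,
                         upper=4.6, lower=-4.6):
    """Sequential test on Poisson event interarrival times dt: is the
    event rate the background rate `bg` alone, or `bg + src` (source
    present)? Each event adds ln((bg+src)/bg) - src*dt to the
    log-likelihood ratio; decide when a threshold is crossed."""
    llr = 0.0
    for n, dt in enumerate(interarrivals, start=1):
        llr += math.log((bg + src) / bg) - src * dt
        if llr >= upper:
            return "source", n
        if llr <= lower:
            return "background", n
    return "undecided", len(interarrivals)
```

Closely spaced events drive the statistic up toward a "source" decision within a few counts, sparse events drive it down toward "background", and ambiguous spacing keeps the test running, which is exactly the shorter-verification-time behavior the abstract emphasizes.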
ANAEROBIC AND AEROBIC TREATMENT OF CHLORINATED ALIPHATIC COMPOUNDS
Biological degradation of 12 chlorinated aliphatic compounds (CACs) was assessed in bench-top reactors and in serum bottle tests. Three continuously mixed daily batch-fed reactor systems were evaluated: anaerobic, aerobic, and sequential-anaerobic-aerobic (sequential). Glucose,...
Harold R. Offord
1966-01-01
Sequential sampling based on a negative binomial distribution of ribes populations required less than half the time taken by regular systematic line transect sampling in a comparison test. It gave the same control decision as the regular method in 9 of 13 field trials. A computer program that permits sequential plans to be built readily for other white pine regions is...
Visual Working Memory Is Independent of the Cortical Spacing Between Memoranda.
Harrison, William J; Bays, Paul M
2018-03-21
The sensory recruitment hypothesis states that visual short-term memory is maintained in the same visual cortical areas that initially encode a stimulus' features. Although it is well established that the distance between features in visual cortex determines their visibility, a limitation known as crowding, it is unknown whether short-term memory is similarly constrained by the cortical spacing of memory items. Here, we investigated whether the cortical spacing between sequentially presented memoranda affects the fidelity of memory in humans (of both sexes). In a first experiment, we varied cortical spacing by taking advantage of the log-scaling of visual cortex with eccentricity, presenting memoranda in peripheral vision sequentially along either the radial or tangential visual axis with respect to the fovea. In a second experiment, we presented memoranda sequentially either within or beyond the critical spacing of visual crowding, a distance within which visual features cannot be perceptually distinguished due to their nearby cortical representations. In both experiments and across multiple measures, we found strong evidence that the ability to maintain visual features in memory is unaffected by cortical spacing. These results indicate that the neural architecture underpinning working memory has properties inconsistent with the known behavior of sensory neurons in visual cortex. Instead, the dissociation between perceptual and memory representations supports a role of higher cortical areas such as posterior parietal or prefrontal regions or may involve an as yet unspecified mechanism in visual cortex in which stimulus features are bound to their temporal order. SIGNIFICANCE STATEMENT Although much is known about the resolution with which we can remember visual objects, the cortical representation of items held in short-term memory remains contentious. 
A popular hypothesis suggests that memory of visual features is maintained via the recruitment of the same neural architecture in sensory cortex that encodes stimuli. We investigated this claim by manipulating the spacing in visual cortex between sequentially presented memoranda such that some items shared cortical representations more than others while preventing perceptual interference between stimuli. We found clear evidence that short-term memory is independent of the intracortical spacing of memoranda, revealing a dissociation between perceptual and memory representations. Our data indicate that working memory relies on different neural mechanisms from sensory perception. Copyright © 2018 Harrison and Bays.
Ivanovski, Ivan; Ješić, Miloš; Ivanovski, Ana; Garavelli, Livia; Ivanovski, Petar
2017-11-28
The underlying pathophysiology of liver dysfunction in urea cycle disorders (UCDs) is still largely elusive. There is some evidence that the accumulation of urea cycle (UC) intermediates is toxic to hepatocyte mitochondria. It is possible that liver injury is directly caused by the toxicity of ammonia. The rarity of UCDs, the lack of routine iron-level testing in these patients, superficial knowledge of the UC, and an underestimation of the metabolic role of fumaric acid are the main reasons the mechanism of liver injury in patients suffering from UCDs remains poorly understood. Owing to our routine clinical practice of screening for iron overload in severely ill neonates, with a focus on newborns suffering from acute liver failure, we report a case of citrullinemia with neonatal liver failure and high blood markers of iron overload. We hypothesize that the key lies in decreased or deficient fumaric acid production in the course of the UC in UCDs, which causes several sequentially intertwined metabolic disturbances with the final result of liver iron overload. The presented hypothesis could easily be tested by examining patients suffering from UCDs for liver iron overload. This could readily be performed in countries with a large population and a comprehensive national register for inborn errors of metabolism. Provided the hypothesis is correct, neonatal liver damage in patients with UCDs could be prevented by supplementation of pregnant women with fumaric or succinic acid, prepared in the form of iron supplementation pills. After birth, liver damage in patients with UCDs could be prevented by supplementing these patients with zinc fumarate or zinc succinylate as well.
Berggren, Elisabet; White, Andrew; Ouedraogo, Gladys; Paini, Alicia; Richarz, Andrea-Nicole; Bois, Frederic Y; Exner, Thomas; Leite, Sofia; Grunsven, Leo A van; Worth, Andrew; Mahony, Catherine
2017-11-01
We describe and illustrate a workflow for chemical safety assessment that completely avoids animal testing. The workflow, which was developed within the SEURAT-1 initiative, is designed to be applicable to cosmetic ingredients as well as to other types of chemicals, e.g. active ingredients in plant protection products, biocides or pharmaceuticals. The aim of this work was to develop a workflow to assess chemical safety without relying on any animal testing, but instead constructing a hypothesis based on existing data, in silico modelling, biokinetic considerations and then targeted non-animal testing. For illustrative purposes, we consider a hypothetical new ingredient x as a new component in a body lotion formulation. The workflow is divided into tiers in which points of departure are established through in vitro testing and in silico prediction, as the basis for estimating a safe external dose in a repeated use scenario. The workflow includes a series of possible exit (decision) points, with increasing levels of confidence, based on the sequential application of the Threshold of Toxicological Concern (TTC) approach, read-across, followed by an "ab initio" assessment, in which chemical safety is determined entirely by new in vitro testing and in vitro to in vivo extrapolation by means of mathematical modelling. We believe that this workflow could be applied as a tool to inform targeted and toxicologically relevant in vitro testing, where necessary, and to gain confidence in safety decision making without the need for animal testing.
Test Generation for Highly Sequential Circuits
1989-08-01
Abhijit Ghosh, Srinivas Devadas, and A. Richard Newton. Abstract: We address the problem of generating test sequences for stuck-at faults in highly sequential circuits. Affiliations: Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, CA 94720; Devadas: Department of Electrical Engineering and Computer… [affiliation truncated; remainder of scanned abstract illegible].
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.
The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment will utilize a Bayesian belief propagation procedure and will include automated updating of baseline signature data for the satellite, while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground- or space-based sensor as part of its metrics mission. Changes in the satellite features will be reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterizing fine features of the satellite. The tasking is designed to maximize new information with the fewest photometry data points to be collected during the synoptic search by a ground- or space-based sensor. Its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses regarding the fine features of the satellite. The optimal observation conditions are then ordered so as to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground- or space-based sensor.
Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging. Phan Dao, Air Force Research Laboratory/RVB. By characterizing geostationary satellites based on photometry and color photometry, analysts can evaluate satellite operational status and affirm its true identity. The process of ingesting photometry data and deriving satellite physical characteristics can be directed by analysts in a batch mode, meaning using a batch of recent data, or by automated algorithms in an on-line mode in which the assessment is updated with each new data point. Tools used for detecting change to a satellite's status or identity, whether operated with a human in the loop or as automated algorithms, are generally not built to detect with minimum latency and traceable confidence intervals. To alleviate those deficiencies, we investigate the use of Hidden Markov Models (HMM), in a Bayesian Network framework, to infer the hidden state (changed or unchanged) of a three-axis stabilized geostationary satellite using broadband and color photometry. Unlike frequentist statistics, which exploit only the stationary statistics of the observables in the database, HMM also exploits the temporal pattern of the observables. The algorithm also operates in a "learning" mode to gradually evolve the HMM and accommodate natural changes such as the seasonal dependence of a GEO satellite's light curve. Our technique is designed to operate with missing color data. The version that ingests both panchromatic and color data can accommodate gaps in color photometry data. That attribute is important because, while color indices, e.g. Johnson R and B, enhance the belief (probability) of a hidden state, in real-world situations flux data is collected sporadically in an untasked collect, and color data is limited and sometimes absent. Fluxes are measured with experimental error, whose effect on the algorithm will be studied.
Photometry data in the AFRL's Geo Color Photometry Catalog and Geo Observations with Latitudinal Diversity Simultaneously (GOLDS) data sets are used to simulate a wide variety of operational changes and identity cross-tags. The algorithm is tested against simulated sequences of observed magnitudes, mimicking the cadence of untasked SSN and other ground sensors, occasional operational changes, and possible occurrences of cross-tags of in-cluster satellites. We would like to show that the on-line algorithm can detect change, sometimes right after the first post-change data point is analyzed, for zero latency. We also want to show the unsupervised "learning" capability that allows the HMM to evolve with time without user assistance. For example, the users are not required to "label" the true state of the data points.
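The forward-filtering step at the core of such a two-state HMM change detector can be sketched as follows. This is only an illustrative model with hypothetical Gaussian emission parameters, not the AFRL algorithm itself; here "changed" is treated as an absorbing state, and each new data point updates the posterior probability of change.

```python
import math

def hmm_change_filter(observations, p_change=0.01,
                      mu=(0.0, 1.0), sigma=(1.0, 1.0)):
    """Forward filtering for a two-state HMM (0 = unchanged, 1 = changed).
    'changed' is absorbing; emissions are Gaussian with per-state (mu, sigma).
    Returns P(changed | data so far) after each observation."""
    def gauss(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    belief = [1.0, 0.0]  # start in the 'unchanged' state
    posteriors = []
    for x in observations:
        # predict: 'unchanged' may switch; 'changed' stays changed
        pred = [belief[0] * (1 - p_change),
                belief[0] * p_change + belief[1]]
        # update with emission likelihoods, then normalize
        upd = [pred[0] * gauss(x, mu[0], sigma[0]),
               pred[1] * gauss(x, mu[1], sigma[1])]
        z = upd[0] + upd[1]
        belief = [u / z for u in upd]
        posteriors.append(belief[1])
    return posteriors
```

On a sequence whose mean jumps from one emission regime to the other, the posterior probability of "changed" can rise above 0.99 on the very first post-change point, which is the zero-latency behavior described in the abstract.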
Optimal sequential measurements for bipartite state discrimination
NASA Astrophysics Data System (ADS)
Croke, Sarah; Barnett, Stephen M.; Weir, Graeme
2017-05-01
State discrimination is a useful test problem with which to clarify the power and limitations of different classes of measurement. We consider the problem of discriminating between given states of a bipartite quantum system via sequential measurement of the subsystems, with classical feed-forward of measurement results. Our aim is to understand when sequential measurements, which are relatively easy to implement experimentally, perform as well, or almost as well, as optimal joint measurements, which are in general more technologically challenging. We construct conditions that the optimal sequential measurement must satisfy, analogous to the well-known Helstrom conditions for minimum error discrimination in the unrestricted case. We give several examples and compare the optimal probability of correctly identifying the state via global versus sequential measurement strategies.
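The joint-measurement benchmark against which the sequential strategies are compared has a closed form for two pure states: the Helstrom minimum error probability depends only on the priors and the magnitude of the states' overlap. A minimal sketch (the sequential-measurement optimality conditions derived in the paper are not reproduced here):

```python
import math

def helstrom_error(overlap, prior=0.5):
    """Minimum error probability for discriminating two pure quantum states
    with inner-product magnitude `overlap`, prepared with prior probabilities
    (prior, 1 - prior). Helstrom bound for pure states."""
    return 0.5 * (1.0 - math.sqrt(1.0 - 4.0 * prior * (1 - prior) * overlap ** 2))
```

Orthogonal states (overlap 0) can be discriminated perfectly; identical states (overlap 1) can only be guessed, giving error 0.5 at equal priors.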
1981-12-01
Sequential Testing of Hypotheses Concerning the Reliability of a System Modeled by a Two-Parameter Weibull Distribution. Thesis, AFIT/GOR/MA/81D-8, Philippe A. Lussier, 2nd Lt, USAF. Presented to the Faculty of the School of Engineering of the Air Force Institute of Technology. [Remainder of scanned front matter illegible.]
McKerr, Caoimhe; Adak, Goutam K.; Nichols, Gordon; Gorton, Russell; Chalmers, Rachel M.; Kafatos, George; Cosford, Paul; Charlett, Andre; Reacher, Mark; Pollock, Kevin G.; Alexander, Claire L.; Morton, Stephen
2015-01-01
Background We report a widespread foodborne outbreak of Cryptosporidium parvum in England and Scotland in May 2012. Cases were more common in female adults, and had no history of foreign travel. Over 300 excess cases were identified during the period of the outbreak. Speciation and microbiological typing revealed the outbreak strain to be C. parvum gp60 subtype IIaA15G2R1. Methods Hypothesis generation questionnaires were administered and an unmatched case control study was undertaken to test the hypotheses raised. Cases and controls were interviewed by telephone. Controls were selected using sequential digit dialling. Information was gathered on demographics, foods consumed and retailers where foods were purchased. Results Seventy-four laboratory confirmed cases and 74 controls were included in analyses. Infection was found to be strongly associated with the consumption of pre-cut mixed salad leaves sold by a single retailer. This is the largest documented outbreak of cryptosporidiosis attributed to a food vehicle. PMID:26017538
Marsella, Pasquale; Scorpecci, Alessandro; Vecchiato, Giovanni; Colosimo, Alfredo; Maglione, Anton Giulio; Babiloni, Fabio
2014-05-01
To investigate by means of non-invasive neuroelectrical imaging the differences in the perceived pleasantness of music between children with cochlear implants (CI) and normal-hearing (NH) children. 5 NH children and 5 children who received a sequential bilateral CI were assessed by means of High-Resolution EEG with Source Reconstruction as they watched a musical cartoon. Implanted children were tested before and after the second implant. For each subject the scalp Power Spectral Density was calculated in order to investigate the EEG alpha asymmetry. The scalp topographic distribution of the EEG power spectrum in the alpha band was different in children using one CI as compared to NH children (see figure). With two CIs the cortical activation pattern changed significantly, becoming more similar to the one observed in NH children. The findings support the hypothesis that bilateral CI users have a closer-to-normal perception of the pleasantness of music than unilaterally implanted children.
Prado-Gutierrez, Pavel; Castro-Fariñas, Anisleidy; Morgado-Rodriguez, Lisbet; Velarde-Reyes, Ernesto; Martínez, Agustín D.; Martínez-Montes, Eduardo
2015-01-01
Generation of the auditory steady state responses (ASSR) is commonly explained by the linear combination of random background noise activity and the stationary response. Based on this model, the decrease in amplitude that occurs over the sequential averaging of epochs of the raw data has been exclusively linked to the cancelation of noise. Nevertheless, this behavior might also reflect the non-stationary response of the ASSR generators. We tested this hypothesis by characterizing the ASSR time course in rats at different auditory maturational stages. ASSR were evoked by 8-kHz tones of different supra-threshold intensities, modulated in amplitude at 115 Hz. Results show that the ASSR amplitude habituated to the sustained stimulation and that dishabituation occurred when deviant stimuli were presented. ASSR habituation increased as animals became adults, suggesting that the ability to filter out acoustic stimuli carrying no relevant temporal information increased with age. Results are discussed in terms of the current model of ASSR generation and the analysis procedures. They might have implications for audiometric tests designed to assess hearing in subjects who cannot provide reliable results in psychophysical trials. PMID:26557360
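Under the stationary-response model described above, the amplitude of the epoch average converges to the response amplitude as residual noise cancels at roughly 1/sqrt(N); a habituating (non-stationary) response would depress the average below that prediction. A toy simulation under the stationary model, with a synthetic sinusoidal "response" and Gaussian noise (all parameters hypothetical):

```python
import math
import random

def averaged_amplitude(epochs):
    """RMS amplitude of the point-by-point average across epochs."""
    n = len(epochs)
    avg = [sum(col) / n for col in zip(*epochs)]
    return math.sqrt(sum(v * v for v in avg) / len(avg))

random.seed(1)
T = 200
# stationary 'response': 5 full sine cycles, RMS = 1/sqrt(2) ~ 0.707
signal = [math.sin(2 * math.pi * 5 * t / T) for t in range(T)]
# each epoch = stationary response + independent Gaussian noise (sd = 2)
epochs = [[s + random.gauss(0.0, 2.0) for s in signal] for _ in range(400)]

a10 = averaged_amplitude(epochs[:10])    # noisy estimate with few epochs
a400 = averaged_amplitude(epochs)        # converges toward the signal RMS
```

With 400 epochs the averaged amplitude approaches the true response RMS, so any systematic decline beyond this noise-cancelation curve is evidence for generator non-stationarity, which is the hypothesis the abstract tests.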
Seeking health information on the web: positive hypothesis testing.
Kayhan, Varol Onur
2013-04-01
The goal of this study is to investigate positive hypothesis testing among consumers of health information when they search the Web. After demonstrating the extent of positive hypothesis testing using Experiment 1, we conduct Experiment 2 to test the effectiveness of two debiasing techniques. A total of 60 undergraduate students searched a tightly controlled online database developed by the authors to test the validity of a hypothesis. The database had four abstracts that confirmed the hypothesis and three abstracts that disconfirmed it. Findings of Experiment 1 showed that the majority of participants (85%) exhibited positive hypothesis testing. In Experiment 2, we found that the recommendation technique was not effective in reducing positive hypothesis testing, since none of the participants assigned to this server could retrieve disconfirming evidence. Experiment 2 also showed that the incorporation technique successfully reduced positive hypothesis testing, since 75% of the participants could retrieve disconfirming evidence. Positive hypothesis testing on the Web is an understudied topic. More studies are needed to validate the effectiveness of the debiasing techniques discussed in this study and to develop new techniques. Search engine developers should consider developing new options for users so that both confirming and disconfirming evidence can be presented in search results as users test hypotheses using search engines. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Palmer, Matthew A; Brewer, Neil
2012-06-01
When compared with simultaneous lineup presentation, sequential presentation has been shown to reduce false identifications to a greater extent than it reduces correct identifications. However, there has been much debate about whether this difference in identification performance represents improved discriminability or more conservative responding. In this research, data from 22 experiments that compared sequential and simultaneous lineups were analyzed using a compound signal-detection model, which is specifically designed to describe decision-making performance on tasks such as eyewitness identification tests. Sequential (cf. simultaneous) presentation did not influence discriminability, but produced a conservative shift in response bias that resulted in less-biased choosing for sequential than simultaneous lineups. These results inform understanding of the effects of lineup presentation mode on eyewitness identification decisions.
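The discriminability-versus-response-bias distinction at the heart of the lineup result can be illustrated with the standard equal-variance signal-detection measures d' (discriminability) and criterion c. Note this is a simpler model than the compound signal-detection model the paper actually fits; it is shown only to make the two quantities concrete:

```python
from statistics import NormalDist

def sdt_measures(hit_rate, false_alarm_rate):
    """Equal-variance signal detection theory: discriminability d' and
    criterion c from hit and false-alarm rates (rates strictly in (0, 1))."""
    z = NormalDist().inv_cdf            # inverse standard normal CDF
    d_prime = z(hit_rate) - z(false_alarm_rate)
    c = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, c
```

A conservative shift of the kind reported for sequential lineups lowers both hit and false-alarm rates together, raising c while leaving d' roughly unchanged; improved discriminability would instead raise d'.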
Automated ILA design for synchronous sequential circuits
NASA Technical Reports Server (NTRS)
Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.
1991-01-01
An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1-micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.
Tai, Yiping; McBride, Murray B; Li, Zhian
2013-03-30
In the present study, we evaluated a commonly employed modified Bureau Communautaire de Référence (BCR test) 3-step sequential extraction procedure for its ability to distinguish forms of solid-phase Pb in soils with different sources and histories of contamination. When the modified BCR test was applied to mineral soils spiked with three forms of Pb (pyromorphite, hydrocerussite and nitrate salt), the added Pb was highly susceptible to dissolution in the operationally-defined "reducible" or "oxide" fraction regardless of form. When three different materials (mineral soil, organic soil and goethite) were spiked with soluble Pb nitrate, the BCR sequential extraction profiles revealed that soil organic matter was capable of retaining Pb in more stable and acid-resistant forms than silicate clay minerals or goethite. However, the BCR sequential extraction for field-collected soils with known and different sources of Pb contamination was not sufficiently discriminatory in the dissolution of soil Pb phases to allow soil Pb forms to be "fingerprinted" by this method. It is concluded that standard sequential extraction procedures are probably not very useful in predicting lability and bioavailability of Pb in contaminated soils. Copyright © 2013 Elsevier B.V. All rights reserved.
Toombs, Elaine; Unruh, Anita; McGrath, Patrick
2018-01-01
This study aimed to assess the Parent-Adolescent Communication Toolkit (PACT), an online intervention designed to help improve parents' communication with their adolescents. Participant preferences for two module delivery systems (sequential and unrestricted module access) were identified. Usability assessment of the PACT intervention was completed using pre-test and post-test comparisons. Usability data, including participant completion and satisfaction ratings, were examined. Parents (N = 18) of adolescents were randomized to a sequential or unrestricted module access group. Parent participants completed pre-test measures, the PACT intervention, and post-test measures. Participants provided feedback on the intervention to improve the modules and provided usability ratings. Adolescent pre-test and post-test ratings were evaluated. Usability ratings were high and parent feedback was positive. The sequential module access group rated the intervention content higher and completed more content than the unrestricted module access group, indicating support for the sequential access design. Parents' mean post-test communication scores were significantly higher (p < .05) than pre-test scores. No significant differences were detected for adolescent participants. Findings suggest that the Parent-Adolescent Communication Toolkit has potential to improve parent-adolescent communication, but further effectiveness assessment is required.
Herbage intake of dairy cows in mixed sequential grazing with breeding ewes as followers.
Jiménez-Rosales, Juan Daniel; Améndola-Massiotti, Ricardo Daniel; Burgueño-Ferreira, Juan Andrés; Ramírez-Valverde, Rodolfo; Topete-Pelayo, Pedro; Huerta-Bravo, Maximino
2018-03-01
This study aimed to evaluate the hypothesis that mixed sequential grazing of dairy cows and breeding ewes is beneficial. During the seasons of spring-summer 2013 and autumn-winter 2013-2014, 12 (spring-summer) and 16 (autumn-winter) Holstein Friesian cows and 24 gestating (spring-summer) and lactating (autumn-winter) Pelibuey ewes grazed on six (spring-summer) and nine (autumn-winter) paddocks of alfalfa and orchard grass mixed pastures. The treatments "single species cow grazing" (CowG) and "mixed sequential grazing with ewes as followers of cows" (MixG) were evaluated under a completely randomized design with two replicates per paddock. Herbage mass on offer (HO) and residual herbage mass (RH) were estimated by cutting samples. The estimate of herbage intake (HI) of cows was based on the use of internal and external markers; the apparent HI of ewes was calculated as the difference between HO (the RH of cows) and RH. Even though HO was higher in CowG, the HI of cows was higher in MixG during spring-summer and similar in both treatments during autumn-winter. This implies that, in MixG, the effects on cow HI of a higher alfalfa proportion and a higher herbage accumulation rate, both evolving from the lower residual herbage mass in the previous cycle, counteracted the effect of the higher HO in CowG. The HI of ewes was sufficient to enable satisfactory performance as breeding ewes. Thus, the benefits of mixed sequential grazing arose from higher herbage accumulation, positive changes in botanical composition, and the achievement of sheep production without negative effects on the herbage intake of cows.
Expert system for online surveillance of nuclear reactor coolant pumps
Gross, Kenny C.; Singer, Ralph M.; Humenik, Keith E.
1993-01-01
An expert system is described for online surveillance of nuclear reactor coolant pumps. This system provides a means for early detection of pump or sensor degradation. Degradation is determined through the use of a statistical analysis technique, the sequential probability ratio test (SPRT), applied to information from several sensors that are responsive to differing physical parameters. The results of sequential testing of the data provide the operator with an early warning of possible sensor or pump failure.
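A minimal sketch of a sequential probability ratio test for a shift in a sensor's mean reading, of the kind such a surveillance system might run per sensor. It assumes Gaussian noise with known variance; the hypothesized means, variance, and error rates here are illustrative, not the patented system's values:

```python
import math

def sprt_mean_shift(samples, mu0=0.0, mu1=1.0, sigma=1.0,
                    alpha=0.01, beta=0.01):
    """Wald SPRT for H0: mean = mu0 vs H1: mean = mu1 with known sigma.
    alpha/beta are the target type I/II error rates.
    Returns ('H0' | 'H1' | 'continue', number of samples used)."""
    upper = math.log((1 - beta) / alpha)   # cross -> accept H1 (degraded)
    lower = math.log(beta / (1 - alpha))   # cross -> accept H0 (healthy)
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Gaussian log-likelihood-ratio increment for one observation
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma ** 2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(samples)
```

The appeal for online surveillance is that the test is incremental: each new reading updates a single running log-likelihood ratio, and a decision is announced as soon as either threshold is crossed, typically well before a fixed-sample test of the same error rates would conclude.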
Technical Reports Prepared Under Contract N00014-76-C-0475.
1987-05-29
Partial index of technical reports (scanned listing; many entries illegible): No. 264, "Approximations to Densities in Geometric Probability," H. Solomon and M.A. Stephens, 10/27/78; No. 265, "Sequential …" [entry truncated]; "… Certain Multivariate Normal Probabilities," S. Iyengar, 8/12/82; No. 323, "EDF Statistics for Testing for the Gamma Distribution with …," M.A. Stephens, 8/13/82; No. 360, "Random Sequential Coding by Hamming Distance," Yoshiaki Itoh and Herbert Solomon, 07-11-85; No. 361, "Transforming Censored Samples and Testing Fit …" [listing truncated].
Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis; Gold, Dara
2013-01-01
We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
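The paper's specific likelihood ratio for conjunction predictions is not reproduced here. As a generic illustration of the Wald SPRT machinery it builds on, the following sketch tests a Bernoulli rate on a sequence of binary outcomes (e.g., whether each prediction's collision-probability estimate exceeds a threshold); the rates p0 and p1 are hypothetical:

```python
import math

def bernoulli_sprt(observations, p0=0.001, p1=0.01,
                   alpha=0.05, beta=0.05):
    """Wald SPRT for a Bernoulli rate: H0: p = p0 vs H1: p = p1.
    observations is an iterable of 0/1 outcomes.
    Returns ('H0' | 'H1' | 'continue', number of observations used)."""
    upper = math.log((1 - beta) / alpha)   # cross -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross -> accept H0
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(observations)
```

Note the asymmetry typical of rare-event settings: a couple of positive outcomes push the test to H1 almost immediately, while accepting H0 requires a long run of negatives, since each contributes only a small log-likelihood decrement.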
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirley, C.; Pohlmann, K.; Andricevic, R.
1996-09-01
Geological and geophysical data are used with the sequential indicator simulation algorithm of Gomez-Hernandez and Srivastava to produce multiple, equiprobable, three-dimensional maps of informal hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site. The upper 50 percent of the Tertiary volcanic lithostratigraphic column comprises the study volume. Semivariograms are modeled from indicator-transformed geophysical tool signals. Each equiprobable study volume is subdivided into discrete classes using the ISIM3D implementation of the sequential indicator simulation algorithm. Hydraulic conductivity is assigned within each class using the sequential Gaussian simulation method of Deutsch and Journel. The resulting maps show the contiguity of high and low hydraulic conductivity regions.
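The sequential simulation idea, visiting grid nodes in random order and drawing each value from a distribution conditioned on previously simulated nodes, can be sketched in a deliberately simplified 1-D form. This toy version conditions each node on only its single nearest already-simulated node under an exponential covariance; full implementations such as ISIM3D or SGSIM krige against many neighbors and honor conditioning data:

```python
import math
import random

def sgs_1d(n=100, corr_len=10.0, seed=0):
    """Toy 1-D sequential Gaussian simulation on a regular grid.
    Each node is drawn conditional on the single nearest previously
    simulated node, with correlation rho = exp(-distance / corr_len).
    Returns one equiprobable realization (standard-normal field)."""
    random.seed(seed)
    path = list(range(n))
    random.shuffle(path)              # random visiting order
    values = {}
    for i in path:
        if not values:
            mean, var = 0.0, 1.0      # unconditional draw for the first node
        else:
            j = min(values, key=lambda k: abs(k - i))
            rho = math.exp(-abs(i - j) / corr_len)
            mean, var = rho * values[j], 1.0 - rho ** 2
        values[i] = random.gauss(mean, math.sqrt(var))
    return [values[i] for i in range(n)]
```

Re-running with different seeds yields the multiple equiprobable realizations the abstract refers to; spatial continuity is inherited from the covariance model, so nearby nodes end up positively correlated.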
Robust inference for group sequential trials.
Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei
2017-03-01
For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time-to-event trials although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.
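One classical way to combine P values from multiple test statistics is Fisher's method; the specific combination procedures evaluated in the article may differ, so this is only an illustrative sketch. It uses the closed-form chi-square tail probability for even degrees of freedom, so no statistical library is needed:

```python
import math

def fisher_combine(pvalues):
    """Fisher's method for combining k independent p-values.
    Under H0, X = -2 * sum(ln p_i) ~ chi-square with 2k df.
    Returns the combined p-value P(chi2_{2k} >= X), using the
    closed-form survival function for even degrees of freedom."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    # sf(x; 2k) = exp(-x/2) * sum_{j=0}^{k-1} (x/2)^j / j!
    term, total = 1.0, 1.0
    for j in range(1, k):
        term *= half / j
        total += term
    return math.exp(-half) * total
```

With a single p-value the method is the identity, and two moderately small p-values combine into stronger evidence than either alone, which is the diversification effect the article exploits.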
Oury, Vincent; Tardieu, François; Turc, Olivier
2016-06-01
Grain abortion allows the production of at least a few viable seeds under water deficit but causes major yield loss. It is maximum for water deficits occurring during flowering in maize (Zea mays). We have tested the hypothesis that abortion is linked to the differential development of ovary cohorts along the ear and to the timing of silk emergence. Ovary volume and silk growth were followed over 25 to 30 d under four levels of water deficit and in four hybrids in two experiments. A position-time model allowed characterizing the development of ovary cohorts and their silk emergence. Silk growth rate decreased in water deficit and stopped 2 to 3 d after first silk emergence, simultaneously for all ovary cohorts, versus 7 to 8 d in well-watered plants. Abortion rate in different treatments and positions on the ear was not associated with ovary growth rate. It was accounted for by the superposition of (1) the sequential emergence of silks originating from ovaries of different cohorts along the ear with (2) one event occurring on a single day, the simultaneous silk growth arrest. Abortion occurred in the youngest ovaries whose silks did not emerge 2 d before silk arrest. This mechanism accounted for more than 90% of drought-related abortion in our experiments. It resembles the control of abortion in a large range of species and inflorescence architectures. This finding has large consequences for breeding drought-tolerant maize and for modeling grain yields in water deficit. © 2016 American Society of Plant Biologists. All Rights Reserved.
The Estrous Cycle of the Ewe Is Resistant to Disruption by Repeated, Acute Psychosocial Stress1
Wagenmaker, Elizabeth R.; Breen, Kellie M.; Oakley, Amy E.; Tilbrook, Alan J.; Karsch, Fred J.
2010-01-01
Five experiments were conducted to test the hypothesis that psychosocial stress interferes with the estrous cycle of sheep. In experiment 1, ewes were repeatedly isolated during the follicular phase. Timing, amplitude, and duration of the preovulatory luteinizing hormone (LH) surge were not affected. In experiment 2, follicular-phase ewes were subjected twice to a “layered stress” paradigm consisting of sequential, hourly application of isolation, restraint, blindfold, and predator cues. This reduced the LH pulse amplitude but did not affect the LH surge. In experiment 3, different acute stressors were given sequentially within the follicular phase: food denial plus unfamiliar noises and forced exercise, layered stress, exercise around midnight, and transportation. This, too, did not affect the LH surge. In experiment 4, variable acute psychosocial stress was given every 1–2 days for two entire estrous cycles; this did not disrupt any parameter of the cycle monitored. Lastly, experiment 5 examined whether the psychosocial stress paradigms of experiment 4 would disrupt the cycle and estrous behavior if sheep were metabolically stressed by chronic food restriction. Thirty percent of the food-restricted ewes exhibited deterioration of estrous cycle parameters followed by cessation of cycles and failure to express estrous behavior. However, disruption was not more evident in ewes that also encountered psychosocial stress. Collectively, these findings indicate the estrous cycle of sheep is remarkably resistant to disruption by acute bouts of psychosocial stress applied intermittently during either a single follicular phase or repeatedly over two estrous cycles. PMID:20164438
Loomis, Jack M; Klatzky, Roberta L; McHugh, Brendan; Giudice, Nicholas A
2012-08-01
Spatial working memory can maintain representations from vision, hearing, and touch, representations referred to here as spatial images. The present experiment addressed whether spatial images from vision and hearing that are simultaneously present within working memory retain modality-specific tags or are amodal. Observers were presented with short sequences of targets varying in angular direction, with the targets in a given sequence being all auditory, all visual, or a sequential mixture of the two. On two thirds of the trials, one of the locations was repeated, and observers had to respond as quickly as possible when detecting this repetition. Ancillary detection and localization tasks confirmed that the visual and auditory targets were perceptually comparable. Response latencies in the working memory task showed small but reliable costs in performance on trials involving a sequential mixture of auditory and visual targets, as compared with trials of pure vision or pure audition. These deficits were statistically reliable only for trials on which the modalities of the matching location switched from the penultimate to the final target in the sequence, indicating a switching cost. The switching cost for the pair in immediate succession means that the spatial images representing the target locations retain features of the visual or auditory representations from which they were derived. However, there was no reliable evidence of a performance cost for mixed modalities in the matching pair when the second of the two did not immediately follow the first, suggesting that more enduring spatial images in working memory may be amodal.
SEQUENTIAL EXTRACTIONS FOR PARTITIONING OF ARSENIC ON HYDROUS IRON OXIDES AND IRON SULFIDES
The objective of this study was to use model solids to test solutions designed to extract arsenic from relatively labile solid phase fractions. The use of sequential extractions provides analytical constraints on the identification of mineral phases that control arsenic mobility...
New methods of testing nonlinear hypothesis using iterative NLLS estimator
NASA Astrophysics Data System (ADS)
Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.
2017-11-01
This research paper discusses methods of testing a nonlinear hypothesis using the iterative Nonlinear Least Squares (NLLS) estimator, a method explained by Takeshi Amemiya [1]. In the present paper, a modified Wald test statistic due to Engle, Robert [6] is proposed for testing a nonlinear hypothesis with the iterative NLLS estimator. An alternative method for testing a nonlinear hypothesis, using an iterative NLLS estimator based on nonlinear studentized residuals, has also been proposed, and an innovative method of testing a nonlinear hypothesis using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator established by Jennrich [8]. The main purpose of this paper is to provide methods of testing a nonlinear hypothesis using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustrations. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
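To make the ingredients concrete, here is a minimal, illustrative sketch (not taken from any of the cited papers) of a one-parameter Gauss-Newton NLLS fit followed by a Wald test of a nonlinear restriction, for a hypothetical model y = exp(-b·x) + error with made-up data:

```python
import math

# Hypothetical model y = exp(-b * x) + noise; the data below are made up.
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [1.02, 0.61, 0.36, 0.23, 0.13, 0.09, 0.05]

def gauss_newton(b, n_iter=50):
    """One-parameter Gauss-Newton iteration for the NLLS estimate of b."""
    for _ in range(n_iter):
        # residuals r_i = y_i - f(x_i; b); Jacobian J_i = df/db = -x * exp(-b*x)
        r = [y - math.exp(-b * x) for x, y in zip(xs, ys)]
        j = [-x * math.exp(-b * x) for x in xs]
        step = sum(ji * ri for ji, ri in zip(j, r)) / sum(ji * ji for ji in j)
        b += step
        if abs(step) < 1e-12:
            break
    return b

b_hat = gauss_newton(1.0)

# Wald statistic for the nonlinear restriction H0: g(b) = b**2 - 1 = 0,
# W = g(b)^2 / (g'(b)^2 * var(b)), with var(b) = s^2 / (J'J).
n = len(xs)
rss = sum((y - math.exp(-b_hat * x)) ** 2 for x, y in zip(xs, ys))
s2 = rss / (n - 1)
jtj = sum((x * math.exp(-b_hat * x)) ** 2 for x in xs)
var_b = s2 / jtj
g, gprime = b_hat ** 2 - 1.0, 2.0 * b_hat
wald = g * g / (gprime * gprime * var_b)
print(round(b_hat, 3), round(wald, 3))
```

With these data the estimate lands near b = 1, so the Wald statistic for H0: b² = 1 stays far below the 5% chi-square critical value of 3.84.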
Sequential megafaunal collapse in the North Pacific Ocean: An ongoing legacy of industrial whaling?
Springer, A.M.; Estes, J.A.; Van Vliet, Gus B.; Williams, T.M.; Doak, D.F.; Danner, E.M.; Forney, K.A.; Pfister, B.
2003-01-01
Populations of seals, sea lions, and sea otters have sequentially collapsed over large areas of the northern North Pacific Ocean and southern Bering Sea during the last several decades. A bottom-up nutritional limitation mechanism induced by physical oceanographic change or competition with fisheries was long thought to be largely responsible for these declines. The current weight of evidence is more consistent with top-down forcing. Increased predation by killer whales probably drove the sea otter collapse and may have been responsible for the earlier pinniped declines as well. We propose that decimation of the great whales by post-World War II industrial whaling caused the great whales' foremost natural predators, killer whales, to begin feeding more intensively on the smaller marine mammals, thus "fishing-down" this element of the marine food web. The timing of these events, information on the abundance, diet, and foraging behavior of both predators and prey, and feasibility analyses based on demographic and energetic modeling are all consistent with this hypothesis. PMID:14526101
Grossman, R A
1995-09-01
The purpose of this study was to determine whether women can discriminate between more and less effective paracervical block techniques applied to opposite sides of the cervix. If this discrimination could be made, it would be possible to compare different techniques and thus improve the quality of paracervical anesthesia. Two milliliters of local anesthetic was applied to one side and 6 ml to the other side of volunteers' cervices before cervical dilation. Statistical examination was by sequential analysis. The study was stopped after 47 subjects had been enrolled, when sequential analysis found no significant difference in women's perception of pain. Nine women reported more pain on the side with more anesthetic and eight reported more pain on the side with less. Because the amount of anesthetic did not make a difference, the null hypothesis (that women cannot discriminate between different anesthetic techniques) was accepted. Women are not able to discriminate between different doses of local anesthetic applied to opposite sides of the cervix.
NASA Astrophysics Data System (ADS)
Chen, Xinjia; Lacy, Fred; Carriere, Patrick
2015-05-01
Sequential test algorithms are playing increasingly important roles in quickly detecting network intrusions such as port scanning. Because such algorithms are usually analyzed by intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and the average detection time up to arbitrarily pre-specified accuracy.
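The flavor of such an exact computation can be sketched in a few lines (hypothetical parameters, not the paper's algorithm): for a truncated SPRT on Bernoulli observations, the kind of test used in sequential portscan detectors, the log-likelihood-ratio walk lives on an integer lattice, so absorption probabilities can be propagated exactly step by step:

```python
import math

# Illustrative design: under H0 a probed connection fails w.p. p0 (benign host),
# under H1 w.p. p1 (scanner). p1 = 1 - p0 makes the LLR steps symmetric.
p0, p1 = 0.2, 0.8
alpha, beta = 0.01, 0.01           # nominal error rates -> Wald's thresholds
A = math.log((1 - beta) / alpha)   # alarm threshold on the log-likelihood ratio
B = math.log(beta / (1 - alpha))   # accept-H0 threshold
step = math.log(p1 / p0)           # LLR moves +step per failure, -step per success
k_hi = math.ceil(A / step)         # alarm when net count k >= k_hi
k_lo = math.floor(B / step)        # accept H0 when k <= k_lo
N = 100                            # truncation horizon

def alarm_prob(p):
    """Exact P(alarm by step N) when each observation is a failure w.p. p."""
    states = {0: 1.0}              # net count k -> probability the walk is alive at k
    p_alarm = 0.0
    for _ in range(N):
        nxt = {}
        for k, mass in states.items():
            for dk, q in ((+1, p), (-1, 1 - p)):
                s = k + dk
                if s >= k_hi:
                    p_alarm += mass * q          # crossed the alarm threshold
                elif s > k_lo:
                    nxt[s] = nxt.get(s, 0.0) + mass * q
                # s <= k_lo: absorbed as benign, never alarms
        states = nxt
    return p_alarm

fa = alarm_prob(p0)   # exact false-alarm probability under H0
det = alarm_prob(p1)  # exact detection probability under H1
print(fa, det)
```

Because the walk is absorbed almost surely well before N = 100, the truncation loss is negligible; the same dynamic program extended with a step counter yields the exact average detection time.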
NASA Astrophysics Data System (ADS)
Zhang, Wei; Bi, Zhengzheng; Shen, Dehua
2017-02-01
This paper investigates the impact of investor structure on the price-volume relationship by simulating a continuous double auction market. Connected with the underlying mechanisms of the price-volume relationship, i.e., the Mixture of Distribution Hypothesis (MDH) and the Sequential Information Arrival Hypothesis (SIAH), the simulation results show that: (1) there exists a strong lead-lag relationship between the return volatility and trading volume when the number of informed investors is close to the number of uninformed investors in the market; (2) as more and more informed investors enter the market, the lead-lag relationship becomes weaker and weaker, while the contemporaneous relationship between the return volatility and trading volume becomes more prominent; (3) when the informed investors are in the absolute majority, the market can achieve the new equilibrium immediately. Therefore, we can conclude that the investor structure is a key factor in affecting the price-volume relationship.
Maurer, Willi; Jones, Byron; Chen, Ying
2018-05-10
In a 2×2 crossover trial for establishing average bioequivalence (ABE) of a generic agent and a currently marketed drug, the recommended approach to hypothesis testing is the two one-sided tests (TOST) procedure, which depends, among other things, on the estimated within-subject variability. The power of this procedure, and therefore the sample size required to achieve a minimum power, depends on having a good estimate of this variability. When there is uncertainty, it is advisable to plan the design in two stages, with an interim sample size reestimation after the first stage, using an interim estimate of the within-subject variability. One method and three variations of doing this were proposed by Potvin et al. Using simulation, the operating characteristics, including the empirical type I error rate, of the four variations (called Methods A, B, C, and D) were assessed by Potvin et al, and Methods B and C were recommended. However, none of these four variations formally controls the type I error rate of falsely claiming ABE, even though the amount of inflation produced by Method C was considered acceptable. A major disadvantage of assessing type I error rate inflation using simulation is that unless all possible scenarios for the intended design and analysis are investigated, it is impossible to be sure that the type I error rate is controlled. Here, we propose an alternative, principled method of sample size reestimation that is guaranteed to control the type I error rate at any given significance level. This method uses a new version of the inverse-normal combination of p-values test, in conjunction with standard group sequential techniques, that is more robust to large deviations in initial assumptions regarding the variability of the pharmacokinetic endpoints. The sample size reestimation step is based on significance levels and power requirements that are conditional on the first-stage results.
This necessitates a discussion and exploitation of the peculiar properties of the power curve of the TOST testing procedure. We illustrate our approach with an example based on a real ABE study and compare the operating characteristics of our proposed method with those of Method B of Potvin et al. Copyright © 2018 John Wiley & Sons, Ltd.
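As a rough illustration of the TOST decision rule itself (made-up log-scale differences; a normal approximation stands in for the t distribution, and the 2×2 crossover structure is ignored for brevity):

```python
import math
from statistics import NormalDist, mean, stdev

# Made-up within-subject differences log(T) - log(R); regulatory limits 0.80-1.25.
diffs = [0.05, -0.12, 0.08, 0.02, -0.03, 0.10, -0.06, 0.04,
         0.07, -0.01, 0.03, -0.08, 0.06, 0.00, 0.02, -0.04]
lo, hi = math.log(0.8), math.log(1.25)  # equivalence limits on the log scale
alpha = 0.05

n = len(diffs)
d_bar, se = mean(diffs), stdev(diffs) / math.sqrt(n)
z = NormalDist()

# Two one-sided tests: H01: mu <= log(0.8) and H02: mu >= log(1.25),
# each at level alpha; ABE is concluded only if BOTH are rejected.
p_lower = 1 - z.cdf((d_bar - lo) / se)   # test against the lower limit
p_upper = z.cdf((d_bar - hi) / se)       # test against the upper limit
abe = p_lower < alpha and p_upper < alpha

# Equivalent formulation: the (1 - 2*alpha) confidence interval for the mean
# difference must lie entirely inside (lo, hi).
zq = z.inv_cdf(1 - alpha)
ci = (d_bar - zq * se, d_bar + zq * se)
print(abe, ci)
```

The dual characterization (two rejected one-sided tests versus a 90% CI inside the limits) is what gives the TOST power curve the peculiar shape the abstract alludes to: power can be non-monotone in the variance estimate.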
Individuation of Pairs of Objects in Infancy
ERIC Educational Resources Information Center
Leslie, Alan M.; Chen, Marian L.
2007-01-01
Looking-time studies examined whether 11-month-old infants can individuate two pairs of objects using only shape information. In order to test individuation, the object pairs were presented sequentially. Infants were familiarized either with the sequential pairs, disk-triangle/disk-triangle (XY/XY), whose shapes differed within but not across…
in Sequential Design Optimization with Concurrent Calibration-Based Model Validation
Drignei, Dorin; Mourelatos, Zissimos; Pandey, Vijitashwa
2013-08-01
Phosphate attenuates the anti-proteinuric effect of very low-protein diet in CKD patients.
Di Iorio, Biagio R; Bellizzi, Vincenzo; Bellasi, Antonio; Torraca, Serena; D'Arrigo, Graziella; Tripepi, Giovanni; Zoccali, Carmine
2013-03-01
High phosphate levels attenuate nephroprotection through angiotensin-converting enzyme inhibition in patients with proteinuric chronic kidney disease (CKD). Whether this phenomenon holds true for other nephroprotective interventions like very-low-protein diet (VLPD) is unknown. We tested the hypothesis that phosphate interferes with the anti-proteinuric response to VLPD in a non-randomized, sequential study in 99 proteinuric CKD patients who sequentially underwent low-protein diet (LPD; 0.6 g/kg) and VLPD (0.3 g/kg) supplemented with keto-analogues, each for periods longer than 1 year. Serum phosphate significantly reduced during VLPD (3.2 ± 0.6 mg/dL) when compared with LPD (3.7 ± 0.6 mg/dL, P < 0.001), an effect paralleled by a substantial decline in phosphate excretion (LPD, 649 ± 180 mg/day; VLPD, 462 ± 97 mg/day; P < 0.001). The median proteinuria during LPD was 1910 mg/24 h (interquartile range: 1445-2376 mg/24 h) and reduced to 987 mg/24 h (656-1300 mg/24 h) during VLPD (P < 0.001). No significant change in the estimated glomerular filtration rate (eGFR) was observed during the two diet periods. In linear mixed models including the diagnosis of renal disease, eGFR, 24-h urine sodium and urea and other potential confounders, there was a strong interaction between serum phosphate (P = 0.04) and phosphaturia (P < 0.001) with the anti-proteinuric response to VLPD. Accordingly, 24-h proteinuria reduced modestly in patients who maintained relatively higher serum phosphate levels or relatively higher phosphaturia to be maximal in those who achieved the lowest level of serum and urine phosphate. Phosphate is an important modifier of the anti-proteinuric response to VLPD. Reducing phosphate burden may decrease proteinuria and slow the progression of renal disease in CKD patients, an issue that remains to be tested in specific clinical trials.
Sequential Objective Structured Clinical Examination based on item response theory in Iran.
Hejri, Sara Mortaz; Jalili, Mohammad
2017-01-01
In a sequential objective structured clinical examination (OSCE), all students initially take a short screening OSCE. Examinees who pass are excused from further testing, but an additional OSCE is administered to the remaining examinees. Previous investigations of sequential OSCE were based on classical test theory. We aimed to design and evaluate screening OSCEs based on item response theory (IRT). We carried out a retrospective observational study. At each station of a 10-station OSCE, the students' performance was graded on a Likert-type scale. Since the data were polytomous, the difficulty parameters, discrimination parameters, and students' ability were calculated using a graded response model. To design several screening OSCEs, we identified the 5 most difficult stations and the 5 most discriminative ones. For each test, 5, 4, or 3 stations were selected. Normal and stringent cut-scores were defined for each test. We compared the results of each of the 12 screening OSCEs to the main OSCE and calculated the positive and negative predictive values (PPV and NPV), as well as the exam cost. A total of 253 students (95.1%) passed the main OSCE, while 72.6% to 94.4% of examinees passed the screening tests. The PPV values ranged from 0.98 to 1.00, and the NPV values ranged from 0.18 to 0.59. Two tests effectively predicted the results of the main exam, resulting in financial savings of 34% to 40%. If stations with the highest IRT-based discrimination values and stringent cut-scores are utilized in the screening test, sequential OSCE can be an efficient and convenient way to conduct an OSCE.
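The graded response model behind these calculations can be sketched as follows (hypothetical item parameters): each boundary probability P(X ≥ k | θ) is a logistic function of θ − b_k, and category probabilities are differences of adjacent boundary curves:

```python
import math

def boundary(theta, a, b):
    """P(X >= k | theta) for one boundary with discrimination a, threshold b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def category_probs(theta, a, bs):
    """Probabilities of the len(bs)+1 ordered categories under the GRM."""
    # Boundary curves, padded with P(X >= lowest) = 1 and P(X > highest) = 0.
    stars = [1.0] + [boundary(theta, a, b) for b in bs] + [0.0]
    return [stars[k] - stars[k + 1] for k in range(len(bs) + 1)]

# A hypothetical 5-point Likert-type station: discrimination 1.5,
# ordered thresholds -1, 0, 1, 2, evaluated for a student of ability 0.5.
probs = category_probs(theta=0.5, a=1.5, bs=[-1.0, 0.0, 1.0, 2.0])
print([round(p, 3) for p in probs])
```

Because the thresholds are ordered, the boundary curves are nested and every category probability is positive; summing the station-level log-likelihoods over stations is what drives the ability and parameter estimation described above.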
Scaccianoce, Giuseppe; Hassan, Cesare; Panarese, Alba; Piglionica, Donato; Morini, Sergio; Zullo, Angelo
2006-01-01
BACKGROUND Helicobacter pylori eradication rates achieved by standard seven-day triple therapies are decreasing in several countries, while a novel 10-day sequential regimen has achieved a very high success rate. A longer 10-day triple therapy, similar to the sequential regimen, was tested to see whether it could achieve a better infection cure rate. METHODS Patients with nonulcer dyspepsia and H pylori infection were randomly assigned to one of the following three therapies: esomeprazole 20 mg, clarithromycin 500 mg and amoxycillin 1 g for seven days or 10 days, or a 10-day sequential regimen including esomeprazole 20 mg plus amoxycillin 1 g for five days and esomeprazole 20 mg, clarithromycin 500 mg and tinidazole 500 mg for the remaining five days. All drugs were given twice daily. H pylori eradication was checked four to six weeks after treatment by using a 13C-urea breath test. RESULTS Overall, 213 patients were enrolled. H pylori eradication was achieved in 75.7% and 77.9%, in 81.7% and 84.1%, and in 94.4% and 97.1% of patients following seven-day or 10-day triple therapy and the 10-day sequential regimen, at intention-to-treat and per protocol analyses, respectively. The eradication rate following the sequential regimen was higher than either seven-day (P=0.002) or 10-day triple therapy (P=0.02), while no significant difference emerged between the latter two regimens (P=0.6). CONCLUSIONS The 10-day sequential regimen was significantly more effective than both triple regimens, while 10-day triple therapy failed to significantly increase the H pylori eradication rate achieved by the standard seven-day regimen. PMID:16482238
Globalization and human cooperation
Buchan, Nancy R.; Grimalda, Gianluca; Wilson, Rick; Brewer, Marilynn; Fatas, Enrique; Foddy, Margaret
2009-01-01
Globalization magnifies the problems that affect all people and that require large-scale human cooperation, for example, the overharvesting of natural resources and human-induced global warming. However, what does globalization imply for the cooperation needed to address such global social dilemmas? Two competing hypotheses are offered. One hypothesis is that globalization prompts reactionary movements that reinforce parochial distinctions among people. Large-scale cooperation then focuses on favoring one's own ethnic, racial, or language group. The alternative hypothesis suggests that globalization strengthens cosmopolitan attitudes by weakening the relevance of ethnicity, locality, or nationhood as sources of identification. In essence, globalization, the increasing interconnectedness of people worldwide, broadens the group boundaries within which individuals perceive they belong. We test these hypotheses by measuring globalization at both the country and individual levels and analyzing the relationship between globalization and individual cooperation with distal others in multilevel sequential cooperation experiments in which players can contribute to individual, local, and/or global accounts. Our samples were drawn from the general populations of the United States, Italy, Russia, Argentina, South Africa, and Iran. We find that as country and individual levels of globalization increase, so too does individual cooperation at the global level vis-à-vis the local level. In essence, “globalized” individuals draw broader group boundaries than others, eschewing parochial motivations in favor of cosmopolitan ones. Globalization may thus be fundamental in shaping contemporary large-scale cooperation and may be a positive force toward the provision of global public goods. PMID:19255433
Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S
2014-09-01
Many papers have introduced adaptive clinical trial methods that allow modifications to the sample size based on interim estimates of treatment effect. There has been extensive commentary on type I error control and efficiency considerations, but little research on estimation after an adaptive hypothesis test. We evaluate the reliability and precision of different inferential procedures in the presence of an adaptive design with pre-specified rules for modifying the sampling plan. We extend group sequential orderings of the outcome space based on the stage at stopping, likelihood ratio statistic, and sample mean to the adaptive setting in order to compute median-unbiased point estimates, exact confidence intervals, and P-values uniformly distributed under the null hypothesis. The likelihood ratio ordering is found to average shorter confidence intervals and produce higher probabilities of P-values below important thresholds than alternative approaches. The bias adjusted mean demonstrates the lowest mean squared error among candidate point estimates. A conditional error-based approach in the literature has the benefit of being the only method that accommodates unplanned adaptations. We compare the performance of this and other methods in order to quantify the cost of failing to plan ahead in settings where adaptations could realistically be pre-specified at the design stage. We find the cost to be meaningful for all designs and treatment effects considered, and to be substantial for designs frequently proposed in the literature. © 2014, The International Biometric Society.
Modulation of V1 Spike Response by Temporal Interval of Spatiotemporal Stimulus Sequence
Kim, Taekjun; Kim, HyungGoo R.; Kim, Kayeon; Lee, Choongkil
2012-01-01
The spike activity of single neurons of the primary visual cortex (V1) becomes more selective and reliable in response to wide-field natural scenes compared to smaller stimuli confined to the classical receptive field (RF). However, it is largely unknown what aspects of natural scenes increase the selectivity of V1 neurons. One hypothesis is that modulation by surround interaction is highly sensitive to small changes in spatiotemporal aspects of RF surround. Such a fine-tuned modulation would enable single neurons to hold information about spatiotemporal sequences of oriented stimuli, which extends the role of V1 neurons as a simple spatiotemporal filter confined to the RF. In the current study, we examined the hypothesis in the V1 of awake behaving monkeys, by testing whether the spike response of single V1 neurons is modulated by temporal interval of spatiotemporal stimulus sequence encompassing inside and outside the RF. We used two identical Gabor stimuli that were sequentially presented with a variable stimulus onset asynchrony (SOA): the preceding one (S1) outside the RF and the following one (S2) in the RF. This stimulus configuration enabled us to examine the spatiotemporal selectivity of response modulation from a focal surround region. Although S1 alone did not evoke spike responses, visual response to S2 was modulated for SOA in the range of tens of milliseconds. These results suggest that V1 neurons participate in processing spatiotemporal sequences of oriented stimuli extending outside the RF. PMID:23091631
Sequential biases in accumulating evidence
Huggins, Richard; Dogo, Samson Henry
2015-01-01
Whilst it is common in clinical trials to use the results of tests at one phase to decide whether to continue to the next phase and to subsequently design the next phase, we show that this can lead to biased results in evidence synthesis. Two new kinds of bias associated with accumulating evidence, termed ‘sequential decision bias’ and ‘sequential design bias’, are identified. Both kinds of bias are the result of making decisions on the usefulness of a new study, or its design, based on the previous studies. Sequential decision bias is determined by the correlation between the value of the current estimated effect and the probability of conducting an additional study. Sequential design bias arises from using the estimated value instead of the clinically relevant value of an effect in sample size calculations. We considered both the fixed‐effect and the random‐effects models of meta‐analysis and demonstrated analytically and by simulations that in both settings the problems due to sequential biases are apparent. According to our simulations, the sequential biases increase with increased heterogeneity. Minimisation of sequential biases arises as a new and important research area necessary for successful evidence‐based approaches to the development of science. © 2015 The Authors. Research Synthesis Methods Published by John Wiley & Sons Ltd. PMID:26626562
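A toy simulation (an illustrative setup, not the authors' model) shows sequential decision bias directly: with a true effect of zero, running a follow-up study only when the first estimate looks promising, and then pooling whatever studies were run, biases the combined estimate:

```python
import random

random.seed(1)

def reported_estimate():
    """One replicate: run study 2 only if study 1 looks promising, then pool."""
    x1 = random.gauss(0.0, 1.0)   # study 1 estimate; true effect is 0, SE = 1
    if x1 <= 0:
        return x1                 # not promising: stop, report study 1 alone
    x2 = random.gauss(0.0, 1.0)   # promising: run study 2
    return (x1 + x2) / 2.0        # fixed-effect pooled estimate

reps = 100_000
bias = sum(reported_estimate() for _ in range(reps)) / reps
print(bias)  # systematically below the true value of 0
```

Here the probability of conducting the second study is correlated with the current estimate, exactly the mechanism the abstract identifies; large first-stage estimates get diluted by pooling while small ones are reported as-is, so the expected reported effect is about −0.2 even though the true effect is 0.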
Hypothesis testing in hydrology: Theory and practice
NASA Astrophysics Data System (ADS)
Kirchner, James; Pfister, Laurent
2017-04-01
Well-posed hypothesis tests have spurred major advances in hydrological theory. However, a random sample of recent research papers suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias - the tendency to value and trust confirmations more than refutations - among both researchers and reviewers. Hypothesis testing is not the only recipe for scientific progress, however: exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.
Testing the null hypothesis: the forgotten legacy of Karl Popper?
Wilkinson, Mick
2013-01-01
Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate the new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.
Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.
Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter
2015-12-01
Multiple hypothesis testing collects a series of techniques, usually based on p-values, as a summary of the available evidence from many statistical tests. In hypothesis testing under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
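Wald's sequential test of a binomial proportion, the statistical backbone described above, can be sketched as follows (illustrative design values, not DOEPOD's actual parameters): hit/miss inspection outcomes accumulate a log-likelihood ratio that stops the test itself, so the number of observations is not fixed in advance.

```python
import math

# Hypothetical design: H1: POD p1 = 0.90 versus H0: p0 = 0.60,
# with nominal error rates alpha = beta = 0.05.
p0, p1 = 0.60, 0.90
alpha, beta = 0.05, 0.05
upper = math.log((1 - beta) / alpha)   # accept H1 (capability demonstrated)
lower = math.log(beta / (1 - alpha))   # accept H0 (capability not demonstrated)

def sprt(outcomes):
    """Return ('accept p1' | 'accept p0' | 'continue', n used) for a hit/miss stream."""
    llr = 0.0
    for n, hit in enumerate(outcomes, start=1):
        llr += math.log(p1 / p0) if hit else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept p1", n
        if llr <= lower:
            return "accept p0", n
    return "continue", len(outcomes)

# A run of hits accumulates log(0.9/0.6) ~= 0.405 per hit, crossing
# log(19) ~= 2.944 after 8 consecutive hits; misses push the walk down faster.
decision, n = sprt([1] * 20)
print(decision, n)
```

This illustrates the merit the abstract cites: with clean data the test stops after far fewer observations than a fixed-sample binomial demonstration would require, while the error rates remain approximately controlled at the nominal levels.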
A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.
ERIC Educational Resources Information Center
Liu, Tung; Stone, Courtenay C.
1999-01-01
Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…
Apollo experience report: Command and service module sequential events control subsystem
NASA Technical Reports Server (NTRS)
Johnson, G. W.
1975-01-01
The Apollo command and service module sequential events control subsystem is described, with particular emphasis on the major systems and component problems and solutions. The subsystem requirements, design, and development and the test and flight history of the hardware are discussed. Recommendations to avoid similar problems on future programs are outlined.
Sequential color video to parallel color video converter
NASA Technical Reports Server (NTRS)
1975-01-01
The engineering design, development, breadboard fabrication, test, and delivery of a breadboard field sequential color video to parallel color video converter is described. The converter was designed for use onboard a manned space vehicle to eliminate a flickering TV display picture and to reduce the weight and bulk of previous ground conversion systems.
Alternating and Sequential Motion Rates in Older Adults
ERIC Educational Resources Information Center
Pierce, John E.; Cotton, Susan; Perry, Alison
2013-01-01
Background: Alternating motion rate (AMR) and sequential motion rate (SMR) are tests of articulatory diadochokinesis that are widely used in the evaluation of motor speech. However, there are no quality normative data available for adults aged 65 years and older. Aims: There were two aims: (1) to obtain a representative, normative dataset of…
NASA Astrophysics Data System (ADS)
Thamvichai, Ratchaneekorn; Huang, Liang-Chih; Ashok, Amit; Gong, Qian; Coccarelli, David; Greenberg, Joel A.; Gehm, Michael E.; Neifeld, Mark A.
2017-05-01
We employ an adaptive measurement system, based on the sequential hypothesis testing (SHT) framework, for detecting material-based threats using experimental data acquired on an X-ray experimental testbed system. This testbed employs 45-degree fan-beam geometry and 15 views over a 180-degree span to generate energy-sensitive X-ray projection data. Using this testbed system, we acquire multiple-view projection data for 200 bags. We consider an adaptive measurement design where the X-ray projection measurements are acquired in a sequential manner and the adaptation occurs through the choice of the optimal "next" source/view system parameter. Our analysis of such an adaptive measurement design using the experimental data demonstrates a 3x-7x reduction in the probability of error relative to a static measurement design. Here the static measurement design refers to the operational system baseline that corresponds to a sequential measurement using all the available sources/views. We also show that by using adaptive measurements it is possible to reduce the number of sources/views by nearly 50% compared to a system that relies on static measurements.
Härkönen, Kati; Kivekäs, Ilkka; Rautiainen, Markus; Kotti, Voitto; Sivonen, Ville; Vasama, Juha-Pekka
2015-05-01
This prospective study shows that working performance, quality of life (QoL), and quality of hearing (QoH) are better with two cochlear implants (CIs) than with one. The impact of the second CI on the patient's QoL is as significant as the impact of the first CI. To evaluate the benefits of sequential bilateral cochlear implantation for working performance, QoL, and QoH, we studied working performance, work-related stress, QoL, and QoH with specific questionnaires in 15 patients with a unilateral CI scheduled for sequential implantation of the other ear. Sound localization performance and speech perception in noise were measured with specific tests. All questionnaires and tests were performed before the second CI surgery and 6 and 12 months after its activation. Bilateral CIs increased patients' working performance, and their work-related stress and fatigue decreased. Communication with co-workers was easier and patients were more active in their working environment. Sequential bilateral cochlear implantation improved QoL, QoH, sound localization, and speech perception in noise statistically significantly.
Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry
2017-05-01
The United States's Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to achieve Type I and Type II error rates comparable to those of the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to the current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
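The fixed-sample side of this comparison, an exact binomial test on the count of exceedance samples, can be sketched as follows. The null exceedance rate and significance level below are illustrative assumptions, not California's actual 303(d) listing policy:

```python
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def list_as_impaired(n_samples, n_exceedances, p0=0.1, alpha=0.05):
    """Fixed-sample exact binomial decision: declare the water body
    impaired if the observed exceedance count is improbably large
    under the acceptable exceedance rate p0. Unlike the SPRT, this
    rule needs all n_samples in hand before it can decide."""
    return binom_sf(n_exceedances, n_samples, p0) < alpha
```

The contrast with a sequential rule is that this test is evaluated once, after a predetermined number of samples, which is exactly the average-sample-size cost the abstract quantifies.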
NASA Astrophysics Data System (ADS)
Barrientos, Rafael; Virgós, Emilio
2006-07-01
The common genet (Genetta genetta) and the stone marten (Martes foina) are two species that overlap extensively in their distribution ranges in southwest Europe. Available diet data from these species allow us to predict some interference competition for food resources in sympatric populations. We tested the food interference hypothesis in a sympatric population. The diet of both predators was analyzed through scat collection. Seasonal differences in biomass consumption were compared between the two species for those items considered key resources according to biomass consumption. Strawberry tree fruits can be considered a key resource exclusively for genets, whereas fungi, blackberries and rabbits are key resources for stone martens only. For other key resources consumed by both species (wood mouse and figs), we suggest that a possible mechanism to reduce diet overlap could be the sequential use of these resources: no intensive exploitation by both species of the same key resource during the same season was detected. Figs and wood mouse were used alternately. Although strawberry tree fruits and blackberry are exclusive key resources of one of the species, their consumption showed the same pattern. Diet niche overlap in our study is low compared with other carnivore communities, suggesting that exclusive use of some key resources and sequential use of shared ones is an optimal scenario to reduce overall competition for food resources.
Giobbie-Hurder, Anita; Price, Karen N; Gelber, Richard D
2009-06-01
Aromatase inhibitors provide superior disease control when compared with tamoxifen as adjuvant therapy for postmenopausal women with endocrine-responsive early breast cancer. To present the design, history, and analytic challenges of the Breast International Group (BIG) 1-98 trial: an international, multicenter, randomized, double-blind, phase-III study comparing the aromatase inhibitor letrozole with tamoxifen in this clinical setting. From 1998-2003, BIG 1-98 enrolled 8028 women to receive monotherapy with either tamoxifen or letrozole for 5 years, or sequential therapy of 2 years of one agent followed by 3 years of the other. Randomization to one of four treatment groups permitted two complementary analyses to be conducted several years apart. The first, reported in 2005, provided a head-to-head comparison of letrozole versus tamoxifen. Statistical power was increased by an enriched design, which included patients who were assigned sequential treatments until the time of the treatment switch. The second, reported in late 2008, used a conditional landmark approach to test the hypothesis that switching endocrine agents at approximately 2 years from randomization for patients who are disease-free is superior to continuing with the original agent. The 2005 analysis showed the superiority of letrozole compared with tamoxifen. The patients who were assigned tamoxifen alone were unblinded and offered the opportunity to switch to letrozole. Results from other trials increased the clinical relevance of whether to start treatment with letrozole or tamoxifen, and analysis plans were expanded to evaluate sequential versus single-agent strategies from randomization. Due to the unblinding of patients assigned tamoxifen alone, analysis of updated data will require ascertainment of the influence of selective crossover from tamoxifen to letrozole.
BIG 1-98 is an example of an enriched design, involving complementary analyses addressing different questions several years apart, and subject to evolving analytic plans influenced by new data that emerge over time.
Lin, Carol Y; Li, Ling
2016-11-07
HPV DNA diagnostic tests for epidemiology monitoring (research purpose) or cervical cancer screening (clinical purpose) have often been considered separately. Women with positive Linear Array (LA) polymerase chain reaction (PCR) research test results typically are neither informed nor referred for colposcopy. Recently, sequential testing using the Hybrid Capture 2 (HC2) HPV clinical test as a triage before genotyping by LA has been adopted for monitoring HPV infections. Also, HC2 has been reported as a more feasible screening approach for cervical cancer in low-resource countries. Thus, knowing the performance of testing strategies incorporating an HPV clinical test (i.e., HC2-only or using HC2 as a triage before genotyping by LA) compared with LA-only testing in measuring HPV prevalence will be informative for public health practice. We conducted a Monte Carlo simulation study. Data were generated using mathematical algorithms. We designated the reported HPV infection prevalence in the U.S. and Latin America as the "true" underlying type-specific HPV prevalence. Analytical sensitivity of HC2 for detecting 14 high-risk (oncogenic) types was considered to be less than that of LA. Estimated-to-true prevalence ratios and percentage reductions were calculated. When the "true" HPV prevalence was designated as the reported prevalence in the U.S., with LA genotyping sensitivity and specificity of (0.95, 0.95), estimated-to-true prevalence ratios of 14 high-risk types were 2.132, 1.056, and 0.958 for LA-only, HC2-only, and sequential testing, respectively. Estimated-to-true prevalence ratios of two vaccine-associated high-risk types were 2.359 and 1.063 for LA-only and sequential testing, respectively. When the designated type-specific prevalences of HPV16 and 18 were reduced by 50%, using either LA-only or sequential testing, prevalence estimates were reduced by 18%.
Estimated-to-true HPV infection prevalence ratios using LA-only testing strategy are generally higher than using HC2-only or using HC2 as a triage before genotype by LA. HPV clinical testing can be incorporated to monitor HPV prevalence or vaccine effectiveness. Caution is needed when comparing apparent prevalence from different testing strategies.
Naik, Umesh Chandra; Das, Mihir Tanay; Sauran, Swati; Thakur, Indu Shekhar
2014-03-01
The present study compares in vitro toxicity of electroplating effluent after the batch treatment process with that obtained after the sequential treatment process. Activated charcoal prepared from sugarcane bagasse through chemical carbonization, and tolerant indigenous bacteria, Bacillus sp. strain IST105, were used individually and sequentially for the treatment of electroplating effluent. The sequential treatment involving activated charcoal followed by bacterial treatment removed 99% of Cr(VI) compared with the batch processes, which removed 40% (charcoal) and 75% (bacteria), respectively. Post-treatment in vitro cyto/genotoxicity was evaluated by the MTT test and the comet assay in human HuH-7 hepatocarcinoma cells. The sequentially treated sample showed an increase in LC50 value with a 6-fold decrease in comet-assay DNA migration compared with that of untreated samples. A significant decrease in DNA migration and an increase in LC50 value of treated effluent proved the higher effectiveness of the sequential treatment process over the individual batch processes. Copyright © 2014 Elsevier B.V. All rights reserved.
Su, Chun-Lung; Gardner, Ian A; Johnson, Wesley O
2004-07-30
The two-test two-population model, originally formulated by Hui and Walter, for estimation of test accuracy and prevalence assumes conditionally independent tests, constant accuracy across populations and binomial sampling. The binomial assumption is incorrect if all individuals in a population (e.g., a child-care centre, a village in Africa, or a cattle herd) are sampled, or if the sample size is large relative to the population size. In this paper, we develop statistical methods for evaluating diagnostic test accuracy and prevalence estimation based on finite sample data in the absence of a gold standard. Moreover, two tests are often applied simultaneously for the purpose of obtaining a 'joint' testing strategy that has either higher overall sensitivity or specificity than either of the two tests considered singly. Sequential versions of such strategies are often applied in order to reduce the cost of testing. We thus discuss joint (simultaneous and sequential) testing strategies and inference for them. Using the developed methods, we analyse two real and one simulated data sets, and we compare 'hypergeometric' and 'binomial-based' inferences. Our findings indicate that the posterior standard deviations for prevalence (but not sensitivity and specificity) based on finite population sampling tend to be smaller than their counterparts for infinite population sampling. Finally, we make recommendations about how small the sample size should be relative to the population size to warrant use of the binomial model for prevalence estimation. Copyright 2004 John Wiley & Sons, Ltd.
Phonological awareness and writing skills in children with Down syndrome.
Lavra-Pinto, Bárbara de; Lamprecht, Regina Ritter
2010-01-01
Down syndrome, phonological awareness, writing and working memory. To evaluate the phonological awareness of Brazilian children with Down syndrome; to analyze the relationship between the writing hypothesis and the phonological awareness scores of the participants; to compare the performance of children with Down syndrome to that of children with typical development according to the Phonological Awareness: Tool for sequential evaluation (PHONATSE), using the writing hypothesis as a matching criterion; to verify the correlation between the phonological awareness measurements and the phonological working memory. A group of eleven children aged between 7 and 14 years (average: 9 y 10 m) was selected for the study. Phonological awareness was evaluated using the PHONATSE. The phonological working memory was evaluated through an instrument developed by the researcher. All subjects presented measurable levels of phonological awareness through the PHONATSE. The phonological awareness scores and the writing hypothesis presented a significant positive association. The performance of children with Down syndrome was significantly lower than that of children with typical development who presented the same writing hypothesis. Measurements of phonological awareness and phonological working memory presented significant positive correlations. The phonological awareness of Brazilian children with Down syndrome can be evaluated through the PHONATSE. Syllable awareness improves with literacy, whereas phonemic awareness seems to result from written language learning. The phonological working memory influences the performance of children with Down syndrome in phonological awareness tasks.
P value and the theory of hypothesis testing: an explanation for new researchers.
Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël
2010-03-01
In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
Vascular Effects of Early versus Late Postmenopausal Treatment with Estradiol.
Hodis, Howard N; Mack, Wendy J; Henderson, Victor W; Shoupe, Donna; Budoff, Matthew J; Hwang-Levine, Juliana; Li, Yanjie; Feng, Mei; Dustin, Laurie; Kono, Naoko; Stanczyk, Frank Z; Selzer, Robert H; Azen, Stanley P
2016-03-31
Data suggest that estrogen-containing hormone therapy is associated with beneficial effects with regard to cardiovascular disease when the therapy is initiated temporally close to menopause but not when it is initiated later. However, the hypothesis that the cardiovascular effects of postmenopausal hormone therapy vary with the timing of therapy initiation (the hormone-timing hypothesis) has not been tested. A total of 643 healthy postmenopausal women were stratified according to time since menopause (<6 years [early postmenopause] or ≥10 years [late postmenopause]) and were randomly assigned to receive either oral 17β-estradiol (1 mg per day, plus progesterone [45 mg] vaginal gel administered sequentially [i.e., once daily for 10 days of each 30-day cycle] for women with a uterus) or placebo (plus sequential placebo vaginal gel for women with a uterus). The primary outcome was the rate of change in carotid-artery intima-media thickness (CIMT), which was measured every 6 months. Secondary outcomes included an assessment of coronary atherosclerosis by cardiac computed tomography (CT), which was performed when participants completed the randomly assigned regimen. After a median of 5 years, the effect of estradiol, with or without progesterone, on CIMT progression differed between the early and late postmenopause strata (P=0.007 for the interaction). Among women who were less than 6 years past menopause at the time of randomization, the mean CIMT increased by 0.0078 mm per year in the placebo group versus 0.0044 mm per year in the estradiol group (P=0.008). Among women who were 10 or more years past menopause at the time of randomization, the rates of CIMT progression in the placebo and estradiol groups were similar (0.0088 and 0.0100 mm per year, respectively; P=0.29). CT measures of coronary-artery calcium, total stenosis, and plaque did not differ significantly between the placebo group and the estradiol group in either postmenopause stratum. 
Oral estradiol therapy was associated with less progression of subclinical atherosclerosis (measured as CIMT) than was placebo when therapy was initiated within 6 years after menopause but not when it was initiated 10 or more years after menopause. Estradiol had no significant effect on cardiac CT measures of atherosclerosis in either postmenopause stratum. (Funded by the National Institute on Aging, National Institutes of Health; ELITE ClinicalTrials.gov number, NCT00114517.).
Methylphenidate does not enhance visual working memory but benefits motivation in macaque monkeys.
Oemisch, Mariann; Johnston, Kevin; Paré, Martin
2016-10-01
Working memory is a limited-capacity cognitive process that retains relevant information temporarily to guide thoughts and behavior. A large body of work has suggested that catecholamines exert a major modulatory influence on cognition, but there is only equivocal evidence of a direct influence on working memory ability, which would be reflected in a dependence on working memory load. Here we tested the contribution of catecholamines to working memory by administering a wide range of acute oral doses of the dopamine and norepinephrine reuptake inhibitor methylphenidate (MPH, 0.1-9 mg/kg) to three female macaque monkeys (Macaca mulatta), whose working memory ability was measured from their performance in a visual sequential comparison task. This task allows the systematic manipulation of working memory load, and we therefore tested the specific hypothesis that MPH modulates performance in a manner that depends on both dose and memory load. We found no evidence of a dose- or memory load-dependent effect of MPH on performance. In contrast, significant effects on measures of motivation were observed. These findings suggest that an acute increase in catecholamines does not seem to affect the retention of visual information per se. As such, these results help delimit the effects of MPH on cognition. Copyright © 2016 Elsevier Ltd. All rights reserved.
The role of influenza in the epidemiology of pneumonia
Shrestha, Sourya; Foxman, Betsy; Berus, Joshua; van Panhuis, Willem G.; Steiner, Claudia; Viboud, Cécile; Rohani, Pejman
2015-01-01
Interactions arising from sequential viral and bacterial infections play important roles in the epidemiological outcome of many respiratory pathogens. Influenza virus has been implicated in the pathogenesis of several respiratory bacterial pathogens commonly associated with pneumonia. Though clinical evidence supporting this interaction is unambiguous, its population-level effects—magnitude, epidemiological impact and variation during pandemic and seasonal outbreaks—remain unclear. To address these unknowns, we used longitudinal influenza and pneumonia incidence data, at different spatial resolutions and across different epidemiological periods, to infer the nature, timing and the intensity of influenza-pneumonia interaction. We used a mechanistic transmission model within a likelihood-based inference framework to carry out formal hypothesis testing. Irrespective of the source of data examined, we found that influenza infection increases the risk of pneumonia by ~100-fold. We found no support for enhanced transmission or severity impact of the interaction. For model-validation, we challenged our fitted model to make out-of-sample pneumonia predictions during pandemic and non-pandemic periods. The consistency in our inference tests carried out on several distinct datasets, and the predictive skill of our model increase confidence in our overall conclusion that influenza infection substantially enhances the risk of pneumonia, though only for a short period. PMID:26486591
NASA Astrophysics Data System (ADS)
Kannan, Rohit; Tangirala, Arun K.
2014-06-01
Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and sciences such as plant topology reconstruction, fault detection and diagnosis, and neurosciences. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, have emerged over the past two decades. The PDC-based technique is simple and effective, but being a linear directionality measure has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing hypotheses on the absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.
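The sequential Bonferroni procedure invoked above (Holm's step-down method) is generic and can be sketched independently of the KPDC estimator; the p-values it would receive here come from the paper's permutation scheme, but the correction itself is standard:

```python
def holm_rejections(pvalues, alpha=0.05):
    """Holm's step-down (sequential Bonferroni) procedure.

    Sort the m p-values ascending and compare the i-th smallest
    (0-indexed) against alpha / (m - i). Stop at the first failure;
    only the hypotheses before it are rejected. This controls the
    familywise error rate at level alpha, less conservatively than
    plain Bonferroni (which uses alpha / m for every test).
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    rejected = [False] * m
    for rank, idx in enumerate(order):
        if pvalues[idx] <= alpha / (m - rank):
            rejected[idx] = True
        else:
            break  # step-down: once one test fails, all larger p-values fail too
    return rejected
```

For example, with p-values (0.001, 0.04, 0.03, 0.2) and alpha = 0.05, only the first hypothesis is rejected: 0.001 passes the 0.05/4 hurdle, but the next-smallest p-value, 0.03, fails 0.05/3.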
Debates—Hypothesis testing in hydrology: Theory and practice
NASA Astrophysics Data System (ADS)
Pfister, Laurent; Kirchner, James W.
2017-03-01
The basic structure of the scientific method—at least in its idealized form—is widely championed as a recipe for scientific progress, but the day-to-day practice may be different. Here, we explore the spectrum of current practice in hypothesis formulation and testing in hydrology, based on a random sample of recent research papers. This analysis suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias—the tendency to value and trust confirmations more than refutations—among both researchers and reviewers. Nonetheless, as several examples illustrate, hypothesis tests have played an essential role in spurring major advances in hydrological theory. Hypothesis testing is not the only recipe for scientific progress, however. Exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.
Robust Stereo Visual Odometry Using Improved RANSAC-Based Methods for Mobile Robot Localization
Liu, Yanqing; Gu, Yuzhang; Li, Jiamao; Zhang, Xiaolin
2017-01-01
In this paper, we present a novel approach for stereo visual odometry with robust motion estimation that is faster and more accurate than standard RANSAC (Random Sample Consensus). Our method improves RANSAC in three aspects: first, the hypotheses are preferentially generated by sampling the input feature points in order of the ages and similarities of the features; second, the evaluation of hypotheses is performed based on the SPRT (Sequential Probability Ratio Test), which discards bad hypotheses very quickly without verifying all the data points; third, we aggregate the three best hypotheses to obtain the final estimate instead of selecting only the best hypothesis. The first two aspects improve the speed of RANSAC by generating good hypotheses and discarding bad hypotheses early, respectively. The last aspect improves the accuracy of motion estimation. Our method was evaluated on the KITTI (Karlsruhe Institute of Technology and Toyota Technological Institute) and New Tsukuba datasets. Experimental results show that the proposed method achieves better results for both speed and accuracy than RANSAC. PMID:29027935
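The SPRT-based hypothesis evaluation described in the second improvement can be sketched in simplified form. The inlier-rate parameters and decision threshold below are illustrative assumptions, not the paper's tuned values:

```python
def sprt_verify(residuals, threshold, eps_good=0.6, eps_bad=0.1, A=50.0):
    """SPRT-style model verification as used to speed up RANSAC.

    Tests "good model" (expected inlier rate eps_good) against
    "bad model" (inlier rate eps_bad) one data point at a time,
    rejecting the hypothesis as soon as the likelihood ratio of
    "bad" over "good" crosses A -- without touching the remaining
    points. Returns (survived, points_examined).
    """
    lam = 1.0
    for i, r in enumerate(residuals, start=1):
        inlier = abs(r) < threshold
        # Likelihood ratio of the "bad model" over the "good model"
        # for a single inlier/outlier observation.
        lam *= (eps_bad / eps_good) if inlier else ((1 - eps_bad) / (1 - eps_good))
        if lam > A:
            return False, i  # early rejection: model deemed bad
    return True, len(residuals)  # survived full verification
```

A hypothesis whose residuals are all outliers is rejected after only a few points, which is where the speedup over exhaustive consensus counting comes from; a hypothesis with mostly small residuals survives the full pass.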
Wen, D.; Qing, L.; Harrison, G.; Golub, E.; Akintoye, S.O.
2010-01-01
Objectives Bisphosphonates commonly used to treat osteoporosis, Paget’s disease, multiple myeloma, hypercalcemia of malignancy and osteolytic lesions of cancer metastasis have been associated with bisphosphonate-associated jaw osteonecrosis (BJON). The underlying pathogenesis of BJON is unclear, but disproportionate bisphosphonate concentration in the jaw has been proposed as one potential etiological factor. This study tested the hypothesis that skeletal biodistribution of intravenous bisphosphonate is anatomic site-dependent in a rat model system. Materials and Methods Fluorescently labeled pamidronate was injected intravenously in athymic rats of equal weights followed by in vivo whole body fluorimetry, ex vivo optical imaging of oral, axial and appendicular bones and ethylenediaminetetraacetic acid bone decalcification to assess hydroxyapatite-bound bisphosphonate. Results Bisphosphonate uptake and bisphosphonate released per unit calcium were similar in oral and appendicular bones but lower than those in axial bones. Hydroxyapatite-bound bisphosphonate liberated by sequential acid decalcification was highest in oral relative to axial and appendicular bones (p < 0.05). Conclusions This study demonstrates regional differences in uptake and release of bisphosphonate from oral, axial and appendicular bones of immune deficient rats. PMID:21122034
Shefer, Sigal; Abelson, Avigdor; Mokady, Ofer; Geffen, Eli
2004-08-01
The biota of the eastern basin of the Mediterranean Sea has experienced dramatic changes in the last decades, in part as a result of the massive invasion of Red Sea species. The mechanism generally hypothesized for the 'Red-to-Med' invasion is that of natural dispersal through the Suez Canal. To date, however, this hypothesis has not been tested. This study examines the mode of invasion, using as a model the mussel Brachidontes pharaonis, an acclaimed 'Lessepsian migrant' that thrives along the eastern Mediterranean coast. Our findings reveal two distinct lineages of haplotypes, and five possible explanations for this observation are discussed. We show that the genetic exchange among the Mediterranean, the Gulf of Suez and the northern Red Sea is sufficiently large to counteract the build-up of sequential genetic structure. Nevertheless, these basins are rich in unique haplotypes of unknown origin. We propose that historic secondary contact, ongoing anthropogenic transport, or both processes drive the population dynamics of B. pharaonis in the Mediterranean and northern Red Sea. Copyright 2004 Blackwell Publishing Ltd
How temporal cues can aid colour constancy
Foster, David H.; Amano, Kinjiro; Nascimento, Sérgio M. C.
2007-01-01
Colour constancy assessed by asymmetric simultaneous colour matching usually reveals limited levels of performance in the unadapted eye. Yet observers can readily discriminate illuminant changes on a scene from changes in the spectral reflectances of the surfaces making up the scene. This ability is probably based on judgements of relational colour constancy, in turn based on the physical stability of spatial ratios of cone excitations under illuminant changes. Evidence is presented suggesting that the ability to detect violations in relational colour constancy depends on temporal transient cues. Because colour constancy and relational colour constancy are closely connected, it should be possible to improve estimates of colour constancy by introducing similar transient cues into the matching task. To test this hypothesis, an experiment was performed in which observers made surface-colour matches between patterns presented in the same position in an alternating sequence with period 2 s or, as a control, presented simultaneously, side-by-side. The degree of constancy was significantly higher for sequential presentation, reaching 87% for matches averaged over 20 observers. Temporal cues may offer a useful source of information for making colour-constancy judgements. PMID:17515948
Kaser, Daniel J
2017-03-01
Fellows in Reproductive Endocrinology and Infertility training are expected to complete 18 months of clinical, basic, or epidemiological research. The goal of this research is not only to provide the basis for the thesis section of the oral board exam but also to spark interest in reproductive medicine research and to provide the next generation of physician-scientists with a foundational experience in research design and implementation. Incoming fellows often have varying degrees of training in research methodology and, likewise, different career goals. Ideally, selection of a thesis topic and mentor should be geared toward defining an "answerable" question and building a practical skill set for future investigation. This contribution to the JARG Young Investigator's Forum revisits the steps of the scientific method through the lens of one recently graduated fellow and his project aimed to test the hypothesis that "sequential oxygen exposure (5% from days 1 to 3, then 2% from days 3 to 5) improves blastocyst yield and quality compared to continuous exposure to 5% oxygen among human preimplantation embryos."
Bayesian inference for psychology. Part II: Example applications with JASP.
Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D
2018-02-01
Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP ( http://www.jasp-stats.org ), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
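The ability to quantify evidence and update it as data come in, which the abstract highlights, can be illustrated with a toy example that is not JASP's or BayesFactor's implementation: the Bayes factor for a binomial test of H0: θ = 0.5 against H1: θ ~ Uniform(0, 1) has a closed form (the function name and example data below are ours).

```python
import math

def bayes_factor_01(k: int, n: int) -> float:
    """BF01 for H0: theta = 0.5 vs H1: theta ~ Uniform(0, 1),
    given k successes in n binomial trials.

    Closed form: 0.5**n * (n + 1) * C(n, k).
    BF01 > 1 favours the null; BF01 < 1 favours the alternative.
    """
    return 0.5 ** n * (n + 1) * math.comb(n, k)
```

For example, 5 heads in 10 tosses gives BF01 ≈ 2.71, modest evidence for the fair-coin null, while 10 heads in 10 gives BF01 ≈ 0.011, strong evidence against it; recomputing after every new trial is exactly the kind of evidence monitoring the abstract describes, and is legitimate in the Bayesian framework regardless of the intention with which data collection stops.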
Teaching Hypothesis Testing by Debunking a Demonstration of Telepathy.
ERIC Educational Resources Information Center
Bates, John A.
1991-01-01
Discusses a lesson designed to demonstrate hypothesis testing to introductory college psychology students. Explains that a psychology instructor demonstrated apparent psychic abilities to students. Reports that students attempted to explain the instructor's demonstrations through hypothesis testing and revision. Provides instructions on performing…
ERIC Educational Resources Information Center
Marinis, Theodoros; Saddy, Douglas
2013-01-01
Twenty-five monolingual (L1) children with specific language impairment (SLI), 32 sequential bilingual (L2) children, and 29 L1 controls completed the Test of Active & Passive Sentences-Revised (van der Lely 1996) and the Self-Paced Listening Task with Picture Verification for actives and passives (Marinis 2007). These revealed important…
ERIC Educational Resources Information Center
Conway, Christopher M.; Karpicke, Jennifer; Pisoni, David B.
2007-01-01
Spoken language consists of a complex, sequentially arrayed signal that contains patterns that can be described in terms of statistical relations among language units. Previous research has suggested that a domain-general ability to learn structured sequential patterns may underlie language acquisition. To test this prediction, we examined the…
The Development of Auditory Sequential Memory in Young Black and White Children.
ERIC Educational Resources Information Center
Hurley, Oliver L.; And Others
The question of whether Black children "peak" earlier than White children in auditory sequential memory (ASM) was investigated in 122 Black children and 120 White children in grades k-3 in two racially mixed schools in a large southern community. Each S was given the ASM subtest of the Illinois Test of Psycholinguistic Abilities. Results…
Examining Age-Related Movement Representations for Sequential (Fine-Motor) Finger Movements
ERIC Educational Resources Information Center
Gabbard, Carl; Cacola, Priscila; Bobbio, Tatiana
2011-01-01
Theory suggests that imagined and executed movement planning relies on internal models for action. Using a chronometry paradigm to compare the movement duration of imagined and executed movements, we tested children aged 7-11 years and adults on their ability to perform sequential finger movements. Underscoring this tactic was our desire to gain a…
Sequential Organization and Room Reverberation for Speech Segregation
2012-02-28
We have proposed two algorithms for sequential organization: an unsupervised clustering algorithm applicable to monaural recordings and a binaural algorithm that integrates monaural and binaural analyses. In addition, we have conducted speech intelligibility tests that firmly establish the...comprehensive version is currently under review for journal publication. A binaural approach in room reverberation: Most existing approaches to binaural or
ERIC Educational Resources Information Center
Fischer, Rico; Plessow, Franziska; Kunde, Wilfried; Kiesel, Andrea
2010-01-01
Interference effects are reduced after trials including response conflict. This sequential modulation has often been attributed to a top-down mediated adaptive control mechanism and/or to feature repetition mechanisms. In the present study we tested whether mechanisms responsible for such sequential modulations are subject to attentional…
ERIC Educational Resources Information Center
Ramaswamy, Ravishankar; Dix, Edward F.; Drew, Janet E.; Diamond, James J.; Inouye, Sharon K.; Roehl, Barbara J. O.
2011-01-01
Purpose of the Study: Delirium is a widespread concern for hospitalized seniors, yet is often unrecognized. A comprehensive and sequential intervention (CSI) aiming to effect change in clinician behavior by improving knowledge about delirium was tested. Design and Methods: A 2-day CSI program that consisted of progressive 4-part didactic series,…
Lash, Ayhan Aytekin; Plonczynski, Donna J; Sehdev, Amikar
2011-01-01
To compare the inclusion and the influences of selected variables on hypothesis testing during the 1980s and 1990s. In spite of the emphasis on conducting inquiry consistent with the tenets of logical positivism, there have been no studies investigating the frequency and patterns of hypothesis testing in nursing research. The sample was obtained from the journal Nursing Research, which was the research journal with the highest circulation during the period under study. All quantitative studies published during the two decades, including briefs and historical studies, were included in the analyses. A retrospective design was used to select the sample: five years each from the 1980s and 1990s were randomly selected from the journal Nursing Research. Of the 582 studies, 517 met inclusion criteria. Findings suggest that there was a decline in the use of hypothesis testing in the last decades of the 20th century. Further research is needed to identify the factors that influence the conduct of research with hypothesis testing. Hypothesis testing in nursing research showed a steady decline from the 1980s to the 1990s. Research purposes of explanation and prediction/control increased the likelihood of hypothesis testing. Hypothesis testing strengthens the quality of quantitative studies, increases the generality of findings and provides dependable knowledge. This is particularly true for quantitative studies that aim to explore, explain and predict/control phenomena and/or test theories. The findings also have implications for doctoral programmes, research preparation of nurse-investigators, and theory testing.
A shift from significance test to hypothesis test through power analysis in medical research.
Singh, G
2006-01-01
Medical research literature until recently exhibited a substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis-test approach, which considers the probabilities of both type I and type II error. Fisher's approach dichotomises results into significant or not significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and conclude in their own way. The advancement in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis-test procedure.
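A minimal sketch of the power calculation the abstract describes, for a two-sided one-sample z-test; the function name is ours, and real medical studies would typically use t-tests via statistical software rather than this simplification.

```python
from statistics import NormalDist

def z_test_power(effect_size: float, n: int, alpha: float = 0.05) -> float:
    """Power of a two-sided one-sample z-test in the Neyman-Pearson sense:
    P(reject H0 | H1 true), where effect_size is the true mean shift in
    standard-deviation units (Cohen's d) and n is the sample size."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)     # rejection threshold under H0
    shift = effect_size * n ** 0.5        # mean of the test statistic under H1
    # probability mass beyond either critical value under the shifted normal
    return (1 - z.cdf(z_crit - shift)) + z.cdf(-z_crit - shift)
```

At zero effect size the power equals alpha (the type I error rate), and it rises toward 1 as the effect size or sample size grows, which is why reporting a significance test together with its power conveys both error probabilities at once.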
2011-01-01
Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform relevant statistical testing on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses waiting for high throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases with semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQLs to reflect the semantic structures of the hypotheses, and (3) performing statistical tests with the result sets returned by the SPARQLs. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high throughput biological hypothesis testing. We believe that preliminary investigation before performing a highly controlled experiment can benefit from this approach. PMID:21342584
McCarthy, Kathleen M; Mahon, Merle; Rosen, Stuart; Evans, Bronwen G
2014-01-01
The majority of bilingual speech research has focused on simultaneous bilinguals. Yet, in immigrant communities, children are often initially exposed to their family language (L1), before becoming gradually immersed in the host country's language (L2). This is typically referred to as sequential bilingualism. Using a longitudinal design, this study explored the perception and production of the English voicing contrast in 55 children (40 Sylheti-English sequential bilinguals and 15 English monolinguals). Children were tested twice: when they were in nursery (52-month-olds) and 1 year later. Sequential bilinguals' perception and production of English plosives were initially driven by their experience with their L1, but after starting school, changed to match that of their monolingual peers. PMID:25123987
Bush, Terry; Lovejoy, Jennifer; Javitz, Harold; Torres, Alula Jimenez; Wassum, Ken; Tan, Marcia M; Spring, Bonnie
2018-05-31
Smoking cessation often results in weight gain, which discourages many smokers from quitting and can increase health risks. Treatments to reduce cessation-related weight gain have been tested in highly controlled trials of in-person treatment, but have never been tested in a real-world setting, which has inhibited dissemination. The Best Quit Study (BQS) is a replication and "real world" translation, using telephone delivery, of a prior in-person efficacy trial: a randomized controlled trial in a quitline setting. Eligible smokers (n = 2540) were randomized to the standard 5-call quitline intervention or quitline plus simultaneous or sequential weight management. Regression analyses tested the effectiveness of treatments on self-reported smoking abstinence and weight change at 6 and 12 months. Study enrollees were from 10 commercial employer groups and three state quitlines. Participants were between ages 18-72, 65.8% female, 68.2% white; 23.0% Medicaid-insured, and 76.3% overweight/obese. The follow-up response rate was lower in the simultaneous group than the control group at 6 months (p = 0.01). While a completers analysis of 30-day point prevalence abstinence detected no differences among groups at 6 or 12 months, multiply imputed abstinence showed quit rate differences at 6 months for: simultaneous (40.3%) vs. sequential (48.3%), p = 0.034, and simultaneous vs. control (44.9%), p = 0.043. At 12 months, multiply imputed abstinence was significantly lower for the simultaneous group (40.7%) vs. control (46.0%), p < 0.05, and vs. sequential (46.3%), p < 0.05. Weight gain at 6 and 12 months was minimal and not different among treatment groups. The sequential group completed fewer total calls (3.75) vs. control (4.16) and vs. the simultaneous group (3.83), p = 0.01, and fewer weight calls (0.94) than the simultaneous group (2.33), p < 0.0001. The number of calls completed predicted 30-day abstinence, p < 0.001, but not weight outcomes.
This study offers a model for evaluating population-level public health interventions conducted in partnership with tobacco quitlines. Simultaneous (vs. sequential) delivery of phone/web weight management with cessation treatment in the quitline setting may adversely affect quit rate. Neither a simultaneous nor sequential approach to addressing weight produced any benefit on suppressing weight gain. This study highlights the need and the challenges of testing intensive interventions in real-world settings. ClinicalTrials.gov Identifier: NCT01867983 . Registered: May 30, 2013.
Geomorphic controls of soil spatial complexity in a primeval mountain forest in the Czech Republic
NASA Astrophysics Data System (ADS)
Daněk, Pavel; Šamonil, Pavel; Phillips, Jonathan D.
2016-11-01
Soil diversity and complexity is influenced by a variety of factors, and much recent research has been focused on interpreting or modeling complexity based on soil-topography relationships, and effects of biogeomorphic processes. We aimed to (i) describe local soil diversity in one of the oldest forest reserves in Europe, (ii) employ existing graph theory concepts in pedocomplexity calculation and extend them by a novel approach based on hypothesis testing and an index measuring graph sequentiality (the extent to which soils have gradual vs. abrupt variations in underlying soil factors), and (iii) reveal the main sources of pedocomplexity, with a particular focus on geomorphic controls. A total of 954 soil profiles were described and classified to soil taxonomic units (STU) within a 46 ha area. We analyzed soil diversity using the Shannon index, and soil complexity using a novel graph theory approach. Pairwise tests of observed adjacencies, spectral radius and a newly proposed sequentiality index were used to describe and quantify the complexity of the spatial pattern of STUs. This was then decomposed into the contributions of three soil factor sequences (SFS), (i) degree of weathering and leaching processes, (ii) hydromorphology, and (iii) proportion of rock fragments. Six Reference Soil Groups and 37 second-level soil units were found. A significant portion of pedocomplexity occurred at distances shorter than the 22 m spacing of neighbouring soil profiles. The spectral radius (an index of complexity) of the pattern of soil spatial adjacency was 14.73, to which the individual SFS accounted for values of 2.0, 8.0 and 3.5, respectively. Significant sequentiality was found for degree of weathering and hydromorphology. Exceptional overall pedocomplexity was particularly caused by enormous spatial variability of soil wetness, representing a crucial soil factor sequence in the primeval forest. 
Moreover, the soil wetness gradient was partly spatially correlated with the gradient of soil weathering and leaching, suggesting synergistic influences of topography, climate, (hydro)geology and biomechanical and biochemical effects of individual trees. The pattern of stony soils, random in most respects, resulted probably from local geology and quaternary biogeomorphological processes. Thus, while geomorphology is the primary control over a very locally complex soil pattern, microtopography and local disturbances, mostly related to the effects of individual trees, are also critical. Considerable local pedodiversity seems to be an important component of the dynamics of old-growth mixed temperate mountain forests, with implications for decreasing pedodiversity in managed forests and deforested areas.
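The two quantitative tools named in the abstract, the Shannon diversity index and the spectral radius of a spatial-adjacency graph, can be sketched in a few lines; the toy adjacency matrix below is ours, not the paper's 37-unit soil graph, and the power-iteration routine is a generic method rather than the authors' exact calculation.

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum p_i * ln(p_i) over counts per
    soil taxonomic unit (or any other category)."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def spectral_radius(adj, iters=200):
    """Largest eigenvalue of a symmetric non-negative adjacency matrix
    (a graph-complexity index), estimated by power iteration.
    adj is a list-of-lists 0/1 matrix of spatial adjacencies."""
    n = len(adj)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        if lam == 0.0:          # zero matrix: spectral radius is 0
            return 0.0
        v = [x / lam for x in w]
    return lam
```

For instance, four equally common soil units give H' = ln 4 ≈ 1.39, and a fully connected triangle of adjacencies has spectral radius 2; denser, more tangled adjacency graphs (like the paper's value of 14.73) yield larger radii, which is why the index serves as a complexity measure.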
Aldi, Silvia; Takano, Ken-ichi; Tomita, Kengo; Koda, Kenichiro; Chan, Noel Y.-K.; Marino, Alice; Salazar-Rodriguez, Mariselis; Thurmond, Robin L.
2014-01-01
Renin released by ischemia/reperfusion (I/R) from cardiac mast cells (MCs) activates a local renin-angiotensin system (RAS) causing arrhythmic dysfunction. Ischemic preconditioning (IPC) inhibits MC renin release and consequent activation of this local RAS. We postulated that MC histamine H4-receptors (H4Rs), being Gαi/o-coupled, might activate a protein kinase C isotype–ε (PKCε)–aldehyde dehydrogenase type-2 (ALDH2) cascade, ultimately eliminating MC-degranulating and renin-releasing effects of aldehydes formed in I/R and associated arrhythmias. We tested this hypothesis in ex vivo hearts, human mastocytoma cells, and bone marrow–derived MCs from wild-type and H4R knockout mice. We found that activation of MC H4Rs mimics the cardioprotective anti-RAS effects of IPC and that protection depends on the sequential activation of PKCε and ALDH2 in MCs, reducing aldehyde-induced MC degranulation and renin release and alleviating reperfusion arrhythmias. These cardioprotective effects are mimicked by selective H4R agonists and disappear when H4Rs are pharmacologically blocked or genetically deleted. Our results uncover a novel cardioprotective pathway in I/R, whereby activation of H4Rs on the MC membrane, possibly by MC-derived histamine, leads sequentially to PKCε and ALDH2 activation, reduction of toxic aldehyde-induced MC renin release, prevention of RAS activation, reduction of norepinephrine release, and ultimately to alleviation of reperfusion arrhythmias. This newly discovered protective pathway suggests that MC H4Rs may represent a new pharmacologic and therapeutic target for the direct alleviation of RAS-induced cardiac dysfunctions, including ischemic heart disease and congestive heart failure. PMID:24696042
Hormone therapy in postmenopausal women affects hemispheric asymmetries in fine motor coordination.
Bayer, Ulrike; Hausmann, Markus
2010-08-01
Evidence exists that the functional differences between the left and right cerebral hemispheres are affected by age. One prominent hypothesis proposes that frontal activity during cognitive task performance tends to be less lateralized in older than in younger adults, a pattern that has also been reported for motor functioning. Moreover, functional cerebral asymmetries (FCAs) have been shown to be affected by sex hormonal manipulations via hormone therapy (HT) in older women. Here, we investigate whether FCAs in fine motor coordination, as reflected by manual asymmetries (MAs), are susceptible to HT in older women. Therefore, sixty-two postmenopausal women were tested: those receiving hormone therapy with estrogen (E) alone (n=15) or an E-gestagen combination (n=21), and those without HT (control group, n=26). Saliva levels of free estradiol and progesterone (P) were analyzed using chemiluminescence assays. MAs were measured with a finger tapping paradigm consisting of two different tapping conditions. As expected, postmenopausal controls without HT showed reduced MAs in simple (repetitive) finger tapping. In a more demanding sequential condition involving four fingers, however, they revealed enhanced MAs in favour of the dominant hand. This finding suggests an insufficient recruitment of critical motor brain areas (especially when the nondominant hand is used), probably as a result of age-related changes in corticocortical connectivity between motor areas. In contrast, both HT groups revealed reduced MAs in sequential finger tapping but an asymmetrical tapping performance related to estradiol levels in simple finger tapping. A similar pattern has previously been found in younger participants. The results suggest that HT, and E exposure in particular, exerts positive effects on the motor system, thereby counteracting an age-related reorganization. Copyright 2010 Elsevier Inc. All rights reserved.
Vojtechova, Iveta; Petrasek, Tomas; Hatalova, Hana; Pistikova, Adela; Vales, Karel; Stuchlik, Ales
2016-05-15
The prevention of engram interference, pattern separation, flexibility, cognitive coordination and spatial navigation are usually studied separately at the behavioral level. Impairment in executive functions is often observed in patients suffering from schizophrenia. We have designed a protocol for assessing these functions all together as behavioral separation. This protocol is based on alternated or sequential training in two tasks testing different hippocampal functions (the Morris water maze and active place avoidance), and alternated or sequential training in two similar environments of the active place avoidance task. In Experiment 1, we tested, in adult rats, whether the performance in two different spatial tasks was affected by their order in sequential learning, or by their day-to-day alternation. In Experiment 2, rats learned to solve the active place avoidance task in two environments either alternately or sequentially. We found that rats are able to acquire both tasks and to discriminate both similar contexts without obvious problems regardless of the order or the alternation. We used two groups of rats, controls and a rat model of psychosis induced by a subchronic intraperitoneal application of 0.08 mg/kg of dizocilpine (MK-801), a non-competitive antagonist of NMDA receptors. Dizocilpine had no selective effect on parallel/sequential learning of tasks/contexts. However, it caused hyperlocomotion and a significant deficit in learning in the active place avoidance task regardless of the task alternation. Cognitive coordination tested by this task is probably more sensitive to dizocilpine than spatial orientation because no hyperactivity or learning impairment was observed in the Morris water maze. Copyright © 2016 Elsevier B.V. All rights reserved.
Irregularity, volatility, risk, and financial market time series
Pincus, Steve; Kalman, Rudolf E.
2004-01-01
The need to assess subtle, potentially exploitable changes in serial structure is paramount in the analysis of financial data. Herein, we demonstrate the utility of approximate entropy (ApEn), a model-independent measure of sequential irregularity, toward this goal, by several distinct applications. We consider both empirical data and models, including composite indices (Standard and Poor's 500 and Hang Seng), individual stock prices, the random-walk hypothesis, and the Black–Scholes and fractional Brownian motion models. Notably, ApEn appears to be a potentially useful marker of system stability, with rapid increases possibly foreshadowing significant changes in a financial variable. PMID:15358860
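Approximate entropy, the measure the abstract applies to financial series, has a standard published definition; below is a compact pure-Python sketch (O(n²) in the series length; the default embedding dimension m = 2 and a fixed tolerance r are common conventions, not necessarily the parameters the authors used).

```python
import math

def approximate_entropy(series, m=2, r=0.2):
    """Approximate entropy (ApEn) of a scalar time series.

    m is the embedding (template) length and r the matching tolerance,
    often chosen as 0.2 times the series' standard deviation. Higher
    ApEn indicates a less regular, harder-to-predict sequence.
    """
    n = len(series)

    def phi(m):
        # all overlapping length-m templates
        templates = [series[i:i + m] for i in range(n - m + 1)]
        log_counts = []
        for a in templates:
            # Chebyshev distance: two templates match if every
            # element pair is within tolerance r
            c = sum(1 for b in templates
                    if max(abs(x - y) for x, y in zip(a, b)) <= r)
            log_counts.append(math.log(c / len(templates)))
        return sum(log_counts) / len(templates)

    return phi(m) - phi(m + 1)
```

A constant series scores 0, a strictly alternating series scores near 0, and an irregular (e.g. random) series scores well above both, which is the sense in which rapid increases in ApEn can flag a loss of serial structure in a financial variable.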
Parallel effects of memory set activation and search on timing and working memory capacity.
Schweickert, Richard; Fortin, Claudette; Xi, Zhuangzhuang; Viau-Quesnel, Charles
2014-01-01
Accurately estimating a time interval is required in everyday activities such as driving or cooking. Estimating time is relatively easy, provided a person attends to it. But a brief shift of attention to another task usually interferes with timing. Most processes carried out concurrently with timing interfere with it. Curiously, some do not. Literature on a few processes suggests a general proposition, the Timing and Complex-Span Hypothesis: A process interferes with concurrent timing if and only if process performance is related to complex span. Complex-span is the number of items correctly recalled in order, when each item presented for study is followed by a brief activity. Literature on task switching, visual search, memory search, word generation and mental time travel supports the hypothesis. Previous work found that another process, activation of a memory set in long term memory, is not related to complex-span. If the Timing and Complex-Span Hypothesis is true, activation should not interfere with concurrent timing in dual-task conditions. We tested such activation in single-task memory search task conditions and in dual-task conditions where memory search was executed with concurrent timing. In Experiment 1, activating a memory set increased reaction time, with no significant effect on time production. In Experiment 2, set size and memory set activation were manipulated. Activation and set size had a puzzling interaction for time productions, perhaps due to difficult conditions, leading us to use a related but easier task in Experiment 3. In Experiment 3 increasing set size lengthened time production, but memory activation had no significant effect. Results here and in previous literature on the whole support the Timing and Complex-Span Hypotheses. Results also support a sequential organization of activation and search of memory. 
This organization predicts that activation and set size have additive effects on reaction time and multiplicative effects on percent correct, which was found.
Knowledge dimensions in hypothesis test problems
NASA Astrophysics Data System (ADS)
Krishnan, Saras; Idris, Noraini
2012-05-01
The reformation in statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on formulas and calculation procedures. Meanwhile, conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework to describe learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from more connected understanding. This study identifies the factual, procedural and conceptual knowledge dimensions in hypothesis test problems. The hypothesis test, being an important tool for making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty in understanding the underlying concepts of the hypothesis test. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale for executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts, such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems in this study, suitable instructional and assessment strategies can be developed in future to enhance students' learning of the hypothesis test as a valuable inferential tool.
Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dale; Selby, Neil
2012-08-14
Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with the physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope ({beta} = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
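The two combination rules named above have simple closed forms when the per-phenomenology p-values are independent (an assumption a multi-phenomenology screen must justify); this sketch is a generic illustration, not the authors' screening procedure.

```python
import math

def fisher_combined_p(pvalues):
    """Fisher's method: T = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under H0. Because the df
    is even, the survival function has the closed form
    exp(-x/2) * sum_{i<k} (x/2)**i / i!."""
    k = len(pvalues)
    half = -sum(math.log(p) for p in pvalues)   # T / 2
    return math.exp(-half) * sum(half ** i / math.factorial(i)
                                 for i in range(k))

def tippett_combined_p(pvalues):
    """Tippett's method: base the decision on the smallest p-value.
    Under independence, P(min P_i <= p_min) = 1 - (1 - p_min)**k."""
    k = len(pvalues)
    return 1.0 - (1.0 - min(pvalues)) ** k
```

For two tests each with p = 0.05, Fisher's method gives a combined p of about 0.017 while Tippett's gives about 0.098, illustrating that Fisher's rule rewards consistent moderate evidence across phenomenologies whereas Tippett's is driven by the single strongest one.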
Wakamiya, Eiji; Okumura, Tomohito; Nakanishi, Makoto; Takeshita, Takashi; Mizuta, Mekumi; Kurimoto, Naoko; Tamai, Hiroshi
2011-06-01
To clarify whether rapid naming ability itself is a main underpinning factor of rapid automatized naming tests (RAN) and how deep an influence the discrete decoding process has on reading, we performed discrete naming tasks and discrete hiragana reading tasks as well as sequential naming tasks and sequential hiragana reading tasks with 38 Japanese schoolchildren with reading difficulty. There were high correlations between both discrete and sequential hiragana reading and sentence reading, suggesting that some mechanism which automatizes hiragana reading makes sentence reading fluent. In object and color tasks, there were moderate correlations between sentence reading and sequential naming, and between sequential naming and discrete naming. But no correlation was found between reading tasks and discrete naming tasks. The influence of rapid naming ability of objects and colors upon reading seemed relatively small, and multi-item processing may work in relation to these. In contrast, in the digit naming task there was moderate correlation between sentence reading and discrete naming, while no correlation was seen between sequential naming and discrete naming. There was moderate correlation between reading tasks and sequential digit naming tasks. Digit rapid naming ability has more direct effect on reading while its effect on RAN is relatively limited. The ratio of how rapid naming ability influences RAN and reading seems to vary according to kind of the stimuli used. An assumption about components in RAN which influence reading is discussed in the context of both sequential processing and discrete naming speed. Copyright © 2010 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
Tokunaga, Hironobu; Ando, Hirotaka; Obika, Mikako; Miyoshi, Tomoko; Tokuda, Yasuharu; Bautista, Miho; Kataoka, Hitomi; Terasawa, Hidekazu
2014-01-01
Objectives We report the preliminary development of a unique Web-based instrument for assessing and teaching knowledge and developing clinical thinking called the “Sequential Questions and Answers” (SQA) test. Included in this feasibility report are physicians’ answers to the Sequential Questions and Answers pre- and posttests and their brief questionnaire replies. Methods The authors refined the SQA test case scenario for content, ease of modifications of case scenarios, test uploading and answer retrieval. Eleven geographically distant physicians evaluated the SQA test, taking the pretest and posttest within two weeks. These physicians completed a brief questionnaire about the SQA test. Results Eleven physicians completed the SQA pre- and posttest; all answers were downloaded for analysis. They reported the ease of website login and navigating within the test module together with many helpful suggestions. Their average posttest score gain was 53% (p=0.012). Conclusions We report the successful launch of a unique Web-based instrument referred to as the Sequential Questions and Answers test. This distinctive test combines teaching organization of the clinical narrative into an assessment tool that promotes acquiring medical knowledge and clinical thinking. We successfully demonstrated the feasibility of geographically distant physicians to access the SQA instrument. The physicians’ helpful suggestions will be added to future SQA test versions. Medical schools might explore the integration of this multi-language-capable SQA assessment and teaching instrument into their undergraduate medical curriculum. PMID:25341203
Kania, Dramane; Sangaré, Lassana; Sakandé, Jean; Koanda, Abdoulaye; Nébié, Yacouba Kompingnin; Zerbo, Oumarou; Combasséré, Alain Wilfried; Guissou, Innocent Pierre; Rouet, François
2009-10-01
In Africa, where blood-borne agents are highly prevalent, cheaper and feasible alternative strategies for testing blood donations are particularly needed. From May to August 2002, 500 blood donations from Burkina Faso were tested for hepatitis B surface antigen (HBsAg), human immunodeficiency virus (HIV), syphilis, and hepatitis C virus (HCV) according to two distinct strategies. The first strategy was conventional simultaneous screening of these four blood-borne infectious agents on each blood donation using single-marker assays. The second strategy was sequential screening starting with HBsAg. HBsAg-nonreactive blood donations were then further tested for HIV. If nonreactive, they were further tested for syphilis. If nonreactive, they were finally assessed for HCV antibodies. The accuracy and cost-effectiveness of the two strategies were compared. Using the simultaneous strategy, the seroprevalences of HBsAg, HIV, syphilis, and HCV among blood donors in Ouagadougou were estimated to be 19.2, 9.8, 1.6, and 5.2%. No significant difference in HIV, syphilis, and HCV prevalence rates was observed using the sequential strategy (9.2, 1.9, and 4.7%, respectively). Whatever the strategy used, 157 blood donations (31.4%) were found to be reactive for at least one transfusion-transmissible agent and were thus discarded. The sequential strategy allowed a cost decrease of euro 908.6 compared to the simultaneous strategy. Given that there are approximately 50,000 blood donations annually in Burkina Faso, the potential savings reach euro 90,860. In resource-limited settings, the implementation of a sequential strategy appears to be a pragmatic solution to promote a safe blood supply and ensure sustainability of the system.
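The sequential screening strategy described above amounts to an ordered, short-circuiting test battery. A minimal sketch in Python (the assay ordering follows the abstract; the per-assay costs and the example donation are hypothetical placeholders, not figures from the study):

```python
def screen_donation(donation, assays):
    """Sequentially test a donation; stop at the first reactive marker.

    `donation` maps marker name -> True (reactive) / False (nonreactive).
    `assays` is an ordered list of (marker, cost) pairs.
    Returns (verdict, total_cost): verdict is 'discard' or 'accept'.
    """
    total_cost = 0.0
    for marker, cost in assays:
        total_cost += cost
        if donation[marker]:           # reactive: discard, skip remaining assays
            return "discard", total_cost
    return "accept", total_cost        # nonreactive on all markers

# Hypothetical unit costs, ordered by marker prevalence as in the study:
# HBsAg first, then HIV, syphilis, HCV.
ASSAYS = [("HBsAg", 1.0), ("HIV", 1.0), ("syphilis", 1.0), ("HCV", 1.0)]

donation = {"HBsAg": True, "HIV": False, "syphilis": False, "HCV": False}
verdict, cost = screen_donation(donation, ASSAYS)
```

Because reactive donations (most often HBsAg-reactive, the most prevalent marker here) skip the remaining assays, the expected number of assays per donation drops below four; this is the source of the reported cost savings relative to the simultaneous strategy, which always incurs all four.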
Guerrero-Ramos, Alvaro; Patel, Mauli; Kadakia, Kinjal; Haque, Tanzina
2014-06-01
The Architect EBV antibody panel is a new chemiluminescence immunoassay system used to determine the stage of Epstein-Barr virus (EBV) infection based on the detection of IgM and IgG antibodies to viral capsid antigen (VCA) and IgG antibodies against Epstein-Barr nuclear antigen 1 (EBNA-1). We evaluated its diagnostic accuracy in immunocompetent adolescents and young adults with clinical suspicion of infectious mononucleosis (IM) using the RecomLine EBV IgM and IgG immunoblots as the reference standard. In addition, the use of the antibody panel in a sequential testing algorithm based on initial EBNA-1 IgG analysis was assessed for cost-effectiveness. Finally, we investigated the degree of cross-reactivity of the VCA IgM marker during other primary viral infections that may present with an EBV IM-like picture. High sensitivity (98.3% [95% confidence interval {CI}, 90.7 to 99.7%]) and specificity (94.2% [95% CI, 87.9 to 97.8%]) were found after testing 162 precharacterized archived serum samples. There was perfect agreement between the use of the antibody panel in sequential and parallel testing algorithms, but substantial cost savings (23%) were obtained with the sequential strategy. A high rate of reactive VCA IgM results was found in primary cytomegalovirus (CMV) infections (60.7%). In summary, the Architect EBV antibody panel performs satisfactorily in the investigation of EBV IM in immunocompetent adolescents and young adults, and the application of an EBNA-1 IgG-based sequential testing algorithm is cost-effective in this diagnostic setting. Concomitant testing for CMV is strongly recommended to aid in the interpretation of EBV serological patterns. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoupin, Stanislav, E-mail: sstoupin@aps.anl.gov; Shvyd’ko, Yuri; Trakhtenberg, Emil
2016-07-27
We report progress on the implementation and commissioning of sequential X-ray diffraction topography at the 1-BM Optics Testing Beamline of the Advanced Photon Source to accommodate the growing needs of strain characterization in diffractive crystal optics and other semiconductor single crystals. The setup enables evaluation of strain in single crystals in the nearly-nondispersive double-crystal geometry. Si asymmetric collimator crystals of different crystallographic orientations were designed, fabricated, and characterized using in-house capabilities. Imaging the exit beam using digital area detectors permits rapid sequential acquisition of X-ray topographs at different angular positions on the rocking curve of a crystal under investigation. Results on sensitivity and spatial resolution are reported based on experiments with high-quality Si and diamond crystals. The new setup complements the laboratory-based X-ray topography capabilities of the Optics group at the Advanced Photon Source.
ON THE SUBJECT OF HYPOTHESIS TESTING
Ugoni, Antony
1993-01-01
In this paper, the definition of a statistical hypothesis is discussed, along with the considerations that need to be addressed when testing a hypothesis. In particular, the p-value, significance level, and power of a test are reviewed. Finally, the often-quoted confidence interval is given a brief introduction. PMID:17989768
Some consequences of using the Horsfall-Barratt scale for hypothesis testing
USDA-ARS?s Scientific Manuscript database
Comparing treatment effects by hypothesis testing is a common practice in plant pathology. Nearest percent estimates (NPEs) of disease severity were compared to Horsfall-Barratt (H-B) scale data to explore whether there was an effect of assessment method on hypothesis testing. A simulation model ba...
Hypothesis Testing in Task-Based Interaction
ERIC Educational Resources Information Center
Choi, Yujeong; Kilpatrick, Cynthia
2014-01-01
Whereas studies show that comprehensible output facilitates L2 learning, hypothesis testing has received little attention in Second Language Acquisition (SLA). Following Shehadeh (2003), we focus on hypothesis testing episodes (HTEs) in which learners initiate repair of their own speech in interaction. In the context of a one-way information gap…
Classroom-Based Strategies to Incorporate Hypothesis Testing in Functional Behavior Assessments
ERIC Educational Resources Information Center
Lloyd, Blair P.; Weaver, Emily S.; Staubitz, Johanna L.
2017-01-01
When results of descriptive functional behavior assessments are unclear, hypothesis testing can help school teams understand how the classroom environment affects a student's challenging behavior. This article describes two hypothesis testing strategies that can be used in classroom settings: structural analysis and functional analysis. For each…
Hypothesis Testing in the Real World
ERIC Educational Resources Information Center
Miller, Jeff
2017-01-01
Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…
ERIC Educational Resources Information Center
Sullins, Walter L.
Five-hundred dichotomously scored response patterns were generated with sequentially independent (SI) items and 500 with dependent (SD) items for each of thirty-six combinations of sampling parameters (i.e., three test lengths, three sample sizes, and four item difficulty distributions). KR-20, KR-21, and Split-Half (S-H) reliabilities were…
Sequential cryogen spraying for heat flux control at the skin surface
NASA Astrophysics Data System (ADS)
Majaron, Boris; Aguilar, Guillermo; Basinger, Brooke; Randeberg, Lise L.; Svaasand, Lars O.; Lavernia, Enrique J.; Nelson, J. Stuart
2001-05-01
Heat transfer rate at the skin-air interface is of critical importance for the benefits of cryogen spray cooling in combination with laser therapy of shallow subsurface skin lesions, such as port-wine stain birthmarks. With some cryogen spray devices, a layer of liquid cryogen builds up on the skin surface during the spurt, which may impair heat transfer across the skin surface due to the relatively low thermal conductivity and potentially higher temperature of the liquid cryogen layer as compared to the spray droplets. While the mass flux of cryogen delivery can be adjusted by varying the atomizing nozzle geometry, this may strongly affect other spray properties, such as lateral spread (cone), droplet size, velocity, and temperature distribution. We present here the first experiments with sequential cryogen spraying, which may enable accurate mass flux control through variation of the spray duty cycle while minimally affecting other spray characteristics. The observed increase of cooling rate and efficiency at moderate duty cycle levels supports the above-described hypothesis of an insulating liquid layer, and demonstrates a novel approach to the optimization of cryogen spray devices for individual laser dermatological applications.
2018-01-01
ABSTRACT Long-germ insects, such as the fruit fly Drosophila melanogaster, pattern their segments simultaneously, whereas short-germ insects, such as the beetle Tribolium castaneum, pattern their segments sequentially, from anterior to posterior. Although the two modes of segmentation at first appear quite distinct, much of this difference might simply reflect developmental heterochrony. We now show here that, in both Drosophila and Tribolium, segment patterning occurs within a common framework of sequential Caudal, Dichaete and Odd-paired expression. In Drosophila, these transcription factors are expressed like simple timers within the blastoderm, whereas in Tribolium they form wavefronts that sweep from anterior to posterior across the germband. In Drosophila, all three are known to regulate pair-rule gene expression and influence the temporal progression of segmentation. We propose that these regulatory roles are conserved in short-germ embryos, and that therefore the changing expression profiles of these genes across insects provide a mechanistic explanation for observed differences in the timing of segmentation. In support of this hypothesis, we demonstrate that Odd-paired is essential for segmentation in Tribolium, contrary to previous reports. PMID:29724758
Spacecraft Data Simulator for the test of level zero processing systems
NASA Technical Reports Server (NTRS)
Shi, Jeff; Gordon, Julie; Mirchandani, Chandru; Nguyen, Diem
1994-01-01
The Microelectronic Systems Branch (MSB) at Goddard Space Flight Center (GSFC) has developed a Spacecraft Data Simulator (SDS) to support the development, test, and verification of prototype and production Level Zero Processing (LZP) systems. Based on a disk array system, the SDS is capable of generating large test data sets of up to 5 gigabytes and outputting serial test data at rates of up to 80 Mbps. The SDS supports data formats including NASA Communication (Nascom) blocks, Consultative Committee for Space Data Systems (CCSDS) Version 1 and 2 frames and packets, and all the Advanced Orbiting Systems (AOS) services. The capability to simulate both sequential and non-sequential time-ordered downlink data streams with errors and gaps is crucial for testing LZP systems. This paper describes the system architecture, hardware and software designs, and test data designs. Examples of test data designs are included to illustrate the application of the SDS.
Vasconcelos, Karla Anacleto de; Frota, Silvana Maria Monte Coelho; Ruffino-Netto, Antonio; Kritski, Afrânio Lineu
2018-04-01
To investigate early detection of amikacin-induced ototoxicity in a population treated for multidrug-resistant tuberculosis (MDR-TB), by means of three different tests: pure-tone audiometry (PTA); high-frequency audiometry (HFA); and distortion-product otoacoustic emission (DPOAE) testing. This was a longitudinal prospective cohort study involving patients aged 18-69 years with a diagnosis of MDR-TB who had to receive amikacin for six months as part of their antituberculosis drug regimen for the first time. Hearing was assessed before treatment initiation and at two and six months after treatment initiation. Sequential statistics were used to analyze the results. We included 61 patients, but the final population consisted of 10 patients (7 men and 3 women) because of sequential analysis. Comparison of the test results obtained at two and six months after treatment initiation with those obtained at baseline revealed that HFA at two months and PTA at six months detected hearing threshold shifts consistent with ototoxicity. However, DPOAE testing did not detect such shifts. The statistical method used in this study makes it possible to conclude that, over the six-month period, amikacin-associated hearing threshold shifts were detected by HFA and PTA, and that DPOAE testing was not efficient in detecting such shifts.
Simultaneous Versus Sequential Presentation in Testing Recognition Memory for Faces.
Finley, Jason R; Roediger, Henry L; Hughes, Andrea D; Wahlheim, Christopher N; Jacoby, Larry L
2015-01-01
Three experiments examined the issue of whether faces could be better recognized in a simultaneous test format (2-alternative forced choice [2AFC]) or a sequential test format (yes-no). All experiments showed that when target faces were present in the test, the simultaneous procedure led to superior performance (area under the ROC curve), whether lures were high or low in similarity to the targets. However, when a target-absent condition was used in which no lures resembled the targets but the lures were similar to each other, the simultaneous procedure yielded higher false alarm rates (Experiments 2 and 3) and worse overall performance (Experiment 3). This pattern persisted even when we excluded responses that participants opted to withhold rather than volunteer. We conclude that for the basic recognition procedures used in these experiments, simultaneous presentation of alternatives (2AFC) generally leads to better discriminability than does sequential presentation (yes-no) when a target is among the alternatives. However, our results also show that the opposite can occur when there is no target among the alternatives. An important future step is to see whether these patterns extend to more realistic eyewitness lineup procedures. The pictures used in the experiment are available online at http://www.press.uillinois.edu/journals/ajp/media/testing_recognition/.
Tsang, William W. N.; Gao, Kelly L.; Chan, K. M.; Purves, Sheila; Macfarlane, Duncan J.; Fong, Shirley S. M.
2015-01-01
Objective. To investigate the effects of sitting Tai Chi on muscle strength, balance control, and quality of life (QOL) among survivors with spinal cord injuries (SCI). Methods. Eleven SCI survivors participated in the sitting Tai Chi training (90 minutes/session, 2 times/week for 12 weeks) and eight SCI survivors acted as controls. Dynamic sitting balance was evaluated using a limits-of-stability test and a sequential weight-shifting test in sitting. Handgrip strength was also tested using a hand-held dynamometer. QOL was measured using the World Health Organization's Quality of Life Scale. Results. Tai Chi practitioners achieved significant improvements in their reaction time (P = 0.042), maximum excursion (P = 0.016), and directional control (P = 0.025) in the limits-of-stability test after training. In the sequential weight-shifting test, they significantly improved their total time to sequentially hit the 12 targets (P = 0.035). Significant improvement in handgrip strength was also found among the Tai Chi practitioners (P = 0.049). However, no significant within- or between-group differences were found in the QOL outcomes (P > 0.05). Conclusions. Twelve weeks of sitting Tai Chi training could improve the dynamic sitting balance and handgrip strength, but not QOL, of the SCI survivors. PMID:25688276
ERIC Educational Resources Information Center
Kwon, Yong-Ju; Jeong, Jin-Su; Park, Yun-Bok
2006-01-01
The purpose of the present study was to test the hypothesis that student's abductive reasoning skills play an important role in the generation of hypotheses on pendulum motion tasks. To test the hypothesis, a hypothesis-generating test on pendulum motion, and a prior-belief test about pendulum motion were developed and administered to a sample of…
Potential for leaching of arsenic from excavated rock after different drying treatments.
Li, Jining; Kosugi, Tomoya; Riya, Shohei; Hashimoto, Yohey; Hou, Hong; Terada, Akihiko; Hosomi, Masaaki
2016-07-01
Leaching of arsenic (As) from excavated rock subjected to different drying methods is compared using sequential leaching tests and rapid small-scale column tests combined with a sequential extraction procedure. Although the total As content in the rock was low (8.81 mg/kg), its resulting concentration in the leachate when leached at a liquid-to-solid ratio of 10 L/kg exceeded the environmental standard (10 μg/L). As existed mainly in dissolved forms in the leachates. All of the drying procedures applied in this study increased the leaching of As, with freeze-drying leading to the largest increase. Water extraction of As using the two tests showed different leaching behaviors as a function of the liquid-to-solid ratio, and achieved average extractions of up to 35.7% and 25.8% total As, respectively. Dissolution of As from the mineral surfaces and subsequent re-adsorption controlled the short-term release of As; dissolution of Fe, Al, and dissolved organic carbon played important roles in long-term As leaching. Results of the sequential extraction procedure showed that use of 0.05 M (NH4)2SO4 underestimates the readily soluble As. Long-term water extraction removed almost all of the non-specifically sorbed As and most of the specifically sorbed As. The concept of pollution potential indices, which are easily determined by the sequential leaching test, is proposed in this study and is considered for possible use in assessing efficacy of treatment of excavated rocks. Copyright © 2016 Elsevier Ltd. All rights reserved.
Acute Oral Toxicity Up-And-Down-Procedure
The Up-and-Down Procedure is an alternative acute toxicity test that provides a way to determine the toxicity of chemicals with fewer test animals by using sequential dosing steps. Find out about this test procedure.
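The core idea of up-and-down sequential dosing can be sketched as follows. This is a simplified illustration, not the full regulatory procedure (which adds dose-progression factors and formal stopping rules), and the dose levels and toxicity threshold are hypothetical:

```python
def up_and_down(doses, is_toxic, start_index, n_animals):
    """Simplified up-and-down sequential dosing.

    `doses` is an ordered list of dose levels; `is_toxic(dose)` reports
    whether the animal dosed at that level shows toxicity. After a toxic
    response the next animal receives the next lower dose; after no
    response, the next higher dose. Returns the doses administered.
    """
    i = start_index
    administered = []
    for _ in range(n_animals):
        dose = doses[i]
        administered.append(dose)
        if is_toxic(dose):
            i = max(i - 1, 0)                  # step down after toxicity
        else:
            i = min(i + 1, len(doses) - 1)     # step up after no effect
    return administered

# Hypothetical example: toxicity first appears at 300 (arbitrary units).
sequence = up_and_down([100, 200, 300, 400], lambda d: d >= 300, 0, 5)
```

The dosing sequence oscillates around the toxicity threshold, which is why the design can locate it with far fewer animals than dosing groups at every level.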
Spiegelhalter, David; Grigg, Olivia; Kinsman, Robin; Treasure, Tom
2003-02-01
To investigate the use of the risk-adjusted sequential probability ratio test in monitoring the cumulative occurrence of adverse clinical outcomes. Retrospective analysis of three longitudinal datasets. Patients aged 65 years and over under the care of Harold Shipman between 1979 and 1997, patients under 1 year of age undergoing paediatric heart surgery in Bristol Royal Infirmary between 1984 and 1995, and adult patients receiving cardiac surgery from a team of cardiac surgeons in London, UK. Annual and 30-day mortality rates. Using reasonable boundaries, the procedure could have indicated an 'alarm' in Bristol after publication of the 1991 Cardiac Surgical Register, and in 1985 or 1997 for Harold Shipman depending on the data source and the comparator. The cardiac surgeons showed no significant deviation from expected performance. The risk-adjusted sequential probability ratio test is simple to implement, can be applied in a variety of contexts, and might have been useful to detect specific instances of past divergent performance. The use of this and related techniques deserves further attention in the context of prospectively monitoring adverse clinical outcomes.
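A minimal sketch of a risk-adjusted SPRT of the kind described above, under the common parametrisation in which the alternative hypothesis inflates each patient's predicted odds of death by a fixed odds ratio. The error rates, odds ratio, and outcome stream below are illustrative assumptions, not values from the paper:

```python
import math

def risk_adjusted_sprt(outcomes, alpha=0.01, beta=0.01, odds_ratio=2.0):
    """Cumulative log-likelihood ratio test over (risk, died) pairs.

    H0: each patient dies with their risk-adjusted probability p.
    H1: the odds of death are inflated by `odds_ratio`.
    Returns ('alarm' | 'accept' | 'continue', final LLR).
    """
    upper = math.log((1 - beta) / alpha)   # crossing -> alarm (H1)
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0
    llr = 0.0
    for p, died in outcomes:
        denom = 1 - p + odds_ratio * p     # normalising term for this patient
        llr += math.log(odds_ratio / denom) if died else math.log(1 / denom)
        if llr >= upper:
            return "alarm", llr
        if llr <= lower:
            return "accept", llr
    return "continue", llr

# Illustrative stream: 30 consecutive deaths among patients each
# carrying a 10% predicted mortality risk.
state, llr = risk_adjusted_sprt([(0.10, True)] * 30)
```

With 10% predicted mortality and an odds ratio of 2, each observed death adds log(2/1.1) ≈ 0.60 to the statistic, so a run of unexpected deaths crosses the alarm boundary log((1−β)/α) quickly, while a stream of survivors drifts only slowly toward the acceptance boundary.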
A sequential adaptation technique and its application to the Mark 12 IFF system
NASA Astrophysics Data System (ADS)
Bailey, John S.; Mallett, John D.; Sheppard, Duane J.; Warner, F. Neal; Adams, Robert
1986-07-01
Sequential adaptation uses only two sets of receivers, correlators, and A/D converters, which are time-multiplexed to effect spatial adaptation in a system with N adaptive degrees of freedom. This technique can substantially reduce the hardware cost over what is realizable in a parallel architecture. A three-channel L-band version of the sequential adapter was built and tested for use with the MARK XII IFF (Identification Friend or Foe) system. In this system the sequentially determined adaptive weights were obtained digitally but implemented at RF. As a result, many of the post-RF hardware-induced sources of error that normally limit cancellation, such as receiver mismatch, are removed by the feedback property. The result is a system that can yield high levels of cancellation and be readily retrofitted to currently fielded equipment.
16 CFR 1500.42 - Test for eye irritants.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., including testing that does not require animals, are presented in the CPSC's animal testing policy set forth... conducted, a sequential testing strategy is recommended to reduce the number of test animals. Additionally... eye irritation. Both eyes of each animal in the test group shall be examined before testing, and only...
Making Knowledge Delivery Failsafe: Adding Step Zero in Hypothesis Testing
ERIC Educational Resources Information Center
Pan, Xia; Zhou, Qiang
2010-01-01
Knowledge of statistical analysis is increasingly important for professionals in modern business. For example, hypothesis testing is one of the critical topics for quality managers and team workers in Six Sigma training programs. Delivering the knowledge of hypothesis testing effectively can be an important step for the incapable learners or…
Impaired temporal contrast sensitivity in dyslexics is specific to retain-and-compare paradigms.
Ben-Yehudah, G; Sackett, E; Malchi-Ginzberg, L; Ahissar, M
2001-07-01
Developmental dyslexia is a specific reading disability that affects 5-10% of the population. Recent studies have suggested that dyslexics may experience a deficit in the visual magnocellular pathway. The most extensively studied prediction deriving from this hypothesis is impaired contrast sensitivity to transient, low-luminance stimuli at low spatial frequencies. However, the findings are inconsistent across studies and even seemingly contradictory. In the present study, we administered several different paradigms for assessing temporal contrast sensitivity, and found both impaired and normal contrast sensitivity within the same group of dyslexic participants. Under sequential presentation, in a temporal forced choice paradigm, dyslexics showed impaired sensitivity to both drifting and flickering gratings. However, under simultaneous presentation, with a spatial forced choice paradigm, dyslexics' sensitivity did not differ from that of the controls. Within each paradigm, dyslexics' sensitivity was poorer at higher temporal frequencies, consistent with the magnocellular hypothesis. These results suggest that a basic perceptual impairment in dyslexics may be their limited ability to retain-and-compare perceptual traces across brief intervals.
Testing of Hypothesis in Equivalence and Non Inferiority Trials-A Concept.
Juneja, Atul; Aggarwal, Abha R; Adhikari, Tulsi; Pandey, Arvind
2016-04-01
Establishing the appropriate hypothesis is one of the important steps in carrying out statistical tests and analyses, and understanding it is important for interpreting the results of statistical analysis. The current communication presents the concept of hypothesis testing in non-inferiority and equivalence trials, where the null hypothesis is the reverse of what is set up for conventional superiority trials. As in superiority trials, the null hypothesis is tested for rejection in order to establish what the researcher intends to prove. It is important to note that equivalence or non-inferiority cannot be proved simply by accepting the null hypothesis of no difference. Hence, establishing the appropriate statistical hypothesis is extremely important for arriving at a meaningful conclusion for the stated research objectives.
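The reversed null hypothesis can be made concrete with a small sketch. Assuming a continuous outcome where larger values favor the treatment and a prespecified non-inferiority margin δ, the null H0: μT − μC ≤ −δ is rejected when the margin-shifted z statistic exceeds the one-sided critical value (all numbers below are illustrative, not from the paper):

```python
def noninferiority_z(mean_t, mean_c, se_diff, margin):
    """One-sided z statistic for a non-inferiority comparison.

    Null hypothesis H0: mean_t - mean_c <= -margin, i.e. the treatment
    is inferior by at least the margin. Rejecting H0 (z above the
    one-sided critical value) establishes non-inferiority; failing to
    reject proves nothing, as the abstract notes.
    """
    return (mean_t - mean_c + margin) / se_diff

# Treatment scores 0.2 units worse than control, margin 1.0, SE 0.3:
z = noninferiority_z(9.8, 10.0, 0.3, 1.0)
reject_h0 = z > 1.645  # one-sided 5% critical value
```

Note that the treatment can be slightly worse than control and still be declared non-inferior, provided the deficit is reliably smaller than the margin.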
Ruotolo, Francesco; Ruggiero, Gennaro; Vinciguerra, Michela; Iachini, Tina
2012-02-01
The aim of this research is to assess whether the crucial factor in determining the characteristics of blind people's spatial mental images is the visual impairment per se or the processing style imposed by the dominant perceptual modalities used to acquire spatial information, i.e. simultaneous (vision) vs sequential (kinaesthesis). Participants were asked to learn six positions in a large parking area via movement alone (congenitally blind, adventitiously blind, blindfolded sighted) or with vision plus movement (simultaneous sighted, sequential sighted), and then to mentally scan between positions in the path. The crucial manipulation concerned the sequential sighted group: their visual exploration was made sequential by placing visual obstacles within the pathway in such a way that they could not see the positions along the pathway simultaneously. The results revealed a significant time/distance linear relation in all tested groups. However, the linear component was lower in sequential sighted and blind participants, especially congenital. Sequential sighted and congenitally blind participants showed an almost overlapping performance. Differences between groups became evident when mentally scanning farther distances (more than 5 m). This threshold effect could reveal processing limitations due to the need to integrate and update spatial information. Overall, the results suggest that the characteristics of the processing style rather than the visual impairment per se affect blind people's spatial mental images. Copyright © 2011 Elsevier B.V. All rights reserved.
[Reflex epilepsy evoked by decision making: report of a case (author's transl)].
Mutani, R; Ganga, A; Agnetti, V
1980-01-01
A 17-year-old girl with a history of grand mal attacks occurring during mathematics lessons or while solving mathematical problems was investigated with prolonged EEG recordings. During the sessions, relaxation periods were alternated with arithmetical or mathematical testing, with card or checkers games and the solution of puzzles and crossword problems, and with different neuropsychological tests. EEG recordings were characterized by the appearance, on a normal background, of bilaterally synchronous and symmetrical spike-and-wave and polyspike-and-wave discharges, associated with loss of consciousness. During relaxation their mean frequency was one per 54 min; it doubled during execution of tests involving nonsequential decision making, and was eight times as high (one per 7 min) during tests involving sequential decision making. Some tension, challenge, and complexity of the performance were also important as precipitating factors. Their absence deprived sequential tests of their efficacy, while their presence sometimes gave nonsequential tests full efficacy.
Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2010-01-01
When facing a conjunction between space objects, decision makers must choose whether or not to maneuver for collision avoidance. We apply a well-known decision procedure, the sequential probability ratio test, to this problem. We propose two approaches to the problem solution, one based on a frequentist method and the other on a Bayesian method. The frequentist method does not require any prior knowledge concerning the conjunction, while the Bayesian method assumes knowledge of prior probability densities. Our results show that both methods achieve the desired missed-detection rates, but the frequentist method's false alarm performance is inferior to the Bayesian method's.
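A generic sketch of the frequentist Wald test referenced above (the Bernoulli hypotheses and error rates are illustrative stand-ins; the paper's actual likelihoods concern conjunction geometry):

```python
import math

def wald_sprt(samples, llr_fn, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test.

    llr_fn(x) is the per-observation log-likelihood ratio
    log f1(x)/f0(x). Sampling stops as soon as the cumulative LLR
    crosses a Wald boundary; if the data run out first, the test
    is inconclusive.
    """
    a = math.log(beta / (1 - alpha))       # lower boundary: accept H0
    b = math.log((1 - beta) / alpha)       # upper boundary: accept H1
    llr, n = 0.0, 0
    for x in samples:
        n += 1
        llr += llr_fn(x)
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return "inconclusive", n

# Illustrative hypotheses: Bernoulli success rate 0.1 (H0) vs 0.5 (H1).
def bern_llr(x, p0=0.1, p1=0.5):
    return math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))

decision, n_used = wald_sprt([1, 1, 1, 1], bern_llr)
```

The boundaries log(β/(1−α)) and log((1−β)/α) approximately guarantee the chosen type I and missed-detection rates regardless of how many observations the test ends up consuming, which is what makes the procedure attractive for streaming decisions like maneuver/no-maneuver.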
ERIC Educational Resources Information Center
Hagan, John; Foster, Holly
2003-01-01
Data from the National Longitudinal Study of Adolescent Health on 11,506 high school students were used to test a gendered and age-graded sequential stress theory in which delinquency can play an additive and intervening role in adolescents' movement from early anger through rebellious or aggressive forms of behavior to later depressive symptoms…
ERIC Educational Resources Information Center
Cacola, Priscila; Roberson, Jerroed; Gabbard, Carl
2013-01-01
Studies show that as we enter older adulthood (greater than 64 years), our ability to mentally represent action in the form of using motor imagery declines. Using a chronometry paradigm to compare the movement duration of imagined and executed movements, we tested young-, middle-aged, and older adults on their ability to perform sequential finger…
16 CFR § 1500.42 - Test for eye irritants.
Code of Federal Regulations, 2013 CFR
2013-01-01
... before testing, and only those animals without eye defects or irritation shall be used. The animal is... substances, including testing that does not require animals, are presented in the CPSC's animal testing... conducted, a sequential testing strategy is recommended to reduce the number of test animals. Additionally...
Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G
2012-10-10
Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.
An Exercise for Illustrating the Logic of Hypothesis Testing
ERIC Educational Resources Information Center
Lawton, Leigh
2009-01-01
Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
Exposure Control Using Adaptive Multi-Stage Item Bundles.
ERIC Educational Resources Information Center
Luecht, Richard M.
This paper presents a multistage adaptive testing test development paradigm that promises to handle content balancing and other test development needs, psychometric reliability concerns, and item exposure. The bundled multistage adaptive testing (BMAT) framework is a modification of the computer-adaptive sequential testing framework introduced by…
Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data
2014-12-01
Office of Research, 113 Bowne Hall, Syracuse, NY 13244-1200. …consistent with the null hypothesis of linearity and can be used to estimate the distribution of a test statistic that can discriminate between the null… Test for nonlinearity. Histogram is generated using the surrogate data. The statistic of the original time series is represented by the solid line
Gabard-Durnam, Laurel Joy; Gee, Dylan Grace; Goff, Bonnie; Flannery, Jessica; Telzer, Eva; Humphreys, Kathryn Leigh; Lumian, Daniel Stephen; Fareri, Dominic Stephen; Caldera, Christina; Tottenham, Nim
2016-04-27
Although the functional architecture of the brain is indexed by resting-state connectivity networks, little is currently known about the mechanisms through which these networks assemble into stable mature patterns. The current study posits and tests the long-term phasic molding hypothesis that resting-state networks are gradually shaped by recurring stimulus-elicited connectivity across development by examining how both stimulus-elicited and resting-state functional connections of the human brain emerge over development at the systems level. Using a sequential design following 4- to 18-year-olds over a 2 year period, we examined the predictive associations between stimulus-elicited and resting-state connectivity in amygdala-cortical circuitry as an exemplar case (given this network's protracted development across these ages). Age-related changes in amygdala functional connectivity converged on the same regions of medial prefrontal cortex (mPFC) and inferior frontal gyrus when elicited by emotional stimuli and when measured at rest. Consistent with the long-term phasic molding hypothesis, prospective analyses for both connections showed that the magnitude of an individual's stimulus-elicited connectivity unidirectionally predicted resting-state functional connectivity 2 years later. For the amygdala-mPFC connection, only stimulus-elicited connectivity during childhood and the transition to adolescence shaped future resting-state connectivity, consistent with a sensitive period ending with adolescence for the amygdala-mPFC circuit. Together, these findings suggest that resting-state functional architecture may arise from phasic patterns of functional connectivity elicited by environmental stimuli over the course of development on the order of years. A fundamental issue in understanding the ontogeny of brain function is how resting-state (intrinsic) functional networks emerge and relate to stimulus-elicited functional connectivity. 
Here, we posit and test the long-term phasic molding hypothesis that resting-state network development is influenced by recurring stimulus-elicited connectivity through prospective examination of the developing human amygdala-cortical functional connections. Our results provide critical insight into how early environmental events sculpt functional network architecture across development and highlight childhood as a potential developmental period of heightened malleability for the amygdala-medial prefrontal cortex circuit. These findings have implications for how both positive and adverse experiences influence the developing brain and motivate future investigations of whether this molding mechanism reflects a general phenomenon of brain development. Copyright © 2016 the authors 0270-6474/16/364772-14$15.00/0.
Flanagan, Emma C; Lagarde, Julien; Hahn, Valérie; Guichart-Gomez, Elodie; Sarazin, Marie; Hornberger, Michael; Bertoux, Maxime
2018-05-01
Environmental dependency syndrome (EDS), including utilization (UB) and imitation (IB) behaviors, is often reported in behavioral variant frontotemporal dementia (bvFTD). These behaviors are commonly attributed to executive dysfunction. However, inconsistent associations between EDS and poor executive performance have led to an alternative "social hypothesis," instead implicating patients' misinterpretation of the examiner's intention. We investigated the possible explanatory cognitive mechanisms of EDS in bvFTD by relating UB and IB to performance on tests of executive functioning and theory of mind (ToM). This study analyzed retrospective data of 32 bvFTD patients. Data included scores of UB and IB, various executive measures, and ToM assessment using the faux pas test, from which we extracted a mental attribution score. Of the patients, 15.6% and 40.6% exhibited UB and IB, respectively. We conducted an automatic linear modeling analysis with executive and mental attribution measures as predictor variables, and UB and IB sequentially considered as target variables. ToM mental attribution score, visual abstraction and flexibility measures from the Wisconsin Card Sorting Test, and motor sequence performance significantly (corrected ps < .05) predicted IB. No executive or ToM measures significantly predicted UB. These findings reveal a complex interaction between executive dysfunction and mental attribution deficits influencing the prevalence of EDS in bvFTD. Further investigation is required to improve our understanding of the mechanisms underlying these behaviors. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Kass, Nancy E; Taylor, Holly A; Ali, Joseph; Hallez, Kristina; Chaisson, Lelia
2015-02-01
Research suggests that participants do not always adequately understand studies. While some consent interventions increase understanding, methodologic challenges have been raised in studying consent outside of actual trial settings. This study examined the feasibility of testing two consent interventions in actual studies and measured the effectiveness of the interventions in improving understanding. Participants enrolling in any of eight ongoing clinical trials were sequentially assigned to one of three different informed consent strategies for enrollment in their clinical trial. Control participants received standard consent procedures for their trial. Participants in the first intervention arm received a bulleted fact sheet summarizing key study information. Participants in the second intervention arm received the bulleted fact sheet and also engaged in a feedback Q&A session. Later, patients answered closed- and open-ended questions to assess patient understanding and literacy. Descriptive statistics, Wilcoxon-Mann-Whitney and Kruskal-Wallis tests were generated to assess correlations; regression analysis determined predictors of understanding. 144 participants enrolled. Using regression analysis, participants receiving the second intervention scored 7.6 percentage points higher (p = .02) on open-ended questions about understanding than participants in the control, although unadjusted comparisons did not reach statistical significance. Our study supports the hypothesis that patients receiving both bulleted fact sheets and a Q&A session had higher understanding compared to standard consent. Fact sheets and short structured dialog are quick to administer and easy to replicate across studies and should be tested in larger samples. © The Author(s) 2014.
The role of responsibility and fear of guilt in hypothesis-testing.
Mancini, Francesco; Gangemi, Amelia
2006-12-01
Recent theories argue that both perceived responsibility and fear of guilt increase obsessive-like behaviours. We propose that hypothesis-testing might account for this effect. Both perceived responsibility and fear of guilt would influence subjects' hypothesis-testing by inducing a prudential style. This style implies focusing on and confirming the worst hypothesis, and reiterating the testing process. In our experiment, we manipulated the responsibility and fear of guilt of 236 normal volunteers who executed a deductive task. The results show that perceived responsibility is the main factor that influenced individuals' hypothesis-testing. Fear of guilt, however, has a significant additive effect. Guilt-fearing participants preferred to carry on with the diagnostic process even when faced with initially favourable evidence, whereas participants in the responsibility condition did so only when confronted with unfavourable evidence. Implications for the understanding of obsessive-compulsive disorder (OCD) are discussed.
Rhythmic grouping biases constrain infant statistical learning
Hay, Jessica F.; Saffran, Jenny R.
2012-01-01
Linguistic stress and sequential statistical cues to word boundaries interact during speech segmentation in infancy. However, little is known about how the different acoustic components of stress constrain statistical learning. The current studies were designed to investigate whether intensity and duration each function independently as cues to initial prominence (trochaic-based hypothesis) or whether, as predicted by the Iambic-Trochaic Law (ITL), intensity and duration have characteristic and separable effects on rhythmic grouping (ITL-based hypothesis) in a statistical learning task. Infants were familiarized with an artificial language (Experiments 1 & 3) or a tone stream (Experiment 2) in which there was an alternation in either intensity or duration. In addition to potential acoustic cues, the familiarization sequences also contained statistical cues to word boundaries. In speech (Experiment 1) and non-speech (Experiment 2) conditions, 9-month-old infants demonstrated discrimination patterns consistent with an ITL-based hypothesis: intensity signaled initial prominence and duration signaled final prominence. The results of Experiment 3, in which 6.5-month-old infants were familiarized with the speech streams from Experiment 1, suggest that there is a developmental change in infants’ willingness to treat increased duration as a cue to word offsets in fluent speech. Infants’ perceptual systems interact with linguistic experience to constrain how infants learn from their auditory environment. PMID:23730217
Rise and fall of political complexity in island South-East Asia and the Pacific.
Currie, Thomas E; Greenhill, Simon J; Gray, Russell D; Hasegawa, Toshikazu; Mace, Ruth
2010-10-14
There is disagreement about whether human political evolution has proceeded through a sequence of incremental increases in complexity, or whether larger, non-sequential increases have occurred. The extent to which societies have decreased in complexity is also unclear. These debates have continued largely in the absence of rigorous, quantitative tests. We evaluated six competing models of political evolution in Austronesian-speaking societies using phylogenetic methods. Here we show that in the best-fitting model political complexity rises and falls in a sequence of small steps. This is closely followed by another model in which increases are sequential but decreases can be either sequential or in bigger drops. The results indicate that large, non-sequential jumps in political complexity have not occurred during the evolutionary history of these societies. This suggests that, despite the numerous contingent pathways of human history, there are regularities in cultural evolution that can be detected using computational phylogenetic methods.
Visual short-term memory for sequential arrays.
Kumar, Arjun; Jiang, Yuhong
2005-04-01
The capacity of visual short-term memory (VSTM) for a single visual display has been investigated in past research, but VSTM for multiple sequential arrays has been explored only recently. In this study, we investigate the capacity of VSTM across two sequential arrays separated by a variable stimulus onset asynchrony (SOA). VSTM for spatial locations (Experiment 1), colors (Experiments 2-4), orientations (Experiments 3 and 4), and conjunction of color and orientation (Experiment 4) were tested, with the SOA across the two sequential arrays varying from 100 to 1,500 msec. We find that VSTM for the trailing array is much better than VSTM for the leading array, but when averaged across the two arrays VSTM has a constant capacity independent of the SOA. We suggest that multiple displays compete for retention in VSTM and that separating information into two temporally discrete groups does not enhance the overall capacity of VSTM.
Physics-based, Bayesian sequential detection method and system for radioactive contraband
Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E
2014-03-18
A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy), low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing that represents a radionuclide as a decomposition into monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval, condition-based discriminator for the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine whether one of two threshold conditions is met, signifying that the EMS either is or is not identified as the target radionuclide; if neither threshold is crossed, the process repeats for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
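The two-threshold stopping rule in this record is Wald's sequential probability ratio test. A minimal generic sketch is given below, using a toy Gaussian likelihood ratio rather than the patent's physics-based radionuclide model (the hypotheses, thresholds, and data here are illustrative assumptions only):

```python
import math

def sprt(stream, llr, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test (generic sketch).

    Accumulates the per-sample log-likelihood ratio and stops as soon
    as the sum crosses either of Wald's two thresholds, mirroring the
    two threshold conditions described in the abstract.
    """
    upper = math.log((1 - beta) / alpha)   # cross it: decide H1 (target)
    lower = math.log(beta / (1 - alpha))   # cross it: decide H0 (not target)
    s = 0.0
    for n, x in enumerate(stream, start=1):
        s += llr(x)
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "undecided", len(stream)

# Toy example: H0: N(0,1) vs H1: N(1,1), for which the per-sample
# log-likelihood ratio is x - 0.5; the stream is constant for clarity.
decision, n_used = sprt([1.0] * 10, lambda x: x - 0.5)
```

With alpha = beta = 0.05 the upper threshold is log(19) ≈ 2.94, so the test stops as soon as the accumulated evidence reaches it, here after six samples.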
Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan
2018-02-01
In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called the adaptive margin slack minimization to iteratively improve the classification accuracy by adaptive data selection. We motivate each part separately, and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: the memory-efficient sequential processing for sequential data processing and the parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify their validity.
Wild, Aaron T; Gandhi, Nishant; Chettiar, Sivarajan T; Aziz, Khaled; Gajula, Rajendra P; Williams, Russell D; Kumar, Rachit; Taparra, Kekoa; Zeng, Jing; Cades, Jessica A; Velarde, Esteban; Menon, Siddharth; Geschwind, Jean F; Cosgrove, David; Pawlik, Timothy M; Maitra, Anirban; Wong, John; Hales, Russell K; Torbenson, Michael S; Herman, Joseph M; Tran, Phuoc T
2013-01-01
Sorafenib (SOR) is the only systemic agent known to improve survival for hepatocellular carcinoma (HCC). However, SOR prolongs survival by less than 3 months and does not alter symptomatic progression. To improve outcomes, several phase I-II trials are currently examining SOR with radiation (RT) for HCC utilizing heterogeneous concurrent and sequential treatment regimens. Our study provides preclinical data characterizing the effects of concurrent versus sequential RT-SOR on HCC cells both in vitro and in vivo. Concurrent and sequential RT-SOR regimens were tested for efficacy among 4 HCC cell lines in vitro by assessment of clonogenic survival, apoptosis, cell cycle distribution, and γ-H2AX foci formation. Results were confirmed in vivo by evaluating tumor growth delay and performing immunofluorescence staining in a hind-flank xenograft model. In vitro, concurrent RT-SOR produced radioprotection in 3 of 4 cell lines, whereas sequential RT-SOR produced decreased colony formation among all 4. Sequential RT-SOR increased apoptosis compared to RT alone, while concurrent RT-SOR did not. Sorafenib induced reassortment into less radiosensitive phases of the cell cycle through G1-S delay and cell cycle slowing. More double-strand breaks (DSBs) persisted 24 h post-irradiation for RT alone versus concurrent RT-SOR. In vivo, sequential RT-SOR produced the greatest tumor growth delay, while concurrent RT-SOR was similar to RT alone. More persistent DSBs were observed in xenografts treated with sequential RT-SOR or RT alone versus concurrent RT-SOR. Sequential RT-SOR additionally produced a greater reduction in xenograft tumor vascularity and mitotic index than either concurrent RT-SOR or RT alone. In conclusion, sequential RT-SOR demonstrates greater efficacy against HCC than concurrent RT-SOR both in vitro and in vivo. These results may have implications for clinical decision-making and prospective trial design.
Chiba, Yasutaka
2017-09-01
Fisher's exact test is commonly used to compare two groups when the outcome is binary in randomized trials. In the context of causal inference, this test explores the sharp causal null hypothesis (i.e. the causal effect of treatment is the same for all subjects), but not the weak causal null hypothesis (i.e. the causal risks are the same in the two groups). Therefore, in general, rejection of the null hypothesis by Fisher's exact test does not mean that the causal risk difference is not zero. Recently, Chiba (Journal of Biometrics and Biostatistics 2015; 6: 244) developed a new exact test for the weak causal null hypothesis when the outcome is binary in randomized trials; the new test is not based on any large sample theory and does not require any assumption. In this paper, we extend the new test; we create a version of the test applicable to a stratified analysis. The stratified exact test that we propose is general in nature and can be used in several approaches toward the estimation of treatment effects after adjusting for stratification factors. The stratified Fisher's exact test of Jung (Biometrical Journal 2014; 56: 129-140) tests the sharp causal null hypothesis. This test applies a crude estimator of the treatment effect and can be regarded as a special case of our proposed exact test. Our proposed stratified exact test can be straightforwardly extended to analysis of noninferiority trials and to construct the associated confidence interval. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
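For reference, the unstratified Fisher's exact test that this work builds on can be computed directly from the hypergeometric distribution. The sketch below uses only the standard library, with illustrative counts for a small two-arm trial (the numbers are assumptions, not data from the paper):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed table.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):
        # Probability of observing x in the top-left cell, margins fixed.
        return (comb(col1, x) * comb(n - col1, row1 - x)) / comb(n, row1)
    p_obs = p_table(a)
    lo = max(0, row1 + col1 - n)
    hi = min(row1, col1)
    # Small tolerance guards against float rounding when including ties.
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-12))

# Hypothetical outcome table: 1/10 events in one arm, 11/14 in the other.
p = fisher_exact_two_sided(1, 9, 11, 3)
```

For this table the exact two-sided p-value is 5412/1961256 ≈ 0.00276, so the sharp null would be rejected at conventional levels.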
McCombe Waller, Sandy; Whitall, Jill; Jenkins, Toye; Magder, Laurence S; Hanley, Daniel F; Goldberg, Andrew; Luft, Andreas R
2014-12-14
Recovering useful hand function after stroke is a major scientific challenge for patients with limited motor recovery. We hypothesized that sequential training beginning with proximal bilateral training followed by unilateral task-oriented training is superior to time-matched unilateral training alone, as proximal bilateral training could optimally prepare the motor system to respond to the more challenging task-oriented training. Twenty-six participants with moderate-severity hemiparesis received either 6 weeks of bilateral proximal training followed sequentially by 6 weeks of unilateral task-oriented training (COMBO) or 12 weeks of unilateral task-oriented training alone (SAEBO). A subset of 8 COMBO and 9 SAEBO participants underwent three functional magnetic resonance imaging (fMRI) scans of hand and elbow movement every 6 weeks. Outcome measures were the Fugl-Meyer Upper Extremity scale, the Modified Wolf Motor Function Test, the University of Maryland Arm Questionnaire for Stroke, and motor cortex activation (fMRI). The COMBO group demonstrated significantly greater gains between baseline and 12 weeks over all outcome measures (p = .018 based on a MANOVA test) and specifically in the Modified Wolf Motor Function Test (time). Both groups demonstrated within-group gains on the Fugl-Meyer Upper Extremity test (impairment) and the University of Maryland Arm Questionnaire for Stroke (functional use). fMRI subset analyses showed that motor cortex (primary and premotor) activation during hand movement was significantly increased by sequential combination training but not by task-oriented training alone. Sequentially combining proximal bilateral training before unilateral task-oriented training may be an effective way to facilitate gains in arm and hand function in those with moderate to severe paresis post-stroke, compared to unilateral task-oriented training alone.
A statistical test to show negligible trend
Philip M. Dixon; Joseph H.K. Pechmann
2005-01-01
The usual statistical tests of trend are inappropriate for demonstrating the absence of trend. This is because failure to reject the null hypothesis of no trend does not prove that null hypothesis. The appropriate statistical method is based on an equivalence test. The null hypothesis is that the trend is not zero, i.e., outside an a priori specified equivalence region...
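The equivalence-test logic described here can be sketched with two one-sided tests (TOST) on an estimated slope. This is a large-sample normal approximation with made-up numbers, not the authors' exact procedure:

```python
from statistics import NormalDist

def tost_negligible_trend(slope, se, delta, alpha=0.05):
    """Two one-sided tests (TOST) for a trend being negligible.

    H0: |slope| >= delta (trend outside the equivalence region)
    H1: |slope| <  delta (negligible trend)
    Declares the trend negligible only if BOTH one-sided tests reject,
    i.e. if the larger of the two one-sided p-values is below alpha.
    """
    z_lo = (slope + delta) / se   # tests slope <= -delta
    z_hi = (slope - delta) / se   # tests slope >= +delta
    p_lo = 1 - NormalDist().cdf(z_lo)
    p_hi = NormalDist().cdf(z_hi)
    p = max(p_lo, p_hi)
    return p, p < alpha

# Hypothetical slope 0.01 with standard error 0.02 and an a priori
# equivalence region of (-0.1, 0.1):
p, negligible = tost_negligible_trend(0.01, 0.02, 0.1)
```

Note the reversal relative to the usual trend test: failing to reject here means the data cannot rule out a non-negligible trend, which is exactly the asymmetry the abstract points out.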
Test apparatus for locating shorts during assembly of electrical buses
NASA Technical Reports Server (NTRS)
Deboo, G. J.; Devine, D. L. (Inventor)
1981-01-01
A test apparatus is described for locating electrical shorts that is especially suited for use while an electrical circuit is being fabricated or assembled. A ring counter derives input pulses from a square wave oscillator. The outputs of the counter are fed through transistors to an array of light emitting diodes. Each diode is connected to an electrical conductor, such as a bus bar, that is to be tested. In the absence of a short between the electrical conductors the diodes are sequentially illuminated. When a short occurs, a comparator/multivibrator circuit triggers an alarm and stops the oscillator and the sequential energization of the diodes. The two diodes that remain illuminated identify the electrical conductors that are shorted.
An extended sequential goodness-of-fit multiple testing method for discrete data.
Castro-Conde, Irene; Döhler, Sebastian; de Uña-Álvarez, Jacobo
2017-10-01
The sequential goodness-of-fit (SGoF) multiple testing method has recently been proposed as an alternative to the familywise error rate- and the false discovery rate-controlling procedures in high-dimensional problems. For discrete data, the SGoF method may be very conservative. In this paper, we introduce an alternative SGoF-type procedure that takes into account the discreteness of the test statistics. Like the original SGoF, our new method provides weak control of the false discovery rate/familywise error rate but attains false discovery rate levels closer to the desired nominal level, and thus it is more powerful. We study the performance of this method in a simulation study and illustrate its application to a real pharmacovigilance data set.
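A simplified sketch of the binomial idea behind the original (continuous) SGoF procedure follows; it glosses over the refinements in the paper, including the discrete-data correction that is the paper's contribution, and the example p-values are invented:

```python
from math import comb

def sgof(pvalues, gamma=0.05, alpha=0.05):
    """Simplified sketch of the SGoF multiple-testing idea.

    Counts how many p-values fall at or below gamma, compares that
    count with its Binomial(n, gamma) null distribution, and declares
    the excess over the binomial critical count significant, taking
    the smallest p-values first.
    """
    n = len(pvalues)
    r = sum(p <= gamma for p in pvalues)
    def surv(k):  # P(X >= k) for X ~ Binomial(n, gamma)
        return sum(comb(n, j) * gamma**j * (1 - gamma)**(n - j)
                   for j in range(k, n + 1))
    # Smallest count whose upper-tail probability drops to alpha or below.
    k0 = next(k for k in range(n + 2) if surv(k) <= alpha)
    declared = max(0, r - k0 + 1)
    return sorted(pvalues)[:declared]

# Invented example: 8 of 20 p-values fall below gamma = 0.05.
pvals = [0.001] * 8 + [0.5] * 12
significant = sgof(pvals)
```

With n = 20 and gamma = 0.05, the binomial critical count is 4, so 8 observed small p-values yield 5 declared effects in this sketch.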
Sequential and simultaneous choices: testing the diet selection and sequential choice models.
Freidin, Esteban; Aw, Justine; Kacelnik, Alex
2009-03-01
We investigate simultaneous and sequential choices in starlings, using Charnov's Diet Choice Model (DCM) and Shapiro, Siller and Kacelnik's Sequential Choice Model (SCM) to integrate function and mechanism. During a training phase, starlings encountered one food-related option per trial (A, B or R) in random sequence and with equal probability. A and B delivered food rewards after programmed delays (shorter for A), while R ('rejection') moved directly to the next trial without reward. In this phase we measured latencies to respond. In a later choice phase, birds encountered the pairs A-B, A-R and B-R, the first implementing a simultaneous choice and the second and third sequential choices. The DCM predicts when R should be chosen to maximize intake rate, and the SCM uses latencies from the training phase to predict choices between any pair of options in the choice phase. The predictions of both models coincided, and both successfully predicted the birds' preferences. The DCM does not deal with partial preferences, while the SCM does, and experimental results were strongly correlated with this model's predictions. We believe that the SCM may expose a very general mechanism of animal choice, and that its wider domain of success reflects the greater ecological significance of sequential over simultaneous choices.
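The SCM's use of training-phase latencies can be illustrated with a small sketch: predicted preference for A over B is the probability that a randomly drawn single-option latency for A beats one for B. The latency values (in seconds) are invented for illustration:

```python
def scm_predicted_preference(latencies_a, latencies_b):
    """Sequential Choice Model sketch.

    Preference for A in a simultaneous choice is predicted as the
    probability that a latency sampled from A's single-option trials
    is shorter than one sampled from B's, estimated by comparing all
    pairs of observed latencies.
    """
    wins = sum(a < b for a in latencies_a for b in latencies_b)
    return wins / (len(latencies_a) * len(latencies_b))

# Hypothetical training-phase latencies: A is usually, but not always,
# responded to faster than B, so preference is partial rather than total.
pref = scm_predicted_preference([1.0, 1.5, 2.0], [1.2, 3.0])
```

This construction yields graded (partial) preferences directly, which is the property the abstract contrasts with the all-or-nothing predictions of the DCM.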
The cost and cost-effectiveness of rapid testing strategies for yaws diagnosis and surveillance.
Fitzpatrick, Christopher; Asiedu, Kingsley; Sands, Anita; Gonzalez Pena, Tita; Marks, Michael; Mitja, Oriol; Meheus, Filip; Van der Stuyft, Patrick
2017-10-01
Yaws is a non-venereal treponemal infection caused by Treponema pallidum subspecies pertenue. The disease is targeted by WHO for eradication by 2020. Rapid diagnostic tests (RDTs) are envisaged for confirmation of clinical cases during treatment campaigns and for certification of the interruption of transmission. Yaws testing requires both treponemal (trep) and non-treponemal (non-trep) assays for diagnosis of current infection. We evaluate a sequential testing strategy (using a treponemal RDT before a trep/non-trep RDT) in terms of cost and cost-effectiveness, relative to a single-assay combined testing strategy (using the trep/non-trep RDT alone), for two use cases: individual diagnosis and community surveillance. We use cohort decision analysis to examine the diagnostic and cost outcomes. We estimate cost and cost-effectiveness of the alternative testing strategies at different levels of prevalence of past/current infection and current infection under each use case. We take the perspective of the global yaws eradication programme. We calculate the total number of correct diagnoses for each strategy over a range of plausible prevalences. We employ probabilistic sensitivity analysis (PSA) to account for uncertainty and report 95% intervals. At current prices of the treponemal and trep/non-trep RDTs, the sequential strategy is cost-saving for individual diagnosis at prevalence of past/current infection less than 85% (81-90); it is cost-saving for surveillance at less than 100%. The threshold price of the trep/non-trep RDT (below which the sequential strategy would no longer be cost-saving) is US$ 1.08 (1.02-1.14) for individual diagnosis at high prevalence of past/current infection (51%) and US$ 0.54 (0.52-0.56) for community surveillance at low prevalence (15%). We find that the sequential strategy is cost-saving for both diagnosis and surveillance in most relevant settings. 
In the absence of evidence assessing relative performance (sensitivity and specificity), cost-effectiveness is uncertain. However, the conditions under which the combined-test-only strategy might be more cost-effective than the sequential strategy are limited. A cheaper trep/non-trep RDT is needed, costing no more than US$ 0.50-1.00, depending on the use case. Our results will help enhance the cost-effectiveness of yaws programmes in the 13 countries known to be currently endemic. They will also inform efforts in the much larger group of 71 countries with a history of yaws, many of which will have to undertake surveillance to confirm the interruption of transmission.
Brief Lags in Interrupted Sequential Performance: Evaluating a Model and Model Evaluation Method
2015-01-05
rehearsal mechanism in the model. To evaluate the model we developed a simple new goodness-of-fit test based on analysis of variance that offers an...repeated step). Sequential constraints are common in medicine, equipment maintenance, computer programming and technical support, data analysis...legal analysis, accounting, and many other home and workplace environments. Sequential constraints also play a role in such basic cognitive processes
Random sequential adsorption of cubes
NASA Astrophysics Data System (ADS)
Cieśla, Michał; Kubala, Piotr
2018-01-01
Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
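The random sequential adsorption protocol studied for cubes can be illustrated in one dimension, where unit segments are dropped at random onto a line (the classic car-parking problem). This is only a hedged analogue of the cube study, not the authors' algorithm: the stopping rule here (a fixed run of consecutive rejected trials) is a simplification of true saturation detection, and all parameter values are illustrative.

```python
import random

def rsa_1d(length=50.0, seg=1.0, max_rejects=2_000, seed=0):
    """Random sequential adsorption of segments of size `seg` on [0, length].
    Positions are segment left endpoints; a trial is accepted only if the
    new segment overlaps no previously placed one."""
    rng = random.Random(seed)
    placed = []      # accepted left endpoints
    rejects = 0
    while rejects < max_rejects:
        x = rng.uniform(0.0, length - seg)
        if all(abs(x - y) >= seg for y in placed):
            placed.append(x)
            rejects = 0      # reset the rejection streak on each acceptance
        else:
            rejects += 1
    return sorted(placed)

positions = rsa_1d()
fraction = len(positions) * 1.0 / 50.0   # estimate of the jammed packing fraction
```

In one dimension the jammed coverage tends to Rényi's parking constant, about 0.7476, as the line grows long, so a line of length 50 ends up roughly three-quarters covered; the cube study measures the analogous saturated fraction in three dimensions.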
1998-06-01
4] By 2010, we should be able to change how we conduct the most intense joint operations. Instead of relying on massed forces and sequential ...not independent, sequential steps. Data probes to support the analysis phase were required to complete the logical models. This generated a need...Networks) Identify Granularity (System Level) - Establish Physical Bounds or Limits to Systems • Determine System Test Configuration and Lineup
Topics in the Sequential Design of Experiments
1992-03-01
Approved for public release. Subject terms: Design of Experiments, Renewal Theory, Sequential Testing, Limit Theory, Local... Cited: "...distributions for one parameter exponential families," by Michael Woodroofe, Statistica Sinica, 2 (1991), 91-112; "A non linear renewal theory for a functional of..."
Liu, Rong
2017-01-01
Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. We then applied a decision-making model, the sequential probability ratio test (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781
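The SPRT logic this abstract relies on can be sketched independently of the EEG features. The following is a minimal illustration (not the authors' implementation) of Wald's test for two simple Gaussian hypotheses, with the thresholds set from target error rates via Wald's approximations A = (1-beta)/alpha and B = beta/(1-alpha); all names and parameter values are illustrative.

```python
import math
import random

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test of H0: mu = mu0 vs
    H1: mu = mu1 for N(mu, sigma^2) data.
    Returns ('H0' or 'H1', number of samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood-ratio increment for one Gaussian observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return ("H0" if llr < 0 else "H1"), len(samples)   # forced decision

rng = random.Random(1)
data = [rng.gauss(1.0, 1.0) for _ in range(1000)]   # stream generated under H1
decision, n_used = sprt(data)
```

Under H1 the log-likelihood ratio drifts upward by about (mu1 - mu0)^2 / (2 sigma^2) per sample, so a decision typically arrives after a handful of observations rather than the full record, which is the expected-sample-size saving sequential tests offer.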
Longitudinal Dimensionality of Adolescent Psychopathology: Testing the Differentiation Hypothesis
ERIC Educational Resources Information Center
Sterba, Sonya K.; Copeland, William; Egger, Helen L.; Costello, E. Jane; Erkanli, Alaattin; Angold, Adrian
2010-01-01
Background: The differentiation hypothesis posits that the underlying liability distribution for psychopathology is of low dimensionality in young children, inflating diagnostic comorbidity rates, but increases in dimensionality with age as latent syndromes become less correlated. This hypothesis has not been adequately tested with longitudinal…
A large scale test of the gaming-enhancement hypothesis.
Przybylski, Andrew K; Wang, John C
2016-01-01
A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
Shi, Ruijia; Xu, Cunshuan
2011-06-01
The study of rat proteins is an indispensable task in experimental medicine and drug development. The function of a rat protein is closely related to its subcellular location. Based on this concept, we construct a benchmark rat protein dataset and develop a combined approach for predicting the subcellular localization of rat proteins. From the protein primary sequence, multiple sequential features are obtained using discrete Fourier analysis, a position conservation scoring function, and increment of diversity, and these sequential features are selected as input parameters of a support vector machine. By the jackknife test, the overall success rate of prediction is 95.6% on the rat protein dataset. Our method was also applied to the apoptosis protein dataset and the Gram-negative bacterial protein dataset with the jackknife test; the overall success rates are 89.9% and 96.4%, respectively. These results indicate that our proposed method is quite promising and may play a complementary role to the existing predictors in this area.
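The jackknife test reported here is leave-one-out evaluation: each protein is predicted by a model trained on all the others, and the success rate is the fraction predicted correctly. A minimal sketch, with a toy 1-nearest-neighbour classifier standing in for the paper's SVM and entirely hypothetical one-dimensional features and labels:

```python
def jackknife_success_rate(features, labels, classify):
    """Leave-one-out (jackknife) test: predict each sample from a model
    built on all remaining samples; return the overall success rate."""
    correct = 0
    for i in range(len(features)):
        train_x = features[:i] + features[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        if classify(train_x, train_y, features[i]) == labels[i]:
            correct += 1
    return correct / len(features)

def nearest_neighbour(train_x, train_y, x):
    # 1-NN on scalar toy features; a stand-in for the SVM in the paper
    _, label = min((abs(t - x), y) for t, y in zip(train_x, train_y))
    return label

# Hypothetical feature values for six proteins in two location classes
X = [0.10, 0.20, 0.15, 0.90, 1.00, 0.95]
y = ["cytoplasm"] * 3 + ["nucleus"] * 3
rate = jackknife_success_rate(X, y, nearest_neighbour)   # 1.0 on this toy set
```

Unlike a single train/test split, the jackknife uses every sample for testing exactly once, which is why it is the customary benchmark in this prediction literature.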
NASA Technical Reports Server (NTRS)
Braun, W. R.
1981-01-01
Pseudo noise (PN) spread spectrum systems require a very accurate alignment between the PN code epochs at the transmitter and receiver. This synchronism is typically established through a two-step algorithm, including a coarse synchronization procedure and a fine synchronization procedure. A standard approach for the coarse synchronization is a sequential search over all code phases. The measurement of the power in the filtered signal is used to either accept or reject the code phase under test as the phase of the received PN code. This acquisition strategy, called a single dwell-time system, has been analyzed by Holmes and Chen (1977). A synopsis of the field of sequential analysis as it applies to the PN acquisition problem is provided. From this, the implementation of the variable dwell time algorithm as a sequential probability ratio test is developed. The performance of this algorithm is compared to the optimum detection algorithm and to the fixed dwell-time system.
Null but not void: considerations for hypothesis testing.
Shaw, Pamela A; Proschan, Michael A
2013-01-30
Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.
Effect of climate-related mass extinctions on escalation in molluscs
NASA Astrophysics Data System (ADS)
Hansen, Thor A.; Kelley, Patricia H.; Melland, Vicky D.; Graham, Scott E.
1999-12-01
We test the hypothesis that escalated species (e.g., those with antipredatory adaptations such as heavy armor) are more vulnerable to extinctions caused by changes in climate. If this hypothesis is valid, recovery faunas after climate-related extinctions should include significantly fewer species with escalated shell characteristics, and escalated species should undergo greater rates of extinction than nonescalated species. This hypothesis is tested for the Cretaceous-Paleocene, Eocene-Oligocene, middle Miocene, and Pliocene-Pleistocene mass extinctions. Gastropod and bivalve molluscs from the U.S. coastal plain were evaluated for 10 shell characters that confer resistance to predators. Of 40 tests, one supported the hypothesis; highly ornamented gastropods underwent greater levels of Pliocene-Pleistocene extinction than did nonescalated species. All remaining tests were nonsignificant. The hypothesis that escalated species are more vulnerable to climate-related mass extinctions is not supported.
Sequential CFAR detectors using a dead-zone limiter
NASA Astrophysics Data System (ADS)
Tantaratana, Sawasd
1990-09-01
The performances of some proposed sequential constant-false-alarm-rate (CFAR) detectors are evaluated. The observations are passed through a dead-zone limiter, the output of which is -1, 0, or +1, depending on whether the input is less than -c, between -c and c, or greater than c, where c is a constant. The test statistic is the sum of the outputs. The test is performed on a reduced set of data (those with absolute value larger than c), with the test statistic being the sum of the signs of the reduced set of data. Both constant and linear boundaries are considered. Numerical results show a significant reduction of the average number of observations needed to achieve the same false alarm and detection probabilities as a fixed-sample-size CFAR detector using the same kind of test statistic.
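The detector structure described above can be sketched directly. This is a hedged illustration assuming constant decision boundaries; the limiter constant c and the boundary values below are illustrative, not values from the paper.

```python
def dead_zone(x, c):
    """Dead-zone limiter: -1 if x < -c, +1 if x > c, else 0."""
    if x > c:
        return 1
    if x < -c:
        return -1
    return 0

def sequential_sign_test(samples, c, upper, lower):
    """Sum the limiter outputs (the signs of the reduced data set);
    declare a detection when the sum reaches `upper`, a dismissal when
    it reaches `lower` (constant boundaries)."""
    s = 0
    for n, x in enumerate(samples, start=1):
        s += dead_zone(x, c)
        if s >= upper:
            return "detect", n
        if s <= lower:
            return "dismiss", n
    return "undecided", len(samples)
```

Noise-only input is symmetric about zero, so the summed signs perform a driftless random walk between the boundaries; a positive signal biases the walk upward, and the test terminates, on average, well before a comparable fixed-sample-size detector would.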
NASA Technical Reports Server (NTRS)
LaMotte, Clifford E.; Pickard, Barbara G.
2004-01-01
Plant organs may respond to gravity by vertical (orthogravitropic), oblique (plagiogravitropic) or horizontal (diagravitropic) growth. Primary roots of maize (Zea mays L.) provide a good system for studying such behaviours because they are reportedly capable of displaying all three responses. In current work using maize seedlings of the Silver Queen cultivar, stabilisation of growth at an oblique orientation was commonplace. Hypothetically, plagiogravitropism may be accomplished either by a process we call graded orthogravitropism or by hunting about a sensed non-vertical setpoint. In graded orthotropism primary bending is unidirectional and depends on facilitative stimuli that determine its extent. The hallmark of the setpoint mechanism is restorative curvature of either sign following a displacement; both diagravitropism and orthogravitropism are based on setpoints. Roots settled in a plagiogravitropic orientation were tested with various illumination and displacement protocols designed to distinguish between these two hypotheses. The tests refuted the setpoint hypothesis and supported that of graded orthotropism. No evidence of diagravitropism could be found, thus, earlier claims were likely based on inadequately controlled observations of graded orthotropism. We propose that orthotropism is graded by the sequential action of dual gravity receptors: induction of a vectorial gravitropic response requires gravitational induction of a separate facilitative response, whose decay in the absence of fresh stimuli can brake gravitropism at plagiotropic angles.
González-Rivero, Manuel; Bozec, Yves-Marie; Chollett, Iliana; Ferrari, Renata; Schönberg, Christine H L; Mumby, Peter J
2016-05-01
Disturbance releases space and allows the growth of opportunistic species excluded by the old stands, with the potential to alter community dynamics. In coral reefs, abundances of fast-growing, disturbance-tolerant sponges are expected to increase and dominate as space becomes available following acute coral mortality events. Yet an increase in abundance of these opportunistic species has been reported in only a few studies, suggesting certain mechanisms may be acting to regulate sponge populations. To gain insights into mechanisms of population control, we simulated the dynamics of the common reef-excavating sponge Cliona tenuis in the Caribbean using an individual-based model. An orthogonal hypothesis testing approach was used, in which four candidate mechanisms (algal competition, stock-recruitment limitation, and whole and partial mortality) were incorporated sequentially into the model and the results were tested against independent field observations taken over a decade in Belize, Central America. We found that releasing space after coral mortality can promote C. tenuis outbreaks, but such outbreaks can be curtailed by macroalgal competition. The asymmetrical competitive superiority of macroalgae, given by their capacity to pre-empt space and outcompete the sponge in a size-dependent fashion, supports their capacity to steal the opportunity from other opportunists. While multiple system stages can be expected in coral reefs following intense perturbation, macroalgae may prevent the growth of other space-occupiers, such as bioeroding sponges, under low grazing pressure.
EEG Classification with a Sequential Decision-Making Method in Motor Imagery BCI.
Liu, Rong; Wang, Yongxuan; Newman, Geoffrey I; Thakor, Nitish V; Ying, Sarah
2017-12-01
To develop subject-specific classifiers that recognize mental states quickly and reliably is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this paper, a sequential decision-making strategy is explored in conjunction with an optimal wavelet analysis for EEG classification. Subject-specific wavelet parameters based on a grid-search method were first developed to determine the evidence accumulation curve for the sequential classifier. We then proposed a new method to set the two constrained thresholds in the sequential probability ratio test (SPRT) based on the cumulative curve and a desired expected stopping time. As a result, it balances the decision time of each class, and we term it balanced-threshold SPRT (BTSPRT). The properties of the method were illustrated on 14 subjects' recordings from offline and online tests. Results showed the average maximum accuracy of the proposed method to be 83.4% with an average decision time of 2.77 s, compared with 79.2% accuracy and a decision time of 3.01 s for the sequential Bayesian (SB) method. The BTSPRT method not only improves classification accuracy and decision speed compared with other nonsequential or SB methods, but also provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the speed-accuracy tradeoff. These results suggest that BTSPRT would be useful in explicitly adjusting the tradeoff between rapid decision-making and error-free device control.
Validation of antibiotic residue tests for dairy goats.
Zeng, S S; Hart, S; Escobar, E N; Tesfai, K
1998-03-01
The SNAP test, LacTek test (B-L and CEF), Charm Bacillus stearothermophilus var. calidolactis disk assay (BsDA), and Charm II Tablet Beta-lactam sequential test were validated using antibiotic-fortified and -incurred goat milk following the protocol for test kit validations of the U.S. Food and Drug Administration Center for Veterinary Medicine. SNAP, Charm BsDA, and Charm II Tablet Sequential tests were sensitive and reliable in detecting antibiotic residues in goat milk. All three assays showed greater than 90% sensitivity and specificity at tolerance and detection levels. However, caution should be taken in interpreting test results at detection levels. Because of the high sensitivity of these three tests, false-violative results could be obtained in goat milk containing antibiotic residues below the tolerance level. Goat milk testing positive by these tests must be confirmed using a more sophisticated methodology, such as high-performance liquid chromatography, before the milk is condemned. The LacTek B-L test did not detect several antibiotics, including penicillin G, in goat milk at tolerance levels. However, LacTek CEF was excellent in detecting ceftiofur residue in goat milk.
On Restructurable Control System Theory
NASA Technical Reports Server (NTRS)
Athans, M.
1983-01-01
The state of stochastic system and control theory as it impacts restructurable control issues is addressed. The multivariable characteristics of the control problem are addressed. The failure detection/identification problem is discussed as a multi-hypothesis testing problem. Control strategy reconfiguration, static multivariable controls, static failure hypothesis testing, dynamic multivariable controls, fault-tolerant control theory, dynamic hypothesis testing, generalized likelihood ratio (GLR) methods, and adaptive control are discussed.
ERIC Educational Resources Information Center
Marmolejo-Ramos, Fernando; Cousineau, Denis
2017-01-01
The number of articles showing dissatisfaction with the null hypothesis statistical testing (NHST) framework has been progressively increasing over the years. Alternatives to NHST have been proposed and the Bayesian approach seems to have achieved the highest amount of visibility. In this last part of the special issue, a few alternative…
Radiation detection method and system using the sequential probability ratio test
Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA
2007-07-17
A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations are consistent with a specified model within a given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection, by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
Revised standards for statistical evidence.
Johnson, Valen E
2013-11-26
Recent advances in Bayesian hypothesis testing have led to the development of uniformly most powerful Bayesian tests, which represent an objective, default class of Bayesian hypothesis tests that have the same rejection regions as classical significance tests. Based on the correspondence between these two classes of tests, it is possible to equate the size of classical hypothesis tests with evidence thresholds in Bayesian tests, and to equate P values with Bayes factors. An examination of these connections suggests that recent concerns over the lack of reproducibility of scientific studies can be attributed largely to the conduct of significance tests at unjustifiably high levels of significance. To correct this problem, evidence thresholds required for the declaration of a significant finding should be increased to 25-50:1, and to 100-200:1 for the declaration of a highly significant finding. In terms of classical hypothesis tests, these evidence standards mandate the conduct of tests at the 0.005 or 0.001 level of significance.
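The correspondence between significance levels and Bayesian evidence thresholds can be made concrete in the one-sided z-test case, where (following Johnson's construction, as we understand it) the uniformly most powerful Bayesian test with evidence threshold gamma rejects when z exceeds sqrt(2 ln gamma), so a classical test of size alpha corresponds to gamma = exp(z_alpha^2 / 2). A sketch under that assumption; this is not a general recipe for all test statistics.

```python
import math
from statistics import NormalDist

def ump_bayes_threshold(alpha):
    """Bayes-factor evidence threshold gamma whose one-sided z-test UMPBT
    rejection region matches a classical test of size alpha:
    gamma = exp(z_alpha^2 / 2)."""
    z = NormalDist().inv_cdf(1 - alpha)   # classical critical value z_alpha
    return math.exp(z * z / 2)

# alpha = 0.05 maps to roughly 4:1 evidence, while alpha = 0.005 maps to
# roughly 25-30:1, in line with the evidence standard the abstract advocates.
```

Seen this way, the familiar 0.05 level demands only weak Bayesian evidence, which is the core of the reproducibility argument above.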
Sequential detection of influenza epidemics by the Kolmogorov-Smirnov test
2012-01-01
Background Influenza is a well known and common human respiratory infection, causing significant morbidity and mortality every year. Despite Influenza variability, fast and reliable outbreak detection is required for health resource planning. Clinical health records, as published by the Diagnosticat database in Catalonia, host useful data for probabilistic detection of influenza outbreaks. Methods This paper proposes a statistical method to detect influenza epidemic activity. Non-epidemic incidence rates are modeled against the exponential distribution, and the maximum likelihood estimate for the decaying factor λ is calculated. The sequential detection algorithm updates the parameter as new data becomes available. Binary epidemic detection of weekly incidence rates is assessed by Kolmogorov-Smirnov test on the absolute difference between the empirical and the cumulative density function of the estimated exponential distribution with significance level 0 ≤ α ≤ 1. Results The main advantage with respect to other approaches is the adoption of a statistically meaningful test, which provides an indicator of epidemic activity with an associated probability. The detection algorithm was initiated with parameter λ0 = 3.8617 estimated from the training sequence (corresponding to non-epidemic incidence rates of the 2008-2009 influenza season) and sequentially updated. Kolmogorov-Smirnov test detected the following weeks as epidemic for each influenza season: 50−10 (2008-2009 season), 38−50 (2009-2010 season), weeks 50−9 (2010-2011 season) and weeks 3 to 12 for the current 2011-2012 season. Conclusions Real medical data was used to assess the validity of the approach, as well as to construct a realistic statistical model of weekly influenza incidence rates in non-epidemic periods. For the tested data, the results confirmed the ability of the algorithm to detect the start and the end of epidemic periods. 
In general, the proposed test could be applied to other data sets to quickly detect influenza outbreaks. The sequential structure of the test makes it suitable for implementation on many platforms at low computational cost, without requiring large data sets to be stored. PMID:23031321
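The detection step can be sketched as follows: fit the exponential decay factor lambda by maximum likelihood on non-epidemic training rates, then compare a window of weekly incidence rates to the fitted model with a one-sample Kolmogorov-Smirnov distance. The numbers below are invented for illustration and are not the Diagnosticat data; the paper's sequential parameter updating and significance thresholding are omitted.

```python
import math

def exp_mle_rate(samples):
    """Maximum-likelihood decay factor for an exponential model: 1 / mean."""
    return len(samples) / sum(samples)

def ks_statistic_exponential(samples, lam):
    """One-sample Kolmogorov-Smirnov distance between the empirical CDF
    and the fitted exponential CDF F(x) = 1 - exp(-lam * x)."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = 1.0 - math.exp(-lam * x)
        # check both sides of the empirical-CDF step at x
        d = max(d, abs(f - i / n), abs((i + 1) / n - f))
    return d

# Hypothetical non-epidemic weekly incidence rates (training sequence)
train = [0.2, 0.1, 0.3, 0.25, 0.15, 0.4, 0.05, 0.35, 0.2, 0.1]
lam = exp_mle_rate(train)
d_low = ks_statistic_exponential(train, lam)            # consistent with model
d_high = ks_statistic_exponential([5.0, 6.0, 4.5, 7.0, 5.5], lam)  # epidemic burst
```

Epidemic weeks sit far in the tail of the fitted exponential, so the empirical CDF separates sharply from the model CDF and the KS distance jumps, which is the indicator of epidemic activity the paper thresholds.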
Chromosomal and carcinogenic effects of sequential HZE and low-LET irradiations
NASA Astrophysics Data System (ADS)
Simonson, Dustin Mark
All persons are exposed to a natural background of ionizing radiations with different spatial patterns of energy deposition resulting in differential biologic response. Astronauts, aircrew and radioactive contamination clean-up personnel are exposed to particularly complex radiation spectra. The current method for calculating radiation-induced exposure limits in mixed radiation environments is based on the linear summation of non-threshold risks, a methodology grounded in the premise that each component of the radiation field acts independently of the presence of other components. The assumption of effect independence of in-vitro exposed samples was tested by evaluating the frequency of chromosome aberrations induced by sequential irradiation of immortalized human mammary epithelial cells with 1 GeV/nucleon 56Fe ions and 137Cs gamma-rays. Experimental response was found to be significantly less than calculated on the basis of effect independence, but only when 56Fe ions preceded the photon exposure. That there was order dependence is interpreted as evidence that response may not simply be a result of interactions between similar sublesions but rather may involve qualitatively different time-ordered parameters. The presence of this sub-additive response is phenomenologically similar to adaptive response, which had not been previously reported as a consequence to high-energy heavy ion irradiation. Calculations based on effect independence predict a significantly greater average number and lifetime cumulative incidence of breast cancers in female Sprague-Dawley rats irradiated with both 56Fe ions and 250 MeV protons than was experimentally observed. This finding supports the hypothesis that the presence of non-additive response is not exclusively an in vitro phenomenon. 
Results from an evaluation of mammary epithelial cell response induced in a rat cancer model are marginally consistent with the use of in vivo induced chromosome aberrations as a biomarker of breast cancer risk. There is also an apparent association between the two endpoints relating to dependence on radiation quality. In conclusion, these data demonstrate that sequential exposures to both HZE and low-LET radiation may result in chromosomal and carcinogenic response that is inconsistent with effect independence. These results provide evidence that the linear summation of health risks from mixed HZE and low-LET radiation fields may not accurately reflect true risk.
NASA Astrophysics Data System (ADS)
Murakami, H.; Chen, X.; Hahn, M. S.; Over, M. W.; Rockhold, M. L.; Vermeul, V.; Hammond, G. E.; Zachara, J. M.; Rubin, Y.
2010-12-01
Subsurface characterization for predicting groundwater flow and contaminant transport requires us to integrate large and diverse datasets in a consistent manner, and quantify the associated uncertainty. In this study, we sequentially assimilated multiple types of datasets for characterizing a three-dimensional heterogeneous hydraulic conductivity field at the Hanford 300 Area. The datasets included constant-rate injection tests, electromagnetic borehole flowmeter tests, lithology profile and tracer tests. We used the method of anchored distributions (MAD), which is a modular-structured Bayesian geostatistical inversion method. MAD has two major advantages over the other inversion methods. First, it can directly infer a joint distribution of parameters, which can be used as an input in stochastic simulations for prediction. In MAD, in addition to typical geostatistical structural parameters, the parameter vector includes multiple point values of the heterogeneous field, called anchors, which capture local trends and reduce uncertainty in the prediction. Second, MAD allows us to integrate the datasets sequentially in a Bayesian framework such that it updates the posterior distribution, as a new dataset is included. The sequential assimilation can decrease computational burden significantly. We applied MAD to assimilate different combinations of the datasets, and then compared the inversion results. For the injection and tracer test assimilation, we calculated temporal moments of pressure build-up and breakthrough curves, respectively, to reduce the data dimension. A massive parallel flow and transport code PFLOTRAN is used for simulating the tracer test. For comparison, we used different metrics based on the breakthrough curves not used in the inversion, such as mean arrival time, peak concentration and early arrival time. This comparison intends to yield the combined data worth, i.e. 
which combination of the datasets is the most effective for a certain metric, which will be useful for guiding the further characterization effort at the site and also the future characterization projects at the other sites.
NASA Astrophysics Data System (ADS)
Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.
2011-01-01
Annual maximum (AM) time series are incomplete (i.e., censored) when no events are included above the assumed censoring threshold (i.e., magnitude of completeness). We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
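For the uncensored case, the PPCC statistic is simply the Pearson correlation between the ordered sample and model quantiles evaluated at plotting positions; censoring changes which order statistics enter, but not the idea. A sketch of the uncensored Gumbel version, using the Gringorten plotting position (an assumption for illustration; the paper's censored construction and Monte Carlo critical values differ):

```python
import math
import random

def gumbel_ppcc(samples):
    """Probability plot correlation coefficient for the Gumbel model:
    correlation of the ordered sample with standard Gumbel quantiles
    G^{-1}(p) = -ln(-ln p) at positions p_i = (i - 0.44) / (n + 0.12)."""
    xs = sorted(samples)
    n = len(xs)
    qs = [-math.log(-math.log((i - 0.44) / (n + 0.12))) for i in range(1, n + 1)]
    mx, mq = sum(xs) / n, sum(qs) / n
    sxy = sum((x - mx) * (q - mq) for x, q in zip(xs, qs))
    sxx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sqq = math.sqrt(sum((q - mq) ** 2 for q in qs))
    return sxy / (sxx * sqq)

# Synthetic Gumbel data via inverse-CDF sampling: x = -ln(-ln u), u ~ U(0, 1)
rng = random.Random(7)
gumbel_sample = [-math.log(-math.log(rng.random())) for _ in range(500)]
r = gumbel_ppcc(gumbel_sample)   # close to 1 when the model fits
```

Values of r near 1 are consistent with the Gumbel model; the test rejects when r falls below a critical value that, as the abstract notes, depends on sample size, censoring level, and significance level.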
The Use of Test Results from ASA Workshops to Evaluate Workshop Effectiveness
ERIC Educational Resources Information Center
Donegan, Judith H.; And Others
1976-01-01
Results of test given to participants in six American Society of Anesthesiologists workshops were analyzed to determine whether attendance increased scores on sequential tests (before, immediately after, and three months later). Both workshop and control groups of anesthesiologists increased their scores with each successive test. (Editor/JT)
16 CFR 1500.41 - Method of testing primary irritant substances.
Code of Federal Regulations, 2014 CFR
2014-01-01
... corrosivity properties of substances, including testing that does not require animals, are presented in the CPSC's animal testing policy set forth in 16 CFR 1500.232. A weight-of-evidence analysis or a validated... conducted, a sequential testing strategy is recommended to reduce the number of test animals. The method of...
Robustness of Ability Estimation to Multidimensionality in CAST with Implications to Test Assembly
ERIC Educational Resources Information Center
Zhang, Yanwei; Nandakumar, Ratna
2006-01-01
Computer Adaptive Sequential Testing (CAST) is a test delivery model that combines features of the traditional conventional paper-and-pencil testing and item-based computerized adaptive testing (CAT). The basic structure of CAST is a panel composed of multiple testlets adaptively administered to examinees at different stages. Current applications…
Biostatistics Series Module 2: Overview of Hypothesis Testing.
Biostatistics Series Module 2: Overview of Hypothesis Testing
Hazra, Avijit; Gogtay, Nithya
2016-01-01
Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. 
While this may be of utility in highlighting different aspects of the problem, merely reapplying different tests to the same issue in the hope of finding a P < 0.05 is a wrong use of statistics. Finally, it is becoming the norm that an estimate of the size of any effect, expressed with its 95% confidence interval, is required for meaningful interpretation of results. A large study is likely to have a small (and therefore “statistically significant”) P value, but a “real” estimate of the effect would be provided by the 95% confidence interval. If the intervals overlap between two interventions, then the difference between them is not so clear-cut even if P < 0.05. The two approaches are now considered complementary to one another. PMID:27057011
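The definition of the P value given above — the probability of obtaining by chance a result at least as extreme as that observed when the null hypothesis is true — can be made concrete with a permutation test, which estimates that probability directly by reshuffling group labels. The sketch below uses invented outcome scores for two hypothetical groups; it illustrates the concept, not any analysis from the module.

```python
import random

def permutation_p_value(a, b, n_perm=10000, seed=0):
    """Two-sided permutation test for a difference in group means.

    The p-value is the fraction of random label shufflings whose mean
    difference is at least as extreme as the observed one -- i.e. the
    probability of obtaining the observed result by chance when the
    null hypothesis of no group difference is true.
    """
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        mean_a = sum(pooled[:len(a)]) / len(a)
        mean_b = sum(pooled[len(a):]) / len(b)
        if abs(mean_a - mean_b) >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical outcome scores for two groups of six subjects each
treated = [5.2, 4.8, 6.1, 5.9, 6.3, 5.5]
control = [4.1, 3.9, 4.5, 4.2, 4.8, 4.0]
print(permutation_p_value(treated, control) < 0.05)  # True: reject the null at the 5% level
```

With nearly non-overlapping groups, very few shufflings reproduce a difference this large, so P falls well below 0.05.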
Saraf, Sanatan; Mathew, Thomas; Roy, Anindya
2015-01-01
For the statistical validation of surrogate endpoints, an alternative formulation is proposed for testing Prentice's fourth criterion, under a bivariate normal model. In such a setup, the criterion involves inference concerning an appropriate regression parameter, and the criterion holds if the regression parameter is zero. Testing such a null hypothesis has been criticized in the literature since it can only be used to reject a poor surrogate, and not to validate a good surrogate. In order to circumvent this, an equivalence hypothesis is formulated for the regression parameter, namely the hypothesis that the parameter is equivalent to zero. Such an equivalence hypothesis is formulated as an alternative hypothesis, so that the surrogate endpoint is statistically validated when the null hypothesis is rejected. Confidence intervals for the regression parameter and tests for the equivalence hypothesis are proposed using bootstrap methods and small sample asymptotics, and their performances are numerically evaluated and recommendations are made. The choice of the equivalence margin is a regulatory issue that needs to be addressed. The proposed equivalence testing formulation is also adopted for other parameters that have been proposed in the literature on surrogate endpoint validation, namely, the relative effect and proportion explained.
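The equivalence formulation described here can be illustrated with a small bootstrap sketch: rather than testing H0: beta = 0, the slope is declared "equivalent to zero" only when a bootstrap confidence interval lies entirely inside a pre-specified margin, so rejection validates the surrogate. The function names, margin, and toy data below are illustrative assumptions, not the authors' procedure (which also covers small-sample asymptotics).

```python
import random

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def equivalent_to_zero(x, y, margin, n_boot=2000, alpha=0.05, seed=1):
    """Declare the slope equivalent to zero when a (1 - 2*alpha)
    bootstrap percentile interval lies entirely inside [-margin, margin]."""
    rng = random.Random(seed)
    n = len(x)
    boots = []
    while len(boots) < n_boot:
        idx = [rng.randrange(n) for _ in range(n)]
        bx = [x[i] for i in idx]
        if len(set(bx)) < 2:  # skip degenerate resamples with no x-spread
            continue
        boots.append(slope(bx, [y[i] for i in idx]))
    boots.sort()
    lower = boots[int(alpha * n_boot)]
    upper = boots[int((1 - alpha) * n_boot) - 1]
    return -margin < lower and upper < margin

# Toy data: essentially no association between x and y
x = list(range(20))
y = [0.01 * ((3 * i) % 5 - 2) for i in x]
print(equivalent_to_zero(x, y, margin=0.1))  # True: interval inside the margin
```

As the abstract notes, choosing the equivalence margin is a regulatory decision; the 0.1 used here is arbitrary.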
Test of association: which one is the most appropriate for my study?
Gonzalez-Chica, David Alejandro; Bastos, João Luiz; Duquia, Rodrigo Pereira; Bonamigo, Renan Rangel; Martínez-Mesa, Jeovany
2015-01-01
Hypothesis tests are statistical tools widely used for assessing whether or not there is an association between two or more variables. These tests provide a probability of the type 1 error (p-value), which is used to accept or reject the null study hypothesis. Our objective is to provide a practical guide that helps researchers carefully select the most appropriate procedure to answer their research question. We discuss the logic of hypothesis testing and present the prerequisites of each procedure based on practical examples.
Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.
Chalmers, R Philip
2018-06-01
This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.
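The chi-squared machinery underlying such a hypothesis testing approach can be shown in miniature. The snippet below computes the Pearson chi-squared statistic for a 2x2 table (e.g., correct/incorrect item responses in reference vs. focal groups) and compares it with the 5% critical value at one degree of freedom; it is a generic illustration of chi-squared testing, not the modified CSIBTEST statistic itself.

```python
def chi2_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table,
    e.g. correct/incorrect item responses in reference vs. focal groups."""
    (a, b), (c, d) = table
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

CRITICAL_5PCT_DF1 = 3.841  # chi-squared critical value, 1 degree of freedom

stat = chi2_2x2([[30, 10], [20, 20]])  # hypothetical group-by-response counts
print(stat > CRITICAL_5PCT_DF1)  # True: the response rates differ at the 5% level
```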
A 37-mm Ceramic Gun Nozzle Stress Analysis
2006-05-01
Contents (extracted): List of Figures; List of Tables; 1. Introduction; 2. Ceramic Nozzle Structure and Materials; 3. Sequentially-Coupled and Fully-Coupled Thermal Stress FEM Analysis; 4. Ceramic Nozzle Thermal Stress Response; 5. Ceramic Nozzle Dynamic FEM; 6. Ceramic Nozzle Dynamic Responses and Discussions. The candidate ceramics and the test fixture model components are listed in table 1.
The effect of social support on quality of life for tinnitus sufferers.
Murphy, Colleen Eliza
2012-01-01
To examine the relationship between tinnitus severity, social support and three quality of life measures. Research into other conditions shows that social support helps achieve positive outcomes and improved quality of life. For tinnitus, research suggests social support does not impact on quality of life outcomes. However, research has been limited and the measures used have mixed tinnitus severity, tinnitus handicap and social support into one measure. The aim of this research was to examine the relationship using separate measures. One hundred fifty-four tinnitus sufferers (63.7% males, 36.3% females, Age M = 46.4, SD = 14.97) completed the assessment battery. Three sequential multiple regression analyses were conducted to test the hypothesis that social support moderates the effects of tinnitus severity on each of the dependent variables: tinnitus handicap, depression and general well-being. The severity of one's tinnitus significantly predicted tinnitus handicap, depression and general well-being, but social support did not moderate the relationship. Social support did have a direct relationship on level of depression and general well-being. Tinnitus handicaps appear to be unique but tinnitus sufferers do gain significant benefits from social support.
Neural correlates of foreign-language learning in childhood: a 3-year longitudinal ERP study.
Ojima, Shiro; Nakamura, Naoko; Matsuba-Kurita, Hiroko; Hoshino, Takahiro; Hagiwara, Hiroko
2011-01-01
A foreign language (a language not spoken in one's community) is difficult to master completely. Early introduction of foreign-language (FL) education during childhood is becoming a standard in many countries. However, the neural process of child FL learning still remains largely unknown. We longitudinally followed 322 school-age children with diverse FL proficiency for three consecutive years, and acquired children's ERP responses to FL words that were semantically congruous or incongruous with the preceding picture context. As FL proficiency increased, various ERP components previously reported in mother-tongue (L1) acquisition (such as a broad negativity, an N400, and a late positive component) appeared sequentially, critically in an identical order to L1 acquisition. This finding was supported not only by cross-sectional analyses of children at different proficiency levels but also by longitudinal analyses of the same children over time. Our data are consistent with the hypothesis that FL learning in childhood reproduces identical developmental stages in an identical order to L1 acquisition, suggesting that the nature of the child's brain itself may determine the normal course of FL learning. Future research should test the generalizability of the results in other aspects of language such as syntax.
Learning from graphically integrated 2D and 3D representations improves retention of neuroanatomy
NASA Astrophysics Data System (ADS)
Naaz, Farah
Visualizations in the form of computer-based learning environments are highly encouraged in science education, especially for teaching spatial material. Some spatial material, such as sectional neuroanatomy, is very challenging to learn. It involves learning the two dimensional (2D) representations that are sampled from the three dimensional (3D) object. In this study, a computer-based learning environment was used to explore the hypothesis that learning sectional neuroanatomy from a graphically integrated 2D and 3D representation will lead to better learning outcomes than learning from a sequential presentation. The integrated representation explicitly demonstrates the 2D-3D transformation and should lead to effective learning. This study was conducted using a computer graphical model of the human brain. There were two learning groups:
Ultrasensitive surveillance of sensors and processes
Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.
2001-01-01
A method and apparatus for monitoring a source of data to determine the operating state of a working system. The method includes determining the sensor (or data source) arrangement associated with monitoring the system; activating a first method that performs a sequential probability ratio test if the arrangement includes a single data (sensor) source; activating a second method that performs a regression sequential probability ratio test if the arrangement includes a pair of sensors (data sources) with signals that are linearly or non-linearly related; and activating a third method that performs a bounded angle ratio test if the arrangement includes multiple sensors. At least one of the first, second, and third methods is used to accumulate sensor signals and determine the operating state of the system.
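The single-sensor case above relies on the sequential probability ratio test, which can be sketched in a few lines. For a Bernoulli fault indicator, Wald's SPRT accumulates a log-likelihood ratio and stops at boundaries set by the desired error rates; the parameters below are illustrative, and the patented procedure operates on continuous sensor residuals rather than this simplified binary stream.

```python
import math

def sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald sequential probability ratio test for a Bernoulli rate.

    Accumulates the log-likelihood ratio of H1 (rate p1) against
    H0 (rate p0) and stops at Wald's approximate boundaries.
    Returns ('H0' | 'H1' | 'continue', observations consumed).
    """
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for n, x in enumerate(observations, 1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr <= lower:
            return 'H0', n
        if llr >= upper:
            return 'H1', n
    return 'continue', len(observations)

# A run of fault indications pushes the test to H1 after only two samples
print(sprt([1, 1, 1, 1, 1], p0=0.1, p1=0.5))  # ('H1', 2)
```

The appeal of the sequential form is visible here: a decision is typically reached long before a fixed-sample test of the same error rates would stop.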
Ultrasensitive surveillance of sensors and processes
Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.
1999-01-01
A method and apparatus for monitoring a source of data to determine the operating state of a working system. The method includes determining the sensor (or data source) arrangement associated with monitoring the system; activating a first method that performs a sequential probability ratio test if the arrangement includes a single data (sensor) source; activating a second method that performs a regression sequential probability ratio test if the arrangement includes a pair of sensors (data sources) with signals that are linearly or non-linearly related; and activating a third method that performs a bounded angle ratio test if the arrangement includes multiple sensors. At least one of the first, second, and third methods is used to accumulate sensor signals and determine the operating state of the system.
Cochran Q test with Turbo BASIC.
Seuc, A H
1995-01-01
A microcomputer program written in Turbo BASIC for the sequential application of the Cochran Q test is given. A clinical application where the test is used in order to explore the structure of the agreement between observers is also presented. A program listing is available on request.
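Although the original program was written in Turbo BASIC, the Cochran Q statistic itself takes only a few lines in any language. The sketch below computes Q for k related binary samples (e.g., several observers rating the same subjects); under the null hypothesis of equal marginal proportions, Q is approximately chi-squared with k − 1 degrees of freedom. The ratings data are invented for illustration.

```python
def cochran_q(data):
    """Cochran's Q statistic for k related binary samples.

    `data` is a list of blocks (e.g. subjects); each block is a list of
    0/1 outcomes, one per treatment or observer.  Under the null
    hypothesis of equal marginal proportions, Q is approximately
    chi-squared distributed with k - 1 degrees of freedom.
    """
    k = len(data[0])
    col = [sum(row[j] for row in data) for j in range(k)]
    row_totals = [sum(row) for row in data]
    n = sum(row_totals)
    num = (k - 1) * (k * sum(c * c for c in col) - n * n)
    den = k * n - sum(r * r for r in row_totals)
    return num / den

# Four subjects rated positive/negative by three observers
ratings = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [1, 1, 0]]
print(round(cochran_q(ratings), 3))  # 4.667
```

A convenient property visible in the formula: blocks rated uniformly (all 0s or all 1s) leave Q unchanged, so only discordant subjects carry information about observer agreement.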
16 CFR 1212.4 - Test protocol.
Code of Federal Regulations, 2010 CFR
2010-01-01
... participate. (6) Two children at a time shall participate in testing of surrogate multi-purpose lighters... at the same time. Two children at a time shall participate in testing of surrogate multi-purpose... appearance, including color. The surrogate multi-purpose lighters shall be labeled with sequential numbers...
Rispin, Amy; Farrar, David; Margosches, Elizabeth; Gupta, Kailash; Stitzel, Katherine; Carr, Gregory; Greene, Michael; Meyer, William; McCall, Deborah
2002-01-01
The authors have developed an improved version of the up-and-down procedure (UDP) as one of the replacements for the traditional acute oral toxicity test formerly used by the Organisation for Economic Co-operation and Development member nations to characterize industrial chemicals, pesticides, and their mixtures. This method improves the performance of acute testing for applications that use the median lethal dose (classic LD50) test while achieving significant reductions in animal use. It uses sequential dosing, together with sophisticated computer-assisted computational methods during the execution and calculation phases of the test. Staircase design, a form of sequential test design, can be applied to acute toxicity testing with its binary experimental endpoints (yes/no outcomes). The improved UDP provides a point estimate of the LD50 and approximate confidence intervals in addition to observed toxic signs for the substance tested. It does not provide information about the dose-response curve. Computer simulation was used to test performance of the UDP without the need for additional laboratory validation.
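The staircase logic of the up-and-down procedure can be simulated directly: each animal's dose moves down after a death and up after survival, concentrating doses near the LD50. The estimator below (geometric mean of doses after the first reversal) is a deliberately simplified stand-in for the computer-assisted maximum-likelihood estimate the improved UDP actually uses, and all parameter values are illustrative.

```python
import math
import random

def up_down_ld50(true_ld50, sigma, start_dose, step_factor, n_animals, seed=0):
    """Simulate a staircase (up-and-down) acute toxicity test.

    The dose for each animal moves down after a death and up after
    survival, so the dose sequence concentrates near the LD50.  The
    estimate returned is the geometric mean of the doses tested after
    the first reversal -- a simplified stand-in for the maximum-
    likelihood estimate used by the improved UDP.
    """
    rng = random.Random(seed)
    dose = start_dose
    doses, deaths = [], []
    for _ in range(n_animals):
        # log-normal tolerance model: P(death) rises with log-dose
        z = (math.log(dose) - math.log(true_ld50)) / (sigma * math.sqrt(2))
        p_death = 0.5 * (1.0 + math.erf(z))
        died = rng.random() < p_death
        doses.append(dose)
        deaths.append(died)
        dose = dose / step_factor if died else dose * step_factor
    first_reversal = next(
        (i for i in range(1, n_animals) if deaths[i] != deaths[i - 1]), 0)
    tail = doses[first_reversal:]
    return math.exp(sum(math.log(d) for d in tail) / len(tail))

estimate = up_down_ld50(true_ld50=100.0, sigma=0.5,
                        start_dose=50.0, step_factor=2.0, n_animals=15)
```

Running many such simulations against a known tolerance curve is essentially how the abstract's computer-simulation performance testing works: the estimator is judged by how tightly its estimates cluster around the true LD50.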
Sequential changes from minimal pancreatic inflammation to advanced alcoholic pancreatitis.
Noronha, M; Dreiling, D A; Bordalo, O
1983-11-01
A correlation of several clinical parameters with the morphological alterations of pancreatitis observed in chronic alcoholics with and without pancreatitis is presented. Three groups of patients were studied: asymptomatic chronic alcoholics (24); non-alcoholic controls (10); and cases with advanced chronic pancreatitis (6). Clinical, biochemical and functional studies were performed. Morphological studies were made on surgical biopsy specimens in light and electron microscopy. The results of this study showed: 1) fat accumulates within pancreatic acinar cells in alcoholics drinking more than 80 g of ethanol per day; 2) ultrastructural changes found in acinar cells of the alcoholics are similar to those described for liver cells; 3) the alterations found in alcoholics without pancreatitis are also observed in those with advanced chronic pancreatitis. An attempt to correlate the sequential changes in the histopathology of alcoholic pancreatic disease with the clinical picture and secretory patterns was made. According to these observations, admitting the ultrastructural similarities between the liver and the pancreas and the recently demonstrated abnormalities of lipid metabolism in pancreatic cells in experimental animal research, the authors postulate a toxic-metabolic mechanism as a likely hypothesis for the pathogenesis of chronic alcoholic inflammation of the pancreas.
Crepidula Slipper Limpets Alter Sex Change in Response to Physical Contact with Conspecifics.
Carrillo-Baltodano, Allan; Collin, Rachel
2015-12-01
Chemical signaling, especially signaling with waterborne cues, is an important mode of communication between conspecifics of aquatic organisms. Although conspecific associations play an important role in sex allocation of sequential hermaphroditic slipper limpets, the mode of signaling is unknown. We tested the hypothesis that the effects of conspecifics on animal size and time of sex change in the tropical slipper limpet Crepidula cf. marginalis are mediated by waterborne cues. In our experiment, pairs of snails (one small and one large) were kept in cups, either together or partitioned off with fine or coarse mesh, or partitioned, but switched from side to side to allow contact with the cup mate's pedal mucus. The larger snails that were allowed contact with the smaller companions grew faster, and generally changed sex sooner, than did the larger snails in the barrier treatments, which allowed no physical contact. The smaller snails that were allowed contact with the larger cup mate delayed sex change compared to those separated from their cup mates. We were, therefore, able to reject the hypothesis that waterborne cues mediate communication between these snails. Our results suggest that the cue that affects size and time to sex change requires some kind of physical interaction that is lost when the snails are separated. Furthermore, contact with another snail's pedal mucus does not compensate for the loss of physical contact. Since males often attach to the shell of larger females, direct contact may mediate this kind of physical interaction via positional information, physical stimulation, or contact-based chemical communication. Whatever the cue, contact with conspecifics influences both partners, resulting in, surprisingly, a higher growth rate in the larger animal and delayed sex change in the smaller animal. © 2015 Marine Biological Laboratory.
Bharadwaj, Manushree; Pope, Carey; Davis, Michael; Katz, Stuart; Cook, Christian; Maxwell, Lara
2017-08-01
Heart rate recovery (HRR) describes the rapid deceleration of heart rate after strenuous exercise and is an indicator of parasympathetic tone. A reduction in parasympathetic tone occurs in patients with congestive heart failure, resulting in prolonged HRR. Acetylcholinesterase inhibitors, such as pyridostigmine, can enhance parasympathetic tone by increasing cholinergic input to the heart. The objective of this study was to develop a rodent model of HRR to test the hypothesis that subacute pyridostigmine administration decreases cholinesterase activity and accelerates HRR in rats. Ten days after implantation of radiotelemetry transmitters, male Sprague Dawley rats were randomized to control (CTL) or treated (PYR; 0.14 mg/mL pyridostigmine in the drinking water, 29 days) groups. Rats were exercised on a treadmill to record HRR, and blood samples were collected on days 0, 7, 14, and 28 of pyridostigmine administration. Total cholinesterase and acetylcholinesterase (AChE) activity in plasma was decreased by 32%-43% and 57%-80%, respectively, in PYR rats on days 7-28, while plasma butyrylcholinesterase activity did not significantly change. AChE activity in red blood cells was markedly reduced by 64%-66%. HRR recorded 1 minute after exercise was higher in the PYR group on days 7, 14 and 28, and on day 7 when HRR was estimated at 3 and 5 minutes. Autonomic tone was evaluated pharmacologically using sequential administration of muscarinic (atropine) and adrenergic (propranolol) blockers. Parasympathetic tone was increased in PYR rats as compared with the CTL group. These data support the study hypothesis that subacute pyridostigmine administration enhances HRR by increasing cardiac parasympathetic tone. © 2017 John Wiley & Sons Australia, Ltd.
Segers, L S; Nuding, S C; Ott, M M; Dean, J B; Bolser, D C; O'Connor, R; Morris, K F; Lindsey, B G
2015-01-01
Models of brain stem ventral respiratory column (VRC) circuits typically emphasize populations of neurons, each active during a particular phase of the respiratory cycle. We have proposed that "tonic" pericolumnar expiratory (t-E) neurons tune breathing during baroreceptor-evoked reductions and central chemoreceptor-evoked enhancements of inspiratory (I) drive. The aims of this study were to further characterize the coordinated activity of t-E neurons and test the hypothesis that peripheral chemoreceptors also modulate drive via inhibition of t-E neurons and disinhibition of their inspiratory neuron targets. Spike trains of 828 VRC neurons were acquired by multielectrode arrays along with phrenic nerve signals from 22 decerebrate, vagotomized, neuromuscularly blocked, artificially ventilated adult cats. Forty-eight of 191 t-E neurons fired synchronously with another t-E neuron as indicated by cross-correlogram central peaks; 32 of the 39 synchronous pairs were elements of groups with mutual pairwise correlations. Gravitational clustering identified fluctuations in t-E neuron synchrony. A network model supported the prediction that inhibitory populations with spike synchrony reduce target neuron firing probabilities, resulting in offset or central correlogram troughs. In five animals, stimulation of carotid chemoreceptors evoked changes in the firing rates of 179 of 240 neurons. Thirty-two neuron pairs had correlogram troughs consistent with convergent and divergent t-E inhibition of I cells and disinhibitory enhancement of drive. Four of 10 t-E neurons that responded to sequential stimulation of peripheral and central chemoreceptors triggered 25 cross-correlograms with offset features. The results support the hypothesis that multiple afferent systems dynamically tune inspiratory drive in part via coordinated t-E neurons. Copyright © 2015 the American Physiological Society.
Laine, L; Katz, P O; Johnson, D A; Ibegbu, I; Goldstein, M J; Chou, C; Rossiter, G; Lu, Y
2011-01-01
Current PPIs may not achieve desired outcomes in some GERD patients due to limited duration of acid inhibition. To evaluate a novel rabeprazole extended release (ER), which provides longer duration of drug exposure and acid suppression, in healing and symptomatic resolution of moderate-severe erosive oesophagitis. Patients with LA grade C or D oesophagitis were randomised to rabeprazole-ER 50 mg or esomeprazole 40 mg once daily in two identical 8-week double-blind trials (N = 2130). Two primary endpoints were tested sequentially: (1) healing by 8 weeks [hypothesis: rabeprazole-ER non-inferior to esomeprazole (non-inferiority margin = 8%)], (2) healing by 4 weeks [hypothesis: rabeprazole-ER superior to esomeprazole (P < 0.05)]. The secondary endpoint was sustained heartburn resolution at 4 weeks. Rabeprazole-ER was non-inferior to esomeprazole in week-8 healing (80.0% vs. 75.0%; 77.5% vs. 78.4%). Week-4 healing (54.8% vs. 50.3%; 50.9% vs. 50.7%) and sustained heartburn resolution (48.3% vs. 48.2%; 53.2% vs. 52.5%) were not significantly different. Post hoc combined results for grade D revealed rabeprazole-ER vs. esomeprazole differences in week-8 healing = 10.4% (95% CI: -1.4%, 22.2%) and week-4 healing = 12.0% (P = 0.048). Rabeprazole-ER is as effective as esomeprazole in healing moderate-severe oesophagitis and achieves similar rates of heartburn resolution. Subgroup analysis suggests the possibility of benefit in severe oesophagitis, but this requires further evaluation (ClinicalTrials.gov: NCT00658528 and NCT00658775). © 2010 Blackwell Publishing Ltd.
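The trial's first endpoint turns on a non-inferiority comparison, which reduces to a one-sided confidence-bound check on the difference in healing proportions: non-inferiority within the 8% margin is concluded when the lower bound for (test minus reference) stays above −0.08. The sketch below uses hypothetical round-number denominators echoing the 80.0% vs. 75.0% week-8 rates, not the trial's actual counts.

```python
import math

def non_inferior(events_test, n_test, events_ref, n_ref, margin, z=1.96):
    """One-sided non-inferiority check for a difference in proportions.

    Concludes non-inferiority when the lower confidence bound for
    (test rate - reference rate) exceeds -margin.
    """
    p1, p2 = events_test / n_test, events_ref / n_ref
    se = math.sqrt(p1 * (1 - p1) / n_test + p2 * (1 - p2) / n_ref)
    lower_bound = (p1 - p2) - z * se
    return lower_bound > -margin

# Hypothetical counts echoing the 80.0% vs. 75.0% healing rates above
print(non_inferior(800, 1000, 750, 1000, margin=0.08))  # True
```

Testing the two endpoints sequentially, as the trials did, preserves the overall type I error: superiority at week 4 is only tested once non-inferiority at week 8 has been established.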
Norwood, Braxton A.; Bumanglag, Argyle V.; Osculati, Francesco; Sbarbati, Andrea; Marzola, Pasquina; Nicolato, Elena; Fabene, Paolo F.; Sloviter, Robert S.
2010-01-01
In refractory temporal lobe epilepsy, seizures often arise from a shrunken hippocampus exhibiting a pattern of selective neuron loss called “classic hippocampal sclerosis.” No single experimental injury has reproduced this specific pathology, suggesting that hippocampal atrophy might be a progressive “endstage” pathology resulting from years of spontaneous seizures. We posed the alternate hypothesis that classic hippocampal sclerosis results from a single excitatory event that has never been successfully modeled experimentally because convulsive status epilepticus, the insult most commonly used to produce epileptogenic brain injury, is too severe and necessarily terminated before the hippocampus receives the needed duration of excitation. We tested this hypothesis by producing prolonged hippocampal excitation in awake rats without causing convulsive status epilepticus. Two daily 30-minute episodes of perforant pathway stimulation in Sprague-Dawley rats increased granule cell paired-pulse inhibition, decreased epileptiform afterdischarge durations during 8 hours of subsequent stimulation, and prevented convulsive status epilepticus. Similarly, one 8-hour episode of reduced-intensity stimulation in Long-Evans rats, which are relatively resistant to developing status epilepticus, produced hippocampal discharges without causing status epilepticus. Both paradigms immediately produced the extensive neuronal injury that defines classic hippocampal sclerosis, without giving any clinical indication during the insult that an injury was being inflicted. Spontaneous hippocampal-onset seizures began 16–25 days post-injury, before hippocampal atrophy developed, as demonstrated by sequential magnetic resonance imaging. These results indicate that classic hippocampal sclerosis is uniquely produced by a single episode of clinically “cryptic” excitation. Epileptogenic insults may often involve prolonged excitation that goes undetected at the time of injury. PMID:20575073
Eldhuset, Toril D; Lange, Holger; de Wit, Helene A
2006-10-01
Toxic effects of aluminium (Al) on Picea abies (L.) Karst. (Norway spruce) trees are well documented in laboratory-scale experiments, but field-based evidence is scarce. This paper presents results on fine root growth and chemistry from a field manipulation experiment in a P. abies stand that was 45 years old when the experiment started in 1996. Different amounts of dissolved aluminium were added as AlCl3 by means of periodic irrigation during the growing season in the period 1997-2002. Potentially toxic concentrations of Al in the soil solution were obtained. Fine roots were studied from direct cores (1996) and sequential root ingrowth cores (1999, 2001, 2002) in the mineral soil (0-40 cm). We tested two hypotheses: (1) elevated concentration of Al in the root zone leads to significant changes in root biomass, partitioning into fine, coarse, living or dead fractions, and distribution with depth; (2) elevated Al concentration leads to a noticeable uptake of Al and reduced uptake of Ca and Mg; this results in Ca and Mg depletion in roots. Hypothesis 1 was only marginally supported, as just a few significant treatment effects on biomass were found. Hypothesis 2 was supported in part; Al addition led to increased root concentrations of Al in 1999 and 2002 and reduced Mg/Al in 1999. Comparison of roots from subsequent root samplings showed a decrease in Al and S over time. The results illustrated that 7 years of elevated Al(tot) concentrations in the soil solution up to 200 microM are not likely to affect root growth. We also discuss possible improvements of the experimental approach.
Beck, Roswitha; Günther, Lisa; Xiong, Guoming; Potschka, Heidrun; Böning, Guido; Bartenstein, Peter; Brandt, Thomas; Jahn, Klaus; Dieterich, Marianne; Strupp, Michael; la Fougère, Christian; Zwergal, Andreas
2014-11-01
Early symptomatic treatment of acute unilateral vestibulopathy is thought to impede the course of ensuing central vestibular compensation (VC). Despite the great clinical importance of this hypothesis there is no experimental evidence of its validity. The present study addressed this question by investigating the direct effect of 4-aminopyridine (4-AP) on ocular motor and postural symptoms in acute unilateral vestibulopathy as well as its long-term consequences for VC in a rat model of chemical unilateral labyrinthectomy (UL). After UL, one group of Sprague-Dawley rats was treated with 4-AP p.o. (1 mg/kg/day), another with 0.9% NaCl solution p.o. for 3 days. Behavioural testing for symptoms of vestibular tone imbalance was done 1 day before and 1, 2, 3, 5, 7, 9, 15, 21, and 30 days after UL. In addition, sequential whole-brain [(18)F]-FDG-μPET was performed before and 1, 3, 7, 15, and 30 days after UL to examine and visualize 4-AP-induced modulation of VC. Administration of 4-AP on days 1-3 significantly improved postural imbalance 2 h after administration compared to that in controls. This effect was only transient. Remarkably, the 4-AP group had a prolonged and impaired course of postural compensation compared to that of controls. The μPET revealed a significant increase of regional cerebral glucose metabolism (rCGM) in the vestibulocerebellum 2 h after administration of 4-AP. However, the 4-AP group exhibited a persistent asymmetry of rCGM after day 3 in the vestibular nuclei and posterolateral thalami. In conclusion, this study confirms the hypothesis that early pharmacological abatement of vestibular symptoms impedes VC. Copyright © 2014 Elsevier Inc. All rights reserved.
A Novel Ship-Tracking Method for GF-4 Satellite Sequential Images.
Yao, Libo; Liu, Yong; He, You
2018-06-22
The geostationary remote sensing satellite has the capability of wide scanning, persistent observation and operational response, and has tremendous potential for maritime target surveillance. The GF-4 satellite is China's first medium-resolution optical remote sensing satellite in geostationary orbit (GEO). In this paper, a novel ship-tracking method for GF-4 satellite sequential imagery is proposed. The algorithm has three stages. First, a local visual saliency map based on local peak signal-to-noise ratio (PSNR) is used to detect ships in a single frame of GF-4 satellite sequential images. Second, the accurate positioning of each potential target is achieved by a dynamic correction using the rational polynomial coefficients (RPCs) and automatic identification system (AIS) data of ships. Finally, an improved multiple hypotheses tracking (MHT) algorithm with amplitude information is used to track ships while further removing false targets, and to estimate ships’ motion parameters. The algorithm has been tested using GF-4 sequential images and AIS data. The results of the experiment demonstrate that the algorithm achieves good tracking performance in GF-4 satellite sequential images and estimates the motion information of ships accurately.
Costa, Marilia G; Barbosa, José C; Yamamoto, Pedro T
2007-01-01
Sequential sampling uses samples of variable size, and has the advantage of reducing sampling time and costs compared to fixed-size sampling. To support adequate management of orthezia, sequential sampling plans were developed for orchards under low and high infestation. Data were collected in Matão, SP, in commercial stands of the orange variety 'Pêra Rio', at five, nine and 15 years of age. Twenty samplings were performed in the whole area of each stand by observing the presence or absence of scales on plants, with each plot comprising ten plants. After observing that in all three stands the scale population was distributed according to the contagious model, fitting the Negative Binomial Distribution in most samplings, two sequential sampling plans were constructed according to the Sequential Likelihood Ratio Test (SLRT). To construct these plans an economic threshold of 2% was adopted and the type I and II error probabilities were fixed at alpha = beta = 0.10. Results showed that the maximum numbers of samples expected to determine control need were 172 and 76 for stands with low and high infestation, respectively.
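For readers unfamiliar with the mechanics, the decision rule behind a sequential plan of this kind can be sketched with Wald's sequential probability ratio test on presence/absence counts. The p0/p1 values below merely bracket the 2% economic threshold and are illustrative; they are not the paper's fitted Negative Binomial parameters.

```python
import math

def sprt_decision(d, n, p0=0.01, p1=0.03, alpha=0.10, beta=0.10):
    """Wald SPRT for presence/absence sampling: d infested plots out of n.

    Returns 'accept_h0' (infestation below threshold, no control needed),
    'accept_h1' (treat), or 'continue' (keep sampling).
    """
    # Log-likelihood ratio of n Bernoulli observations with d positives.
    llr = d * math.log(p1 / p0) + (n - d) * math.log((1 - p1) / (1 - p0))
    upper = math.log((1 - beta) / alpha)   # crossing it accepts H1
    lower = math.log(beta / (1 - alpha))   # crossing it accepts H0
    if llr >= upper:
        return "accept_h1"
    if llr <= lower:
        return "accept_h0"
    return "continue"

print(sprt_decision(10, 20))   # heavy infestation: treat
print(sprt_decision(0, 120))   # clean block: no control needed
print(sprt_decision(1, 30))    # still ambiguous: keep sampling
```

The continue region between the two boundaries is what makes the sample size variable, and hence smaller on average than a fixed-size plan with the same error rates.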
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
The Relation between Parental Values and Parenting Behavior: A Test of the Kohn Hypothesis.
ERIC Educational Resources Information Center
Luster, Tom; And Others
1989-01-01
Used data on 65 mother-infant dyads to test Kohn's hypothesis concerning the relation between values and parenting behavior. Findings support Kohn's hypothesis that parents who value self-direction would emphasize supportive function of parenting and parents who value conformity would emphasize their obligations to impose restraints. (Author/NB)
Cognitive Biases in the Interpretation of Autonomic Arousal: A Test of the Construal Bias Hypothesis
ERIC Educational Resources Information Center
Ciani, Keith D.; Easter, Matthew A.; Summers, Jessica J.; Posada, Maria L.
2009-01-01
According to Bandura's construal bias hypothesis, derived from social cognitive theory, persons with the same heightened state of autonomic arousal may experience either pleasant or deleterious emotions depending on the strength of perceived self-efficacy. The current study tested this hypothesis by proposing that college students' preexisting…
Overgaard, Morten; Lindeløv, Jonas; Svejstrup, Stinna; Døssing, Marianne; Hvid, Tanja; Kauffmann, Oliver; Mouridsen, Kim
2013-01-01
This paper reports an experiment intended to test a particular hypothesis derived from blindsight research, which we name the “source misidentification hypothesis.” According to this hypothesis, a subject may be correct about a stimulus without being correct about how she had access to this knowledge (whether the stimulus was visual, auditory, or something else). We test this hypothesis in healthy subjects, asking them to report whether a masked stimulus was presented auditorily or visually, what the stimulus was, and how clearly they experienced the stimulus using the Perceptual Awareness Scale (PAS). We suggest that knowledge about perceptual modality may be a necessary precondition in order to issue correct reports of which stimulus was presented. Furthermore, we find that PAS ratings correlate with correctness, and that subjects are at chance level when reporting no conscious experience of the stimulus. To demonstrate that particular levels of reporting accuracy are obtained, we employ a statistical strategy, which operationally tests the hypothesis of non-equality, such that the usual rejection of the null-hypothesis admits the conclusion of equivalence. PMID:23508677
A large scale test of the gaming-enhancement hypothesis
Wang, John C.
2016-01-01
A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses was compared. Results provided no substantive evidence supporting the idea that having a preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support of the null hypothesis over the predicted enhancement. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work. PMID:27896035
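The kind of Bayesian model comparison described above can be sketched with a BIC-based Bayes factor approximation. This is not the paper's analysis; the function and the toy data are purely illustrative, and the BIC route is only a rough stand-in for a full Bayesian computation.

```python
import math

def bic(rss, n, k):
    # Gaussian-likelihood BIC, up to a constant shared by both models.
    return n * math.log(rss / n) + k * math.log(n)

def bf01_linear(xs, ys):
    """Approximate Bayes factor favouring H0 (no trend) over H1 (linear
    trend), via BF01 ~= exp((BIC1 - BIC0) / 2). A rough sketch only."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    rss0 = sum((y - ybar) ** 2 for y in ys)          # intercept-only model
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    rss1 = sum((y - ybar - slope * (x - xbar)) ** 2
               for x, y in zip(xs, ys))              # linear model
    return math.exp((bic(rss1, n, 2) - bic(rss0, n, 1)) / 2)

# Toy data with no real trend: BF01 > 1 means the evidence favours the null.
print(round(bf01_linear([1.0, 2.0, 3.0, 4.0], [2.0, 1.0, 2.0, 1.0]), 2))
```

Unlike a non-significant p-value, a Bayes factor above 1 is positive evidence for the null, which is exactly the distinction the study exploits.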
ERIC Educational Resources Information Center
SAW, J.G.
This paper deals with some tests of hypothesis frequently encountered in the analysis of multivariate data. The type of hypothesis considered is that which the statistician can answer in the negative or affirmative. The Doolittle method makes it possible to evaluate the determinant of a matrix of high order, to solve a matrix equation, or to…
Kruschke, John K; Liddell, Torrin M
2018-02-01
In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
Sadeque, Farig; Xu, Dongfang; Bethard, Steven
2017-01-01
The 2017 CLEF eRisk pilot task focuses on automatically detecting depression as early as possible from a users’ posts to Reddit. In this paper we present the techniques employed for the University of Arizona team’s participation in this early risk detection shared task. We leveraged external information beyond the small training set, including a preexisting depression lexicon and concepts from the Unified Medical Language System as features. For prediction, we used both sequential (recurrent neural network) and non-sequential (support vector machine) models. Our models perform decently on the test data, and the recurrent neural models perform better than the non-sequential support vector machines while using the same feature sets. PMID:29075167
Parallelization of sequential Gaussian, indicator and direct simulation algorithms
NASA Astrophysics Data System (ADS)
Nunes, Ruben; Almeida, José A.
2010-08-01
Improving the performance and robustness of algorithms on new high-performance parallel computing architectures is a key issue in efficiently performing 2D and 3D studies with large amounts of data. In geostatistics, sequential simulation algorithms are good candidates for parallelization. When compared with other computational applications in geosciences (such as fluid flow simulators), sequential simulation software is not extremely computationally intensive, but parallelization can make it more efficient and creates alternatives for its integration in inverse modelling approaches. This paper describes the implementation and benchmarking of a parallel version of the three classic sequential simulation algorithms: direct sequential simulation (DSS), sequential indicator simulation (SIS) and sequential Gaussian simulation (SGS). For this purpose, GSLIB was used as the source, but the entire code was extensively modified to take into account the parallelization approach and was also rewritten in the C programming language. The paper also explains in detail the parallelization strategy and the main modifications. Regarding the integration of secondary information, the DSS algorithm is able to perform simple kriging with local means, kriging with an external drift and collocated cokriging with both local and global correlations. SIS includes a local correction of probabilities. Finally, a brief comparison is presented of simulation results using one, two and four processors. All performance tests were carried out on 2D soil data samples. The source code is completely open source and easy to read. It should be noted that the code is only fully compatible with Microsoft Visual C and should be adapted for other systems/compilers.
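The sequential character that makes these algorithms interesting to parallelize can be seen in a toy 1-D sketch of sequential Gaussian simulation. This is a deliberate simplification, not GSLIB's sgsim: it conditions on the single nearest known point with an exponential covariance, where a real implementation uses a search neighbourhood and solves a full kriging system, and it assumes the data are already standard-normal scores with mean 0.

```python
import math
import random

def sgs_1d(grid, data, a=5.0, sill=1.0, seed=0):
    """Toy sequential Gaussian simulation on a 1-D grid.

    Simple kriging from the single nearest conditioning point, with
    covariance C(h) = sill * exp(-h / a).
    """
    rng = random.Random(seed)
    known = dict(data)                       # position -> value (honoured exactly)
    path = [x for x in grid if x not in known]
    rng.shuffle(path)                        # random visiting sequence
    for x in path:
        nearest = min(known, key=lambda p: abs(p - x))
        rho = math.exp(-abs(nearest - x) / a)       # correlation with the datum
        mean = rho * known[nearest]                 # simple-kriging mean (m = 0)
        var = sill * (1.0 - rho * rho)              # simple-kriging variance
        known[x] = rng.gauss(mean, math.sqrt(var))  # draw; becomes conditioning data
    return [known[x] for x in grid]

realization = sgs_1d(grid=list(range(10)), data={0: 1.2, 9: -0.8})
print(len(realization))   # 10
```

The loop body shows why naive parallelization is hard: each simulated node immediately joins the conditioning set for later nodes, so nodes cannot simply be distributed across processors without a strategy like the one the paper develops.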
Reese, Tiffany A.; Bi, Kevin; Kambal, Amal; Filali-Mouhim, Ali; Beura, Lalit K.; Bürger, Matheus C.; Pulendran, Bali; Sekaly, Rafick; Jameson, Stephen C.; Masopust, David; Haining, W. Nicholas; Virgin, Herbert W.
2016-01-01
Immune responses differ between laboratory mice and humans. Chronic infections with viruses and parasites are common in humans but absent in laboratory mice, and thus represent potential contributors to inter-species differences in immunity. To test this, we sequentially infected laboratory mice with herpesviruses, influenza, and an intestinal helminth, and compared their blood immune signatures to mock-infected mice before and after vaccination against Yellow Fever Virus (YFV-17D). Sequential infection altered pre- and post-vaccination gene expression, cytokines, and antibodies in blood. Sequential pathogen exposure induced gene signatures that recapitulated those seen in blood from pet store-raised versus laboratory mice, and adult versus cord blood in humans. Therefore, basal and vaccine-induced murine immune responses are altered by infection with agents common outside of barrier facilities. This raises the possibility that we can improve mouse models of vaccination and immunity by selective microbial exposure of laboratory animals to mimic that of humans. PMID:27107939
Roemhild, Roderich; Barbosa, Camilo; Beardmore, Robert E; Jansen, Gunther; Schulenburg, Hinrich
2015-01-01
Antibiotic resistance is a growing concern to public health. New treatment strategies may alleviate the situation by slowing down the evolution of resistance. Here, we evaluated sequential treatment protocols using two fully independent laboratory-controlled evolution experiments with the human pathogen Pseudomonas aeruginosa PA14 and two pairs of clinically relevant antibiotics (doripenem/ciprofloxacin and cefsulodin/gentamicin). Our results consistently show that the sequential application of two antibiotics decelerates resistance evolution relative to monotherapy. Sequential treatment enhanced population extinction although we applied antibiotics at sublethal dosage. In both experiments, we identified an order effect of the antibiotics used in the sequential protocol, leading to significant variation in the long-term efficacy of the tested protocols. These variations appear to be caused by asymmetric evolutionary constraints, whereby adaptation to one drug slowed down adaptation to the other drug, but not vice versa. An understanding of such asymmetric constraints may help future development of evolutionary robust treatments against infectious disease. PMID:26640520
Three parameters optimizing closed-loop control in sequential segmental neuromuscular stimulation.
Zonnevijlle, E D; Somia, N N; Perez Abadia, G; Stremel, R W; Maldonado, C J; Werker, P M; Kon, M; Barker, J H
1999-05-01
In conventional dynamic myoplasties, the force generation is poorly controlled. This causes unnecessary fatigue of the transposed/transplanted electrically stimulated muscles and causes damage to the involved tissues. We introduced sequential segmental neuromuscular stimulation (SSNS) to reduce muscle fatigue by allowing part of the muscle to rest periodically while the other parts work. Despite this improvement, we hypothesize that fatigue could be further reduced in some applications of dynamic myoplasty if the muscles were made to contract according to need. The first necessary step is to gain appropriate control over the contractile activity of the dynamic myoplasty. Therefore, closed-loop control was tested on a sequentially stimulated neosphincter to strive for the best possible control over the amount of generated pressure. A selection of parameters was validated for optimizing control. We concluded that the frequency of corrections, the threshold for corrections, and the transition time are meaningful parameters in the controlling algorithm of the closed-loop control in a sequentially stimulated myoplasty.
Blocking for Sequential Political Experiments
Moore, Sally A.
2013-01-01
In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects “trickle in” to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion. PMID:24143061
Crespi, Ilaria; Sulas, Maria Giovanna; Mora, Riccardo; Naldi, Paola; Vecchio, Domizia; Comi, Cristoforo; Cantello, Roberto; Bellomo, Giorgio
2017-03-01
Isoelectric focusing (IEF) to detect oligoclonal bands (OCBs) in cerebrospinal fluid (CSF) is the gold-standard approach for evaluating intrathecal immunoglobulin synthesis in multiple sclerosis (MS), but the kappa free light chain index (KFLCi) is emerging as an alternative marker, and the combined/sequential use of IEF and KFLCi has never been evaluated. CSF and serum albumin, IgG, kFLC and lFLC were measured by nephelometry; albumin, IgG and kFLC quotients as well as Link and kFLC indexes were calculated; OCBs were evaluated by immunofixation. A total of 150 consecutive patients: 48 with MS, 32 with other neurological inflammatory diseases (NID), 62 with neurological non-inflammatory diseases (NNID), and 8 without any detectable neurological disease (NND) were investigated. Both IEF and KFLCi showed a similar accuracy as diagnostic tests for multiple sclerosis. The high sensitivity and specificity associated with the lower cost of KFLCi suggested using this test first, followed by IEF as a confirmative procedure. The sequential use of IEF and KFLCi showed high diagnostic efficiency with cost reductions of 43 and 21%, compared to the simultaneous use of both tests or the use of IEF alone in all patients. The "sequential testing" using KFLCi followed by IEF in MS represents an optimal procedure with accurate performance and lower costs.
Final Technical Report 1976-1977. Systemwide Evaluation. Publication Number: 76.69.
ERIC Educational Resources Information Center
Austin Independent School District, TX. Office of Research and Evaluation.
A series of reports describes the activities of the Office of Research and Evaluation and compiles data descriptive of the Austin (Texas) Independent School District. This report consists of four appendices, one for each of four test batteries: California Achievement Tests, Sequential Tests of Educational Progress, Boehm Tests of Basic Concepts,…
Young, Anna M.; Cordier, Breanne; Mundry, Roger; Wright, Timothy F.
2014-01-01
In many social species, group members share acoustically similar calls. Functional hypotheses have been proposed for call sharing, but previous studies have been limited by an inability to distinguish among these hypotheses. We examined the function of vocal sharing in female budgerigars with a two-part experimental design that allowed us to distinguish between two functional hypotheses. The social association hypothesis proposes that shared calls help animals mediate affiliative and aggressive interactions, while the password hypothesis proposes that shared calls allow animals to distinguish group identity and exclude nonmembers. We also tested the labeling hypothesis, a mechanistic explanation which proposes that shared calls are used to address specific individuals within the sender–receiver relationship. We tested the social association hypothesis by creating four-member flocks of unfamiliar female budgerigars (Melopsittacus undulatus) and then monitoring the birds’ calls, social behaviors, and stress levels via fecal glucocorticoid metabolites. We tested the password hypothesis by moving immigrants into established social groups. To test the labeling hypothesis, we conducted additional recording sessions in which individuals were paired with different group members. The social association hypothesis was supported by the development of multiple shared call types in each cage and a correlation between the number of shared call types and the number of aggressive interactions between pairs of birds. We also found support for calls serving as a labeling mechanism using discriminant function analysis with a permutation procedure. Our results did not support the password hypothesis, as there was no difference in stress or directed behaviors between immigrant and control birds. PMID:24860236
Sachan, Prachee; Kumar, Nidhi; Sharma, Jagdish Prasad
2014-01-01
Background: Density of the drugs injected intrathecally is an important factor that influences spread in the cerebrospinal fluid. Mixing adjuvants with local anesthetics (LA) alters their density and hence their spread compared to when given sequentially in separate syringes. Aims: To evaluate the efficacy of intrathecal administration of hyperbaric bupivacaine (HB) and clonidine as a mixture and sequentially in terms of block characteristics, hemodynamics, neonatal outcome, and postoperative pain. Setting and Design: Prospective randomized single-blind study at a tertiary center from 2010 to 2012. Materials and Methods: Ninety full-term parturients scheduled for elective cesarean sections were divided into three groups on the basis of technique of intrathecal drug administration. Group M received a mixture of 75 μg clonidine and 10 mg HB 0.5%. Group A received 75 μg clonidine after administration of 10 mg HB 0.5% through a separate syringe. Group B received 75 μg clonidine before HB 0.5% (10 mg) through a separate syringe. Statistical analysis used: Observational descriptive statistics, analysis of variance with Bonferroni multiple comparison post hoc test, and Chi-square test. Results: Time to achieve complete sensory and motor block was less in groups A and B, in which drugs were given sequentially. Duration of analgesia lasted longer in group B (474.3 ± 20.79 min) and group A (472.50 ± 22.11 min) than in group M (337 ± 18.22 min) with clinically insignificant influence on hemodynamic parameters and sedation. Conclusion: Sequential technique reduces time to achieve complete sensory and motor block, delays block regression, and significantly prolongs the duration of analgesia. However, it did not matter much whether clonidine was administered before or after HB. PMID:25886098
Tarhini, Mahdi; Fayyad-Kazan, Mohammad; Fayyad-Kazan, Hussein; Mokbel, Mahmoud; Nasreddine, Mohammad; Badran, Bassam; Kchour, Ghada
2018-04-01
Helicobacter pylori (H. pylori) is the most common cause of peptic ulcer disease (PUD) and represents a strong risk factor for gastric cancer. Treatment of H. pylori is, therefore, a persistent need to avoid serious medical complications. Resistance to antibiotics remains the major challenge for H. pylori eradication. In this study, we determined the prevalence of H. pylori infection and evaluated H. pylori eradication efficacy of bismuth-containing quadruple therapy (Pylera) versus 14-day sequential therapy in treatment-naïve Lebanese patients. 1030 patients, showing symptoms of peptic ulcer (PU) and gastritis, underwent the (14)C-urea breath test and esophagogastroduodenoscopy to examine H. pylori infection and gastrointestinal disorders. Among the H. pylori-positive patients, 60 individuals were randomly selected, separated into two groups (each consisting of 30 patients) and treated with either bismuth-containing quadruple therapy or 14-day sequential therapy. We show that of the 1050 patients tested: 46.2% were H. pylori-positive, 55% had gastritis, 46.2% had both gastritis and H. pylori infection, 8.8% had gastritis but no H. pylori infection, 44.9% had neither gastritis nor H. pylori infection. Following the 14-day sequential therapy, the eradication rate was significantly higher than that obtained upon using bismuth-containing quadruple therapy [80% (24/30) versus 50% (15/30), χ 2 = 5.93, P = 0.015]. In conclusion, we determined H. pylori and gastritis prevalence among Lebanese PU-patients and showed that 14-day sequential therapy is more efficient than bismuth-containing quadruple therapy in terms of H. pylori eradication. Published by Elsevier Ltd.
Avery, Taliser R; Kulldorff, Martin; Vilk, Yury; Li, Lingling; Cheetham, T Craig; Dublin, Sascha; Davis, Robert L; Liu, Liyan; Herrinton, Lisa; Brown, Jeffrey S
2013-05-01
This study describes practical considerations for implementation of near real-time medical product safety surveillance in a distributed health data network. We conducted pilot active safety surveillance comparing generic divalproex sodium to historical branded product at four health plans from April to October 2009. Outcomes reported are all-cause emergency room visits and fractures. One retrospective data extract was completed (January 2002-June 2008), followed by seven prospective monthly extracts (January 2008-November 2009). To evaluate delays in claims processing, we used three analytic approaches: near real-time sequential analysis, sequential analysis with 1.5 month delay, and nonsequential (using final retrospective data). Sequential analyses used the maximized sequential probability ratio test. Procedural and logistical barriers to active surveillance were documented. We identified 6586 new users of generic divalproex sodium and 43,960 new users of the branded product. Quality control methods identified 16 extract errors, which were corrected. Near real-time extracts captured 87.5% of emergency room visits and 50.0% of fractures, which improved to 98.3% and 68.7% respectively with 1.5 month delay. We did not identify signals for either outcome regardless of extract timeframe, and slight differences in the test statistic and relative risk estimates were found. Near real-time sequential safety surveillance is feasible, but several barriers warrant attention. Data quality review of each data extract was necessary. Although signal detection was not affected by delay in analysis, when using a historical control group differential accrual between exposure and outcomes may theoretically bias near real-time risk estimates towards the null, causing failure to detect a signal. Copyright © 2013 John Wiley & Sons, Ltd.
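The maximized sequential probability ratio test used in the study above can be illustrated for the Poisson case, comparing cumulative observed events against the expected count from a historical comparator. The monthly counts and the critical value 2.85 below are invented for illustration; in practice the critical value is chosen to give the desired overall alpha for the planned surveillance length.

```python
import math

def maxsprt_llr(c, mu):
    """Poisson maximized SPRT statistic: cumulative observed events c versus
    the count mu expected under the null from the historical comparator.
    The log-likelihood ratio is maximized over relative risks >= 1, so it is
    zero whenever the observed count does not exceed the expected count."""
    if c <= mu:
        return 0.0
    return mu - c + c * math.log(c / mu)

CV = 2.85                                    # illustrative critical value
observed = [3, 7, 12, 19, 33]                # cumulative observed events
expected = [2.5, 5.8, 10.1, 15.0, 19.9]      # cumulative expected events
for month, (c, mu) in enumerate(zip(observed, expected), start=1):
    if maxsprt_llr(c, mu) >= CV:
        print(f"signal at month {month}")
        break
```

Because the statistic is recomputed at every data extract, delayed claims accrual (the paper's 87.5% versus 98.3% capture rates) directly shifts the observed counts and can defer the month in which the boundary is crossed.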
Ping-Keng Jao; Yuan-Pin Lin; Yi-Hsuan Yang; Tzyy-Ping Jung
2015-08-01
An emerging challenge for emotion classification using electroencephalography (EEG) is how to effectively alleviate day-to-day variability in raw data. This study employed robust principal component analysis (RPCA) to address the problem, with the posed hypothesis that background or emotion-irrelevant EEG perturbations lead to certain variability across days and somehow submerge emotion-related EEG dynamics. The empirical results of this study clearly validated our hypothesis and demonstrated the RPCA's feasibility through the analysis of a five-day dataset of 12 subjects. The RPCA allowed separating the sparse emotion-relevant EEG dynamics from the accompanying background perturbations across days. Subsequently, leveraging the RPCA-purified EEG trials from more days appeared to improve the emotion-classification performance steadily, which was not found in the case using the raw EEG features. Therefore, incorporating the RPCA with existing emotion-aware machine-learning frameworks on a longitudinal dataset of each individual may shed light on the development of a robust affective brain-computer interface (ABCI) that can alleviate ecological inter-day variability.
An Extension of RSS-based Model Comparison Tests for Weighted Least Squares
2012-08-22
use the model comparison test statistic to analyze the null hypothesis. Under the null hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS,H) = 10.3040 × 10^6. Under the alternative hypothesis, the weighted least squares cost functional is J_WLS(q̂_WLS) = 8.8394 × 10^6. Thus the model
Hypothesis testing of scientific Monte Carlo calculations.
Wallerberger, Markus; Gull, Emanuel
2017-11-01
The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
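In the spirit of the testing approach the authors describe, a stochastic unit test can compare a Monte Carlo estimate against a known exact value with a z-test. The tolerance z_crit and the toy integrand below are our own illustrative choices, not the paper's; a wide critical value is used so that a healthy test suite rarely fails by chance.

```python
import math
import random

def mc_mean_test(sample_fn, exact, n=100_000, z_crit=3.89, seed=42):
    """z-test that a Monte Carlo estimator agrees with a known exact value.

    z_crit = 3.89 corresponds to a two-sided alpha of roughly 1e-4, so a
    correct simulation almost never fails while real bugs of practical
    size are flagged. Returns True if the estimate is consistent.
    """
    rng = random.Random(seed)
    xs = [sample_fn(rng) for _ in range(n)]
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)   # sample variance
    z = (mean - exact) / math.sqrt(var / n)            # standardized error
    return abs(z) < z_crit

# E[U^2] = 1/3 for U ~ Uniform(0, 1): test a correct sampler...
print(mc_mean_test(lambda rng: rng.random() ** 2, 1 / 3))
# ...and the same sampler with a small injected bias.
print(mc_mean_test(lambda rng: rng.random() ** 2 + 0.01, 1 / 3))
```

The key point of the paper is exactly this shift: rather than asserting a deterministic output, the test asserts that the discrepancy is statistically consistent with sampling noise.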
Sex ratios in the two Germanies: a test of the economic stress hypothesis.
Catalano, Ralph A
2003-09-01
Literature describing temporal variation in the secondary sex ratio among humans reports an association between population stressors and declines in the odds of male birth. Explanations of this phenomenon draw on reports that stressed females spontaneously abort male more than female fetuses, and that stressed males exhibit reduced sperm motility. This work has led to the argument that population stress induced by a declining economy reduces the human sex ratio. No direct test of this hypothesis appears in the literature. Here, a test is offered based on a comparison of the sex ratio in East and West Germany for the years 1946 to 1999. The theory suggests that the East German sex ratio should be lower in 1991, when East Germany's economy collapsed, than expected from its own history and from the sex ratio in West Germany. The hypothesis is tested using time-series modelling methods. The data support the hypothesis. The sex ratio in East Germany was at its lowest in 1991. This first direct test supports the hypothesis that economic decline reduces the human sex ratio.
9 CFR 113.311 - Bovine Virus Diarrhea Vaccine.
Code of Federal Regulations, 2010 CFR
2010-01-01
... virus diarrhea post-challenge; or both, the Master Seed Virus is unsatisfactory. (6) A sequential test... virus diarrhea susceptible calves shall be used as test animals (20 vaccinates and five controls). Blood... serum dilution in a varying serum-constant virus neutralization test with less than 500 TCID50 of bovine...
Mutual Information Item Selection in Adaptive Classification Testing
ERIC Educational Resources Information Center
Weissman, Alexander
2007-01-01
A general approach for item selection in adaptive multiple-category classification tests is provided. The approach uses mutual information (MI), a special case of the Kullback-Leibler distance, or relative entropy. MI works efficiently with the sequential probability ratio test and alleviates the difficulties encountered with using other local-…
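A toy version of mutual-information item selection for a two-category classification test can be sketched as follows. The per-category correct-response probabilities are invented for illustration; the procedure described above would derive them from an IRT model rather than specify them directly.

```python
import math

def mutual_information(posterior, p_correct):
    """I(response; category) in bits for a dichotomous item.

    posterior: current probabilities over the classification categories.
    p_correct: probability of a correct response in each category.
    """
    # Marginal probability of each response (True = correct, False = incorrect).
    p_x = {r: sum(post * (p if r else 1 - p)
                  for post, p in zip(posterior, p_correct))
           for r in (True, False)}
    mi = 0.0
    for post, p in zip(posterior, p_correct):
        for r in (True, False):
            joint = post * (p if r else 1 - p)
            if joint > 0:
                mi += joint * math.log2(joint / (post * p_x[r]))
    return mi

posterior = [0.5, 0.5]          # examinee undecided between two categories
discriminating = mutual_information(posterior, [0.9, 0.2])
flat = mutual_information(posterior, [0.6, 0.5])
print(discriminating > flat)    # select the item whose response says more
```

Selecting the item with maximal expected information about the category, then updating the posterior with the response, is what lets the procedure feed the sequential probability ratio test efficiently.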
Pre-testing Orientation for the Disadvantaged.
ERIC Educational Resources Information Center
Mihalka, Joseph A.
A pre-testing orientation was incorporated into the Work Incentives Program, a pre-vocational program for disadvantaged youth. Test-taking skills were taught in seven and one half hours of instruction and a variety of methods were used to provide a sequential experience with distributed learning, positive reinforcement, and immediate feedback of…
New Testing Methods to Assess Technical Problem-Solving Ability.
ERIC Educational Resources Information Center
Hambleton, Ronald K.; And Others
Tests to assess problem-solving ability being provided for the Air Force are described, and some details on the development and validation of these computer-administered diagnostic achievement tests are discussed. Three measurement approaches were employed: (1) sequential problem solving; (2) context-free assessment of fundamental skills and…
9 CFR 113.309 - Bovine Parainfluenza3 Vaccine.
Code of Federal Regulations, 2011 CFR
2011-01-01
...-challenge for serum antibody studies. (6) Satisfactory Test Criteria: (i) All virus isolations attempts... develop antibody titers of 1:32 or greater by day 6 ±2 days post-challenge. (8) A sequential test... parainfluenza, susceptible calves shall be used as test animals (20 vaccinates and five controls). Blood samples...
9 CFR 113.309 - Bovine Parainfluenza3 Vaccine.
Code of Federal Regulations, 2014 CFR
2014-01-01
...-challenge for serum antibody studies. (6) Satisfactory Test Criteria: (i) All virus isolations attempts... develop antibody titers of 1:32 or greater by day 6 ±2 days post-challenge. (8) A sequential test... parainfluenza, susceptible calves shall be used as test animals (20 vaccinates and five controls). Blood samples...
9 CFR 113.309 - Bovine Parainfluenza3 Vaccine.
Code of Federal Regulations, 2013 CFR
2013-01-01
...-challenge for serum antibody studies. (6) Satisfactory Test Criteria: (i) All virus isolations attempts... develop antibody titers of 1:32 or greater by day 6 ±2 days post-challenge. (8) A sequential test... parainfluenza, susceptible calves shall be used as test animals (20 vaccinates and five controls). Blood samples...
10 CFR 71.73 - Hypothetical accident conditions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Hypothetical accident conditions. 71.73 Section 71.73... Package, Special Form, and LSA-III Tests 2 § 71.73 Hypothetical accident conditions. (a) Test procedures. Evaluation for hypothetical accident conditions is to be based on sequential application of the tests...
The Use of Tailored Testing with Instructional Programs. Final Report.
ERIC Educational Resources Information Center
Reckase, Mark D.
A computerized testing system was implemented in conjunction with the Radar Technician Training Course at the Naval Training Center, Great Lakes, Illinois. The feasibility of the system and students' attitudes toward it were examined. The system, a multilevel, microprocessor-based computer network, administered tests in a sequential, fixed length…
Testing for Factorial Invariance in the Context of Construct Validation
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2010-01-01
This article describes the logic and procedures behind testing for factorial invariance across groups in the context of construct validation. The procedures include testing for configural, measurement, and structural invariance in the framework of multiple-group confirmatory factor analysis (CFA). The "forward" (sequential constraint imposition)…
Chew, Soo Hong; Li, King King; Chark, Robin; Zhong, Songfa
2008-01-01
This experimental economics study using brain imaging techniques investigates the risk-ambiguity distinction in relation to the source preference hypothesis (Fox & Tversky, 1995) in which identically distributed risks arising from different sources of uncertainty may engender distinct preferences for the same decision maker, contrary to classical economic thinking. The use of brain imaging enables sharper testing of the implications of different models of decision-making including Chew and Sagi's (2008) axiomatization of source preference. Using fMRI, brain activations were observed when subjects make 48 sequential binary choices among even-chance lotteries based on whether the trailing digits of a number of stock prices at market closing would be odd or even. Subsequently, subjects rate familiarity of the stock symbols. When contrasting brain activation from more familiar sources with those from less familiar ones, regions appearing to be more active include the putamen, medial frontal cortex, and superior temporal gyrus. ROI analysis showed that the activation patterns in the familiar-unfamiliar and unfamiliar-familiar contrasts are similar to those in the risk-ambiguity and ambiguity-risk contrasts reported by Hsu et al. (2005). This supports the conjecture that the risk-ambiguity distinction can be subsumed by the source preference hypothesis. Our odd-even design has the advantage of inducing the same "unambiguous" probability of half for each subject in each binary comparison. Our finding supports the implications of the Chew-Sagi model and rejects models based on global probabilistic sophistication, including rank-dependent models derived from non-additive probabilities, e.g., Choquet expected utility and cumulative prospect theory, as well as those based on multiple priors, e.g., alpha-maxmin. The finding in Hsu et al. 
(2005) that orbitofrontal cortex lesion patients display neither ambiguity aversion nor risk aversion offers further support to the Chew-Sagi model. Our finding also supports the Levy et al. (2007) contention of a single valuation system encompassing risk and ambiguity aversion. This is the first neuroimaging study of the source preference hypothesis using a design which can discriminate among decision models ranging from risk-based ones to those relying on multiple priors.
Understanding suicide terrorism: premature dismissal of the religious-belief hypothesis.
Liddle, James R; Machluf, Karin; Shackelford, Todd K
2010-07-06
We comment on work by Ginges, Hansen, and Norenzayan (2009), in which they compare two hypotheses for predicting individual support for suicide terrorism: the religious-belief hypothesis and the coalitional-commitment hypothesis. Although we appreciate the evidence provided in support of the coalitional-commitment hypothesis, we argue that their method of testing the religious-belief hypothesis is conceptually flawed, thus calling into question their conclusion that the religious-belief hypothesis has been disconfirmed. In addition to critiquing the methodology implemented by Ginges et al., we provide suggestions on how the religious-belief hypothesis may be properly tested. It is possible that the premature and unwarranted conclusions reached by Ginges et al. may deter researchers from examining the effect of specific religious beliefs on support for terrorism, and we hope that our comments can mitigate this possibility.
Testing Quantum Models of Conjunction Fallacy on the World Wide Web
NASA Astrophysics Data System (ADS)
Aerts, Diederik; Arguëlles, Jonito Aerts; Beltran, Lester; Beltran, Lyneth; de Bianchi, Massimiliano Sassoli; Sozzo, Sandro; Veloz, Tomas
2017-12-01
The `conjunction fallacy' has been extensively debated by scholars in cognitive science and, in recent times, the discussion has been enriched by the proposal of modeling the fallacy using the quantum formalism. Two major quantum approaches have been put forward: the first assumes that respondents use a two-step sequential reasoning and that the fallacy results from the presence of `question order effects'; the second assumes that respondents evaluate the cognitive situation as a whole and that the fallacy results from the `emergence of new meanings', as an `effect of overextension' in the conceptual conjunction. Thus, the question arises whether, and to what extent, conjunction fallacies result from `order effects' or, instead, from `emergence effects'. To help clarify this situation, we propose to use the World Wide Web as an `information space' that can be interrogated both in a sequential and non-sequential way, to test these two quantum approaches. We find that `emergence effects', and not `order effects', should be considered the main cognitive mechanism producing the observed conjunction fallacies.
Kwon, Yong Hyun; Kwon, Jung Won; Lee, Myoung Hee
2015-01-01
[Purpose] The purpose of the current study was to compare the effectiveness of motor sequential learning according to two different types of practice schedules, a distributed practice schedule (two 12-hour inter-trial intervals) and a massed practice schedule (two 10-minute inter-trial intervals), using a serial reaction time (SRT) task. [Subjects and Methods] Thirty healthy subjects were recruited and then randomly and evenly assigned to either the distributed practice group or the massed practice group. All subjects performed three consecutive sessions of the SRT task following one of the two different types of practice schedules. Distributed practice was scheduled for two 12-hour inter-session intervals including sleeping time, whereas massed practice was administered for two 10-minute inter-session intervals. Response time (RT) and response accuracy (RA) were measured at pre-test, mid-test, and post-test. [Results] For RT, univariate analysis demonstrated significant main effects in the within-group comparison of the three tests as well as the interaction effect of two groups × three tests, whereas the between-group comparison showed no significant effect. The results for RA showed no significant differences in either the between-group comparison or the interaction effect of two groups × three tests, whereas the within-group comparison of the three tests showed a significant main effect. [Conclusion] Distributed practice led to enhancement of motor skill acquisition at the first inter-session interval as well as at the second inter-session interval the following day, compared to massed practice. Consequently, the results of this study suggest that a distributed practice schedule can enhance the effectiveness of motor sequential learning in one-day as well as two-day learning formats compared to massed practice. PMID:25931727
Feldman, Anatol G; Latash, Mark L
2005-02-01
Criticisms of the equilibrium point (EP) hypothesis have recently appeared that are based on misunderstandings of some of its central notions. Starting from such interpretations of the hypothesis, incorrect predictions are made and tested. When the incorrect predictions prove false, the hypothesis is claimed to be falsified. In particular, the hypothesis has been rejected based on the wrong assumptions that it conflicts with empirically defined joint stiffness values or that it is incompatible with violations of equifinality under certain velocity-dependent perturbations. Typically, such attempts use notions describing the control of movements of artificial systems in place of physiologically relevant ones. While appreciating constructive criticisms of the EP hypothesis, we feel that incorrect interpretations have to be clarified by reiterating what the EP hypothesis does and does not predict. We conclude that the recent claims of falsifying the EP hypothesis and the calls for its replacement by EMG-force control hypothesis are unsubstantiated. The EP hypothesis goes far beyond the EMG-force control view. In particular, the former offers a resolution for the famous posture-movement paradox while the latter fails to resolve it.
Inverse sequential detection of parameter changes in developing time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy J.
1992-01-01
Progressive values of two probabilities are obtained for parameter estimates derived from an existing set of values and from the same set enlarged by one or more new values, respectively. One probability is that of erroneously preferring the second of these estimates for the existing data ('type 1 error'), while the second is that of erroneously accepting the estimates from the enlarged set ('type 2 error'). A more stable combined 'no change' probability, which always falls between 0.5 and 0, is derived from the (logarithmic) width of the uncertainty region of an equivalent 'inverted' sequential probability ratio test (SPRT, Wald 1945) in which the error probabilities are calculated rather than prescribed. A parameter change is indicated when the combined probability undergoes a progressive decrease. The test is explicitly formulated and exemplified for Gaussian samples.
A novel approach for small sample size family-based association studies: sequential tests.
Ilk, Ozlem; Rajabli, Farid; Dungul, Dilay Ciglidag; Ozdag, Hilal; Ilk, Hakki Gokhan
2011-08-01
In this paper, we propose a sequential probability ratio test (SPRT) to overcome the problem of limited samples in studies related to complex genetic diseases. The results of this novel approach are compared with the ones obtained from the traditional transmission disequilibrium test (TDT) on simulated data. Although TDT classifies single-nucleotide polymorphisms (SNPs) to only two groups (SNPs associated with the disease and the others), SPRT has the flexibility of assigning SNPs to a third group, that is, those for which we do not have enough evidence and should keep sampling. It is shown that SPRT results in smaller ratios of false positives and negatives, as well as better accuracy and sensitivity values for classifying SNPs when compared with TDT. By using SPRT, data with small sample size become usable for an accurate association analysis.
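The three-outcome decision rule described above (accept, reject, or keep sampling) is the structure of Wald's classic SPRT. A minimal sketch of that generic rule follows; it is not the authors' TDT-based implementation, and the error rates, thresholds, and the Bernoulli example below are illustrative assumptions only:

```python
import math

def sprt_step(log_lr, alpha=0.05, beta=0.05):
    """One decision step of Wald's SPRT given the cumulative log-likelihood ratio.

    Returns 'accept_H1', 'accept_H0', or 'continue' (keep sampling),
    mirroring the three-group classification described above.
    """
    upper = math.log((1 - beta) / alpha)   # threshold A = (1 - beta) / alpha
    lower = math.log(beta / (1 - alpha))   # threshold B = beta / (1 - alpha)
    if log_lr >= upper:
        return "accept_H1"
    if log_lr <= lower:
        return "accept_H0"
    return "continue"

def sprt(samples, log_lr_increment, alpha=0.05, beta=0.05):
    """Run the SPRT over a stream of observations; return (decision, n used)."""
    log_lr = 0.0
    for n, x in enumerate(samples, start=1):
        log_lr += log_lr_increment(x)
        decision = sprt_step(log_lr, alpha, beta)
        if decision != "continue":
            return decision, n
    return "continue", len(samples)  # stream exhausted while still undecided
```

For instance, testing a Bernoulli success rate of 0.5 (H0) against 0.7 (H1), the per-observation increment is `log(0.7/0.5)` for a success and `log(0.3/0.5)` for a failure; a run of consistent observations reaches a boundary after only a handful of samples, which is the efficiency gain the SPRT offers over fixed-sample tests.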
Action perception as hypothesis testing.
Donnarumma, Francesco; Costantini, Marcello; Ambrosini, Ettore; Friston, Karl; Pezzulo, Giovanni
2017-04-01
We present a novel computational model that describes action perception as an active inferential process that combines motor prediction (the reuse of our own motor system to predict perceived movements) and hypothesis testing (the use of eye movements to disambiguate amongst hypotheses). The system uses a generative model of how (arm and hand) actions are performed to generate hypothesis-specific visual predictions, and directs saccades to the most informative places of the visual scene to test these predictions - and underlying hypotheses. We test the model using eye movement data from a human action observation study. In both the human study and our model, saccades are proactive whenever context affords accurate action prediction; but uncertainty induces a more reactive gaze strategy, via tracking the observed movements. Our model offers a novel perspective on action observation that highlights its active nature based on prediction dynamics and hypothesis testing.
NASA Astrophysics Data System (ADS)
Gao, J.; Lythe, M. B.
1996-06-01
This paper presents the principle of the Maximum Cross-Correlation (MCC) approach in detecting translational motions within dynamic fields from time-sequential remotely sensed images. A C program implementing the approach is presented and illustrated in a flowchart. The program is tested with a pair of sea-surface temperature images derived from Advanced Very High Resolution Radiometer (AVHRR) images near East Cape, New Zealand. Results show that the mean currents in the region have been detected satisfactorily with the approach.
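As a rough illustration of the MCC idea (a simplified Python sketch rather than the authors' C program; the window size, search radius, and function names are illustrative assumptions), one slides a template window from the first image over candidate displacements in the second image and keeps the displacement with the highest normalized cross-correlation:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def mcc_displacement(img1, img2, y, x, win=8, search=4):
    """Displacement (dy, dx) of the window whose top-left corner is (y, x)
    in img1 that maximizes cross-correlation within img2's search radius."""
    tpl = img1[y:y + win, x:x + win]
    best, best_d = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = img2[y + dy:y + dy + win, x + dx:x + dx + win]
            if cand.shape != tpl.shape:
                continue  # candidate window fell outside the image
            c = ncc(tpl, cand)
            if c > best:
                best, best_d = c, (dy, dx)
    return best_d, best
```

Applied to a pair of time-sequential sea-surface temperature images, the recovered displacement of each window divided by the time separation gives a motion vector, which is the essence of the approach described above.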
A high speed sequential decoder
NASA Technical Reports Server (NTRS)
Lum, H., Jr.
1972-01-01
The performance and theory of operation for the High Speed Hard Decision Sequential Decoder are delineated. The decoder is a forward error correction system which is capable of accepting data from binary-phase-shift-keyed and quadriphase-shift-keyed modems at input data rates up to 30 megabits per second. Test results show that the decoder is capable of maintaining a composite error rate of 0.00001 at an input E_b/N_0 of 5.6 dB. This performance has been obtained with minimum circuit complexity.
Some sequential, distribution-free pattern classification procedures with applications
NASA Technical Reports Server (NTRS)
Poage, J. L.
1971-01-01
Some sequential, distribution-free pattern classification techniques are presented. The decision problem to which the proposed classification methods are applied is that of discriminating between two kinds of electroencephalogram responses recorded from a human subject: spontaneous EEG and EEG driven by a stroboscopic light stimulus at the alpha frequency. The classification procedures proposed make use of the theory of order statistics. Estimates of the probabilities of misclassification are given. The procedures were tested on Gaussian samples and the EEG responses.
Confidence intervals for single-case effect size measures based on randomization test inversion.
Michiels, Bart; Heyvaert, Mieke; Meulders, Ann; Onghena, Patrick
2017-02-01
In the current paper, we present a method to construct nonparametric confidence intervals (CIs) for single-case effect size measures in the context of various single-case designs. We use the relationship between a two-sided statistical hypothesis test at significance level α and a 100 (1 - α) % two-sided CI to construct CIs for any effect size measure θ that contain all point null hypothesis θ values that cannot be rejected by the hypothesis test at significance level α. This method of hypothesis test inversion (HTI) can be employed using a randomization test as the statistical hypothesis test in order to construct a nonparametric CI for θ. We will refer to this procedure as randomization test inversion (RTI). We illustrate RTI in a situation in which θ is the unstandardized and the standardized difference in means between two treatments in a completely randomized single-case design. Additionally, we demonstrate how RTI can be extended to other types of single-case designs. Finally, we discuss a few challenges for RTI as well as possibilities when using the method with other effect size measures, such as rank-based nonoverlap indices. Supplementary to this paper, we provide easy-to-use R code, which allows the user to construct nonparametric CIs according to the proposed method.
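A minimal sketch of the RTI idea for the unstandardized difference in means in a completely randomized single-case design follows. This is not the authors' supplementary R code; the exhaustive enumeration, the grid of candidate effect sizes, and the shift convention (subtracting θ0 from the treatment scores before re-testing) are illustrative assumptions:

```python
from itertools import combinations

def randomization_pvalue(treat, control):
    """Two-sided randomization test p-value for a difference in means,
    enumerating every reassignment of the observations to conditions."""
    data = treat + control
    n_t = len(treat)
    obs = sum(treat) / n_t - sum(control) / len(control)
    count = total = 0
    for idx in combinations(range(len(data)), n_t):
        t = [data[i] for i in idx]
        c = [data[i] for i in range(len(data)) if i not in idx]
        stat = sum(t) / len(t) - sum(c) / len(c)
        count += abs(stat) >= abs(obs) - 1e-12  # count ties as extreme
        total += 1
    return count / total

def rti_confidence_interval(treat, control, alpha=0.05, grid=None):
    """Collect all point nulls theta0 that the randomization test cannot
    reject at level alpha: shift the treatment scores by -theta0 and re-test."""
    if grid is None:  # crude default grid spanning the plausible effect range
        lo = min(treat) - max(control)
        hi = max(treat) - min(control)
        step = (hi - lo) / 200 or 1.0
        grid = [lo + i * step for i in range(201)]
    kept = [t0 for t0 in grid
            if randomization_pvalue([x - t0 for x in treat], control) > alpha]
    return (min(kept), max(kept)) if kept else None
```

The returned interval contains exactly those effect sizes θ0 that survive the level-α test, which is the duality between two-sided tests and 100(1 − α)% CIs exploited in the paper. Exhaustive enumeration is only feasible for small designs; larger ones would need Monte Carlo sampling of reassignments.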
ERIC Educational Resources Information Center
Dickin, Katherine L.; Lent, Megan; Lu, Angela H.; Sequeira, Joran; Dollahite, Jamie S.
2012-01-01
Objective: To develop and test a brief measure of changes in eating, active play, and parenting practices after an intervention to help parents shape children's choices and home environments. Design: Sequential phases of development and testing: expert panel review, cognitive testing interviews, field testing, test-retest study, and assessment of…
Code of Federal Regulations, 2012 CFR
2012-01-01
... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...
Code of Federal Regulations, 2013 CFR
2013-01-01
... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...
Code of Federal Regulations, 2010 CFR
2010-01-01
... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...
Code of Federal Regulations, 2011 CFR
2011-01-01
... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...
Code of Federal Regulations, 2014 CFR
2014-01-01
... sequential performances of a material control test which is designed to detect anomalies potentially... capability required by § 74.53. Material control test means a comparison of a pre-established alarm threshold... into practical application for experimental and demonstration purposes, including the experimental...
ERIC Educational Resources Information Center
Besken, Miri
2016-01-01
The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…
Adolescents' Body Image Trajectories: A Further Test of the Self-Equilibrium Hypothesis
ERIC Educational Resources Information Center
Morin, Alexandre J. S.; Maïano, Christophe; Scalas, L. Francesca; Janosz, Michel; Litalien, David
2017-01-01
The self-equilibrium hypothesis underlines the importance of having a strong core self, which is defined as a high and developmentally stable self-concept. This study tested this hypothesis in relation to body image (BI) trajectories in a sample of 1,006 adolescents (M[subscript age] = 12.6, including 541 males and 465 females) across a 4-year…
ERIC Educational Resources Information Center
Trafimow, David
2017-01-01
There has been much controversy over the null hypothesis significance testing procedure, with much of the criticism centered on the problem of inverse inference. Specifically, p gives the probability of the finding (or one more extreme) given the null hypothesis, whereas the null hypothesis significance testing procedure involves drawing a…
ERIC Educational Resources Information Center
Lee, Jungmin
2016-01-01
This study tested the Bennett hypothesis by examining whether four-year colleges changed listed tuition and fees, the amount of institutional grants per student, and room and board charges after their states implemented statewide merit-based aid programs. According to the Bennett hypothesis, increases in government financial aid make it easier for…
Human female orgasm as evolved signal: a test of two hypotheses.
Ellsworth, Ryan M; Bailey, Drew H
2013-11-01
We present the results of a study designed to empirically test predictions derived from two hypotheses regarding human female orgasm behavior as an evolved communicative trait or signal. One hypothesis tested was the female fidelity hypothesis, which posits that human female orgasm signals a woman's sexual satisfaction and therefore her likelihood of future fidelity to a partner. The other was the sire choice hypothesis, which posits that women's orgasm behavior signals increased chances of fertilization. To test the two hypotheses of human female orgasm, we administered a questionnaire to 138 females and 121 males who reported that they were currently in a romantic relationship. Key predictions of the female fidelity hypothesis were not supported. In particular, orgasm was not associated with female sexual fidelity, nor was orgasm associated with male perceptions of partner sexual fidelity. However, faked orgasm was associated with female sexual infidelity and lower male relationship satisfaction. Overall, results were in greater support of the sire choice signaling hypothesis than the female fidelity hypothesis. Results also suggest that male satisfaction with, investment in, and sexual fidelity to a mate are benefits that favored the selection of orgasmic signaling in ancestral females.
Luo, Liqun; Zhao, Wei; Weng, Tangmei
2016-01-01
The Trivers-Willard hypothesis predicts that high-status parents will bias their investment to sons, whereas low-status parents will bias their investment to daughters. Among humans, tests of this hypothesis have yielded mixed results. This study tests the hypothesis using data collected among contemporary peasants in Central South China. We use current family status (rated by our informants) and father's former class identity (assigned by the Chinese Communist Party in the early 1950s) as measures of parental status, and proportion of sons in offspring and offspring's years of education as measures of parental investment. Results show that (i) those families with a higher former class identity such as landlord and rich peasant tend to have a higher socioeconomic status currently, (ii) high-status parents are more likely to have sons than daughters among their biological offspring, and (iii) in higher-status families, the years of education obtained by sons exceed that obtained by daughters to a larger extent than in lower-status families. Thus, the first assumption and the two predictions of the hypothesis are supported by this study. This article contributes a contemporary Chinese case to the testing of the Trivers-Willard hypothesis.
Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.
Ji, Ming; Xiong, Chengjie; Grundman, Michael
2003-10-01
In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative, using our hypothesis test to analyze Mini-Mental State Exam (MMSE) scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that, despite a large amount of missing data, accelerated decline did occur for MMSE among AD patients. Our finding supports the clinical belief of the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
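A simplified sketch of this style of test follows, with ordinary least squares on a single series standing in for the authors' random-slope/random-intercept mixed model; the broken-stick (hinge) parameterization of the bilinear alternative, the candidate change-point grid, and the bootstrap size are illustrative assumptions:

```python
import numpy as np

def fit_sse(X, y):
    """Least-squares fit; return (residual sum of squares, coefficients)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r), beta

def bilinear_sse(t, y, tau):
    """Broken-stick fit: slope allowed to change at the candidate point tau."""
    X = np.column_stack([np.ones_like(t), t, np.maximum(t - tau, 0.0)])
    return fit_sse(X, y)[0]

def changepoint_test(t, y, n_boot=200, seed=0):
    """SSE-reduction statistic with the unknown change point profiled out,
    calibrated by parametric bootstrap under the straight-line null."""
    taus = t[2:-2]  # keep a few observations on each side of any candidate
    X0 = np.column_stack([np.ones_like(t), t])
    sse0, beta0 = fit_sse(X0, y)
    stat = sse0 - min(bilinear_sse(t, y, tau) for tau in taus)
    # parametric bootstrap: simulate from the fitted constant-decline model
    sigma = np.sqrt(sse0 / (len(t) - 2))
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_boot):
        yb = X0 @ beta0 + rng.normal(0.0, sigma, size=len(t))
        sb = fit_sse(X0, yb)[0] - min(bilinear_sse(t, yb, tau) for tau in taus)
        hits += sb >= stat
    return stat, hits / n_boot
```

The bootstrap is needed precisely because the change point is only identified under the alternative, so the usual chi-squared reference distribution for the likelihood ratio does not apply; the estimated change point itself is the tau minimizing the bilinear SSE (an information-criterion comparison, as in the paper, can adjudicate nearby candidates).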
NASA Astrophysics Data System (ADS)
Menne, Matthew J.; Williams, Claude N., Jr.
2005-10-01
An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite. In a comparison of single test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from the serially complete and homogeneous component series. However, each of the evaluated composite series is not equally susceptible to the presence of changepoints in its components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on the averaging of the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated.
A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.
A Person Fit Test for IRT Models for Polytomous Items
ERIC Educational Resources Information Center
Glas, C. A. W.; Dagohoy, Anna Villa T.
2007-01-01
A person fit test based on the Lagrange multiplier test is presented for three item response theory models for polytomous items: the generalized partial credit model, the sequential model, and the graded response model. The test can also be used in the framework of multidimensional ability parameters. It is shown that the Lagrange multiplier…
EXSPRT: An Expert Systems Approach to Computer-Based Adaptive Testing.
ERIC Educational Resources Information Center
Frick, Theodore W.; And Others
Expert systems can be used to aid decision making. A computerized adaptive test (CAT) is one kind of expert system, although it is not commonly recognized as such. A new approach, termed EXSPRT, was devised that combines expert systems reasoning and sequential probability ratio test stopping rules. EXSPRT-R uses random selection of test items,…
How Big Is Big Enough? Sample Size Requirements for CAST Item Parameter Estimation
ERIC Educational Resources Information Center
Chuah, Siang Chee; Drasgow, Fritz; Luecht, Richard
2006-01-01
Adaptive tests offer the advantages of reduced test length and increased accuracy in ability estimation. However, adaptive tests require large pools of precalibrated items. This study looks at the development of an item pool for 1 type of adaptive administration: the computer-adaptive sequential test. An important issue is the sample size required…
Using Serial and Discrete Digit Naming to Unravel Word Reading Processes
Altani, Angeliki; Protopapas, Athanassios; Georgiou, George K.
2018-01-01
During reading acquisition, word recognition is assumed to undergo a developmental shift from slow serial/sublexical processing of letter strings to fast parallel processing of whole word forms. This shift has been proposed to be detected by examining the size of the relationship between serial- and discrete-trial versions of word reading and rapid naming tasks. Specifically, a strong association between serial naming of symbols and single word reading suggests that words are processed serially, whereas a strong association between discrete naming of symbols and single word reading suggests that words are processed in parallel as wholes. In this study, 429 Grade 1, 3, and 5 English-speaking Canadian children were tested on serial and discrete digit naming and word reading. Across grades, single word reading was more strongly associated with discrete naming than with serial naming of digits, indicating that short high-frequency words are processed as whole units early in the development of reading ability in English. In contrast, serial naming was not a unique predictor of single word reading across grades, suggesting that within-word sequential processing was not required for the successful recognition for this set of words. Factor mixture analysis revealed that our participants could be clustered into two classes, namely beginning and more advanced readers. Serial naming uniquely predicted single word reading only among the first class of readers, indicating that novice readers rely on a serial strategy to decode words. Yet, a considerable proportion of Grade 1 students were assigned to the second class, evidently being able to process short high-frequency words as unitized symbols. We consider these findings together with those from previous studies to challenge the hypothesis of a binary distinction between serial/sublexical and parallel/lexical processing in word reading. 
We argue instead that sequential processing in word reading operates on a continuum, depending on the level of reading proficiency, the degree of orthographic transparency, and word-specific characteristics. PMID:29706918
Fast-responding liquid crystal light-valve technology for color-sequential display applications
NASA Astrophysics Data System (ADS)
Janssen, Peter J.; Konovalov, Victor A.; Muravski, Anatoli A.; Yakovenko, Sergei Y.
1996-04-01
A color sequential projection system has some distinct advantages over conventional systems which make it uniquely suitable for consumer TV as well as high-performance professional applications such as computer monitors and electronic cinema. A fast-responding light-valve is clearly essential for a well-performing system. The response speed of transmissive LC light-valves has thus far been marginal for good color rendition. Recently, the Sevchenko Institute has made some very fast reflective LC cells, which were evaluated at Philips Labs. These devices showed sub-millisecond large-signal response times, even at room temperature, and produced good color in a projector emulation testbed. In our presentation we describe our highly efficient color sequential projector and demonstrate its operation on video tape. Next we discuss light-valve requirements and reflective light-valve test results.
Bayesian Methods for Determining the Importance of Effects
USDA-ARS?s Scientific Manuscript database
Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from the Fisher Significance Test and Hypothesis Test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...
Testing for purchasing power parity in the long-run for ASEAN-5
NASA Astrophysics Data System (ADS)
Choji, Niri Martha; Sek, Siok Kun
2017-04-01
For more than a decade, there has been substantial interest in empirically testing the validity of the purchasing power parity (PPP) hypothesis. This paper tests for long-run relative purchasing power parity in a group of ASEAN-5 countries over the period 1996-2016 using monthly data. For this purpose, we used the Pedroni co-integration method to test the long-run hypothesis of purchasing power parity. We first tested the stationarity of the variables and found that they are non-stationary at levels but stationary at first difference. Results of the Pedroni test rejected the null hypothesis of no co-integration, meaning that we have enough evidence to support PPP in the long-run for the ASEAN-5 countries over the period 1996-2016. In other words, the rejection of the null hypothesis implies a long-run relation between nominal exchange rates and relative prices.
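To make the first step above concrete, a minimal unit-root check can be written as a plain OLS regression. This is only the unaugmented Dickey-Fuller regression on simulated series, a sketch of the idea rather than the exact test or data used in the paper; a strongly negative t-statistic on the lagged level is evidence of stationarity, so a series that fails the check at levels can pass it at first difference:

```python
import numpy as np

def adf_tstat(y):
    """t-statistic on rho in the regression dy_t = c + rho * y_{t-1} + e_t.

    A strongly negative statistic is evidence against a unit root,
    i.e., evidence that the series is stationary."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
e = rng.standard_normal(500)
random_walk = np.cumsum(e)          # non-stationary at levels
stationary = np.zeros(500)          # AR(1) with phi = 0.5, same innovations
for t in range(1, 500):
    stationary[t] = 0.5 * stationary[t - 1] + e[t]

print(adf_tstat(random_walk))           # near zero: unit root not rejected
print(adf_tstat(stationary))            # strongly negative: stationary
print(adf_tstat(np.diff(random_walk)))  # first difference is stationary
```

Panel co-integration tests such as Pedroni's build on residuals from regressions like this one, pooled across countries.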
UNIFORMLY MOST POWERFUL BAYESIAN TESTS
Johnson, Valen E.
2014-01-01
Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829
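For the simplest setting the abstract mentions, a normal mean with known variance, the construction can be made concrete. The function names and toy numbers below are illustrative, not from the article, but the algebra follows the one-parameter exponential family logic: the Bayes factor exceeds the threshold gamma exactly when the sample mean exceeds a cutoff, and the uniformly most powerful Bayesian alternative is the point that minimizes that cutoff:

```python
import math

def log_bf10(xbar, n, sigma, delta):
    """log Bayes factor for H1: mu = delta vs H0: mu = 0,
    normal data with known sigma and sample mean xbar."""
    return (n / sigma**2) * (delta * xbar - delta**2 / 2.0)

def umpbt_delta(n, sigma, gamma):
    """Alternative maximizing P(BF10 > gamma) for every true mu:
    it minimizes the cutoff sigma^2*log(gamma)/(n*delta) + delta/2
    over delta, giving delta* = sqrt(2 * sigma^2 * log(gamma) / n)."""
    return math.sqrt(2.0 * sigma**2 * math.log(gamma) / n)

# With delta chosen this way the cutoff collapses to delta* itself:
# BF10 > gamma exactly when xbar > delta*.
n, sigma, gamma = 25, 1.0, 10.0
d = umpbt_delta(n, sigma, gamma)
print(d, math.exp(log_bf10(1.01 * d, n, sigma, d)))
```

At xbar = delta* the log Bayes factor equals log(gamma) exactly, which is the calibration between rejection regions and Bayes-factor thresholds the article exploits.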
Simultaneous sequential monitoring of efficacy and safety led to masking of effects.
van Eekelen, Rik; de Hoop, Esther; van der Tweel, Ingeborg
2016-08-01
Usually, sequential designs for clinical trials are applied to the primary (=efficacy) outcome. In practice, other outcomes (e.g., safety) will also be monitored and influence the decision whether to stop a trial early. The implications of simultaneous monitoring for trial decision making are as yet unclear. This study examines what happens to the type I error, power, and required sample sizes when one efficacy outcome and one correlated safety outcome are monitored simultaneously using sequential designs. We conducted a simulation study in the framework of a two-arm parallel clinical trial. Interim analyses on two outcomes were performed independently and simultaneously on the same data sets using four sequential monitoring designs, including O'Brien-Fleming and Triangular Test boundaries. Simulations differed in values for correlations and true effect sizes. When an effect was present in both outcomes, competition was introduced, which decreased power (e.g., from 80% to 60%). Futility boundaries for the efficacy outcome reduced overall type I errors as well as power for the safety outcome. Monitoring two correlated outcomes, given that both are essential for early trial termination, leads to masking of true effects. Possible scenarios must be carefully considered when designing sequential trials. Simulation results can help guide trial design. Copyright © 2016 Elsevier Inc. All rights reserved.
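A single-look toy simulation, far simpler than the paper's group-sequential designs but in the same spirit, shows why stopping on whichever of two correlated outcomes crosses its boundary changes the overall error rate; the correlation and critical value below are illustrative assumptions:

```python
import math
import random

random.seed(1)

def bivariate_z(rho):
    """One pair of standard-normal test statistics with correlation rho,
    representing the efficacy and safety analyses on the same data."""
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho**2) * random.gauss(0, 1)
    return z1, z2

crit = 1.96          # two-sided alpha = 0.05 for each outcome separately
n_sim = 100_000
hits = sum(
    1 for _ in range(n_sim)
    if any(abs(z) > crit for z in bivariate_z(0.5))
)
print(hits / n_sim)  # probability the "trial stops" under the global null
```

Even with correlation 0.5 the chance that at least one statistic crosses its boundary is well above the nominal 5%, which is the kind of interplay the simulation study quantifies for genuinely sequential boundaries.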
Chang, Young-Soo; Hong, Sung Hwa; Kim, Eun Yeon; Choi, Ji Eun; Chung, Won-Ho; Cho, Yang-Sun; Moon, Il Joon
2018-05-18
Despite recent advancement in the prediction of cochlear implant outcomes, the benefit of bilateral procedures compared to bimodal stimulation, and how to predict speech perception outcomes of sequential bilateral cochlear implantation in children from bimodal auditory performance, remain unclear. This investigation was performed: (1) to determine the benefit of sequential bilateral cochlear implantation and (2) to identify factors associated with its outcome. Observational and retrospective study. We retrospectively analyzed 29 patients with sequential cochlear implants following a bimodal-fitting condition. Audiological evaluations comprised the categories of auditory performance score, speech perception with monosyllable and disyllable words, and the Korean version of Ling. They were performed before the sequential cochlear implant with the bimodal fitting condition (CI1+HA) and one year after the sequential cochlear implant with the bilateral cochlear implant condition (CI1+CI2). The Good Performance Group (GP) was defined as follows: 90% or higher in monosyllable and disyllable tests with auditory-only condition, or 20% or higher improvement of the scores with CI1+CI2. Age at first implantation, inter-implant interval, categories of auditory performance score, and various comorbidities were analyzed by logistic regression analysis. Compared to CI1+HA, CI1+CI2 provided significant benefit in categories of auditory performance, speech perception, and Korean version of Ling results. The preoperative categories of auditory performance score was the only factor associated with being in the GP (odds ratio = 4.38, 95% confidence interval = 1.07-17.93, p = 0.04).
Children with limited language development in the bimodal condition should be considered for sequential bilateral cochlear implantation, and the preoperative categories of auditory performance score could be used as a predictor of speech perception after the sequential cochlear implant. Copyright © 2018 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Cullington, H E; Bele, D; Brinton, J C; Cooper, S; Daft, M; Harding, J; Hatton, N; Humphries, J; Lutman, M E; Maddocks, J; Maggs, J; Millward, K; O'Donoghue, G; Patel, S; Rajput, K; Salmon, V; Sear, T; Speers, A; Wheeler, A; Wilson, K
2017-01-01
To assess longitudinal outcomes in a large and varied population of children receiving bilateral cochlear implants both simultaneously and sequentially. This observational non-randomized service evaluation collected localization and speech recognition in noise data from simultaneously and sequentially implanted children at four time points: before bilateral cochlear implants or before the sequential implant, 1 year, 2 years, and 3 years after bilateral implants. No inclusion criteria were applied, so children with additional difficulties, cochleovestibular anomalies, varying educational placements, 23 different home languages, a full range of outcomes and varying device use were included. 1001 children were included: 465 implanted simultaneously and 536 sequentially, representing just over 50% of children receiving bilateral implants in the UK in this period. In simultaneously implanted children the median age at implant was 2.1 years; 7% were implanted at less than 1 year of age. In sequentially implanted children the interval between implants ranged from 0.1 to 14.5 years. Children with simultaneous bilateral implants localized better than those with one implant. On average children receiving a second (sequential) cochlear implant showed improvement in localization and listening in background noise after 1 year of bilateral listening. The interval between sequential implants had no effect on localization improvement although a smaller interval gave more improvement in speech recognition in noise. Children with sequential implants on average were able to use their second device to obtain spatial release from masking after 2 years of bilateral listening. Although ranges were large, bilateral cochlear implants on average offered an improvement in localization and speech perception in noise over unilateral implants. These data represent the diverse population of children with bilateral cochlear implants in the UK from 2010 to 2012. 
Predictions of outcomes for individual patients are not possible from these data. However, there are no indications to preclude children with long inter-implant interval having the chance of a second cochlear implant.
[Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].
Simmer, H H
1980-07-01
Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis the nerve reflex starts in the ovary with an increase of the intraovarian pressure caused by the growing follicles. In 1884 Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women by bimanual compression from the outside and the vagina. His results were not convincing. Six years later, Strassmann injected fluids into the ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prognosis derived from Pflüger's hypothesis. Thus, after a long interval, that hypothesis had become a paradigm. Though reasons can be given for the delay, it is little understood why experimental testing started so late.
When Null Hypothesis Significance Testing Is Unsuitable for Research: A Reassessment.
Szucs, Denes; Ioannidis, John P A
2017-01-01
Null hypothesis significance testing (NHST) has several shortcomings that are likely contributing factors behind the widely debated replication crisis of (cognitive) neuroscience, psychology, and biomedical science in general. We review these shortcomings and suggest that, after sustained negative experience, NHST should no longer be the default, dominant statistical practice of all biomedical and psychological research. If theoretical predictions are weak we should not rely on all or nothing hypothesis tests. Different inferential methods may be most suitable for different types of research questions. Whenever researchers use NHST they should justify its use, and publish pre-study power calculations and effect sizes, including negative findings. Hypothesis-testing studies should be pre-registered and optimally raw data published. The current statistics lite educational approach for students that has sustained the widespread, spurious use of NHST should be phased out. PMID:28824397
40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5
Code of Federal Regulations, 2010 CFR
2010-07-01
... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...
40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5
Code of Federal Regulations, 2011 CFR
2011-07-01
... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...
40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM−2.5.
Code of Federal Regulations, 2012 CFR
2012-07-01
... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...
Langella, M; Colarieti, L; Ambrosini, M V; Giuditta, A
1992-02-01
Female adult rats were trained for a two-way active avoidance task (4 h), and allowed free sleep (3 h). Control rats (C) were left in their home cages during the acquisition period. Dural electrodes and an intraventricular cannula, implanted one week in advance, were used for EEG recording during the period of sleep and for the injection of [3H]thymidine at the beginning of the training session, respectively. Rats were killed at the end of the sleep period, and the DNA-specific activity was determined in the main brain regions and in liver. Correlations among sleep, behavioral and biochemical variables were assessed using Spearman's nonparametric method. In learning rats (L), the number of avoidances was negatively correlated with SS-W variables, and positively correlated with SS-PS variables (episodes of synchronized sleep followed by wakefulness or paradoxical sleep, respectively) and with PS variables. An inverse pattern of correlations was shown by the number of escapes or freezings. No correlations occurred in rats unable to achieve the learning criterion (NL). In L rats, the specific activity of brain DNA was negatively correlated with SS-W variables and positively correlated with SS-PS variables, while essentially no correlation concerned PS variables. On the other hand, in NL rats, comparable correlations were positive with SS-W variables and negative with SS-PS and PS variables. Few and weak correlations occurred in C rats. The data support a role of SS in brain information processing, as postulated by the sequential hypothesis on the function of sleep. In addition, they suggest that the elimination of nonadaptive memory traces may require several SS-W episodes and a terminal SS-PS episode. During PS episodes, adaptive memory traces cleared of nonadaptive components may be copied in more suitable brain sites.
Testing fundamental ecological concepts with a Pythium-Prunus pathosystem
USDA-ARS?s Scientific Manuscript database
The study of plant-pathogen interactions has enabled tests of basic ecological concepts on plant community assembly (Janzen-Connell Hypothesis) and plant invasion (Enemy Release Hypothesis). We used a field experiment to (#1) test whether Pythium effects depended on host (seedling) density and/or d...
9 CFR 113.309 - Bovine Parainfluenza3 Vaccine.
Code of Federal Regulations, 2010 CFR
2010-01-01
... develop antibody titers of 1:32 or greater by day 6 ±2 days post-challenge. (8) A sequential test... parainfluenza, susceptible calves shall be used as test animals (20 vaccinates and five controls). Blood samples... negative at a 1:2 final serum dilution in a varying serum constant virus neutralization test with less than...
9 CFR 113.309 - Bovine Parainfluenza3 Vaccine.
Code of Federal Regulations, 2012 CFR
2012-01-01
... develop antibody titers of 1:32 or greater by day 6 ±2 days post-challenge. (8) A sequential test... parainfluenza, susceptible calves shall be used as test animals (20 vaccinates and five controls). Blood samples... negative at a 1:2 final serum dilution in a varying serum constant virus neutralization test with less than...
Yu, Yang; Zhang, Fan; Gao, Ming-Xin; Li, Hai-Tao; Li, Jing-Xing; Song, Wei; Huang, Xin-Sheng; Gu, Cheng-Xiong
2013-01-01
OBJECTIVES Intraoperative transit time flow measurement (TTFM) is widely used to assess anastomotic quality in coronary artery bypass grafting (CABG). However, in sequential vein grafting, the flow characteristics collected by the conventional TTFM method are usually associated with total graft flow and might not accurately indicate the quality of every distal anastomosis in a sequential graft. The purpose of our study was to examine a new TTFM method that could assess the quality of each distal anastomosis in a sequential graft more reliably than the conventional TTFM approach. METHODS Two TTFM methods were tested in 84 patients who underwent sequential saphenous off-pump CABG in Beijing An Zhen Hospital between April and August 2012. In the conventional TTFM method, normal blood flow in the sequential graft was maintained during the measurement, and the flow probe was placed a few centimetres above the anastomosis to be evaluated. In the new method, blood flow in the sequential graft was temporarily reduced during the measurement by placing an atraumatic bulldog clamp at the graft a few centimetres distal to the anastomosis to be evaluated, while the position of the flow probe remained the same as in the conventional method. This new TTFM method was named the flow reduction TTFM. Graft flow parameters measured by both methods were compared. RESULTS Compared with the conventional TTFM, the flow reduction TTFM resulted in significantly lower mean graft blood flow (P < 0.05) and, in contrast, a significantly higher pulsatility index (P < 0.05). Diastolic filling was not significantly different between the two methods and was >50% in both cases. Interestingly, the flow reduction TTFM identified two defective middle distal anastomoses that the conventional TTFM failed to detect. Graft flows near the defective distal anastomoses were improved substantially after revision.
CONCLUSIONS In this study, we found that temporary reduction of graft flow during TTFM seemed to enhance the sensitivity of TTFM to less-than-critical anastomotic defects in a sequential graft and to improve the overall accuracy of the intraoperative assessment of anastomotic quality in sequential vein grafting. PMID:24000314
A checklist to facilitate objective hypothesis testing in social psychology research.
Washburn, Anthony N; Morgan, G Scott; Skitka, Linda J
2015-01-01
Social psychology is not a very politically diverse area of inquiry, something that could negatively affect the objectivity of social psychological theory and research, as Duarte et al. argue in the target article. This commentary offers a number of checks to help researchers uncover possible biases and identify when they are engaging in hypothesis confirmation and advocacy instead of hypothesis testing.
Nan Liu; Hai Ren; Sufen Yuan; Qinfeng Guo; Long Yang
2013-01-01
The relative importance of facilitation and competition between pairwise plants across abiotic stress gradients as predicted by the stress-gradient hypothesis has been confirmed in arid and temperate ecosystems, but the hypothesis has rarely been tested in tropical systems, particularly across nutrient gradients. The current research examines the interactions between a...
Phase II Clinical Trials: D-methionine to Reduce Noise-Induced Hearing Loss
2012-03-01
loss (NIHL) and tinnitus in our troops. Hypotheses: Primary Hypothesis: Administration of oral D-methionine prior to and during weapons...reduce or prevent noise-induced tinnitus . Primary outcome to test the primary hypothesis: Pure tone air-conduction thresholds. Primary outcome to...test the secondary hypothesis: Tinnitus questionnaires. Specific Aims: 1. To determine whether administering oral D-methionine (D-met) can
An analog scrambler for speech based on sequential permutations in time and frequency
NASA Astrophysics Data System (ADS)
Cox, R. V.; Jayant, N. S.; McDermott, B. J.
Permutation of speech segments is an operation that is frequently used in the design of scramblers for analog speech privacy. In this paper, a sequential procedure for segment permutation is considered. This procedure can be extended to two dimensional permutation of time segments and frequency bands. By subjective testing it is shown that this combination gives a residual intelligibility for spoken digits of 20 percent with a delay of 256 ms. (A lower bound for this test would be 10 percent). The complexity of implementing such a system is considered and the issues of synchronization and channel equalization are addressed. The computer simulation results for the system using both real and simulated channels are examined.
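The time-segment permutation at the core of such a scrambler can be sketched in a few lines. The segment length and key handling below are illustrative assumptions, not the system described in the paper, and real analog scramblers permute waveform segments (and frequency bands) rather than integer lists, but the keyed permute/invert structure is the same:

```python
import random

def keyed_permutation(n, key):
    """Deterministic permutation of range(n) derived from a shared key."""
    order = list(range(n))
    random.Random(key).shuffle(order)
    return order

def scramble(samples, seg_len, key):
    """Split the signal into fixed-length segments and permute them."""
    segs = [samples[i:i + seg_len] for i in range(0, len(samples), seg_len)]
    perm = keyed_permutation(len(segs), key)
    return [s for i in perm for s in segs[i]]

def descramble(scrambled, seg_len, key):
    """Invert the keyed permutation to recover the original signal."""
    segs = [scrambled[i:i + seg_len] for i in range(0, len(scrambled), seg_len)]
    perm = keyed_permutation(len(segs), key)
    out = [None] * len(segs)
    for dst, src in enumerate(perm):
        out[src] = segs[dst]
    return [s for seg in out for s in seg]

signal = list(range(32))              # stand-in for speech samples
enc = scramble(signal, 4, key="secret")
assert descramble(enc, 4, key="secret") == signal
```

The residual intelligibility and delay figures quoted in the abstract come from how long the segments are and how far the permutation moves them, which is exactly the trade-off the two parameters above control.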
An omnibus test for the global null hypothesis.
Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja
2018-01-01
Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses, but in testing whether none of the hypotheses is false. There are several ways to test the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g. Bonferroni or Simes test). However, usually there is no a priori knowledge of the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields impressive overall performance. The proposed method is implemented in an R-package called omnibus.
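The abstract does not give the exact form of the statistic, so the sketch below is an assumption about what a cumulative-sum omnibus test might look like, not the authors' implementation: probit-transform the p-values, take the maximum standardized cumulative sum of the sorted transforms (sensitive both to one tiny p-value and to many moderately small ones), and calibrate by Monte Carlo under the global null:

```python
import random
from statistics import NormalDist

PHI_INV = NormalDist().inv_cdf

def cumsum_stat(pvals):
    """Max standardized cumulative sum of sorted probit-transformed p-values."""
    z = sorted((PHI_INV(1 - p) for p in pvals), reverse=True)
    best, s = float("-inf"), 0.0
    for k, zk in enumerate(z, start=1):
        s += zk
        best = max(best, s / k ** 0.5)
    return best

def omnibus_pvalue(pvals, n_sim=2000, seed=0):
    """Monte Carlo calibration under the global null (uniform p-values)."""
    rng = random.Random(seed)
    obs = cumsum_stat(pvals)
    m = len(pvals)
    exceed = sum(
        cumsum_stat([rng.random() for _ in range(m)]) >= obs
        for _ in range(n_sim)
    )
    return (exceed + 1) / (n_sim + 1)

print(omnibus_pvalue([1e-6, 0.40, 0.70, 0.90, 0.25]))  # small: reject global null
print(omnibus_pvalue([0.40, 0.55, 0.70, 0.90, 0.25]))  # large: no evidence
```

Because the maximum ranges over all prefix sums, the statistic interpolates between a Bonferroni-style test (k = 1) and a combination test (k = m), which is the adaptivity the paper is after.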
Greenberg, E. Robert; Anderson, Garnet L.; Morgan, Douglas R.; Torres, Javier; Chey, William D.; Bravo, Luis Eduardo; Dominguez, Ricardo L.; Ferreccio, Catterina; Herrero, Rolando; Lazcano-Ponce, Eduardo C.; Meza-Montenegro, Mercedes María; Peña, Rodolfo; Peña, Edgar M.; Salazar-Martínez, Eduardo; Correa, Pelayo; Martínez, María Elena; Valdivieso, Manuel; Goodman, Gary E.; Crowley, John J.; Baker, Laurence H.
2011-01-01
Summary Background Evidence from Europe, Asia, and North America suggests that standard three-drug regimens of a proton pump inhibitor plus amoxicillin and clarithromycin are significantly less effective for eradicating Helicobacter pylori (H. pylori) infection than five-day concomitant and ten-day sequential four-drug regimens that include a nitroimidazole. These four-drug regimens also entail fewer antibiotic doses and thus may be suitable for eradication programs in low-resource settings. Studies are limited from Latin America, however, where the burden of H. pylori-associated diseases is high. Methods We randomised 1463 men and women ages 21–65 selected from general populations in Chile, Colombia, Costa Rica, Honduras, Nicaragua, and Mexico (two sites) who tested positive for H. pylori by a urea breath test (UBT) to: 14 days of lansoprazole, amoxicillin, and clarithromycin (standard therapy); five days of lansoprazole, amoxicillin, clarithromycin, and metronidazole (concomitant therapy); or five days of lansoprazole and amoxicillin followed by five of lansoprazole, clarithromycin, and metronidazole (sequential therapy). Eradication was assessed by UBT six–eight weeks after randomisation. Findings In intention-to-treat analyses, the probability of eradication with standard therapy was 82·2%, which was 8·6% higher (95% adjusted CI: 2·6%, 14·5%) than with concomitant therapy (73·6%) and 5·6% higher (95% adjusted CI: −0·04%, 11·6%) than with sequential therapy (76·5%). In analyses limited to the 1314 participants who adhered to their assigned therapy, the probabilities of eradication were 87·1%, 78·7%, and 81·1% with standard, concomitant, and sequential therapies, respectively. Neither four-drug regimen was significantly better than standard triple therapy in any of the seven sites. Interpretation Standard 14-day triple-drug therapy is preferable to five-day concomitant or ten-day sequential four-drug regimens as empiric therapy for H. 
pylori among diverse Latin American populations. Funding Bill & Melinda Gates Foundation and US National Institutes of Health. PMID:21777974
Explorations in Statistics: Hypothesis Tests and P Values
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…
Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment
ERIC Educational Resources Information Center
Frane, Andrew V.
2015-01-01
Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…
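The canonical adjustment of this kind, not specific to the truncated abstract above, is Holm's step-down procedure, which controls the familywise error rate at level alpha under arbitrary dependence between the tests; a minimal sketch:

```python
def holm(pvals, alpha=0.05):
    """Holm step-down procedure: compare the i-th smallest p-value to
    alpha / (m - i), stopping at the first failure. Controls the
    familywise error rate at alpha under arbitrary dependence."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    rejected = [False] * m
    for step, i in enumerate(order):
        if pvals[i] <= alpha / (m - step):
            rejected[i] = True
        else:
            break  # once one test fails, all larger p-values fail too
    return rejected

print(holm([0.01, 0.04, 0.03]))  # [True, False, False]
```

On [0.001, 0.02, 0.9] Holm rejects the first two hypotheses, whereas plain Bonferroni (every p-value against alpha/m) rejects only the first, illustrating why the step-down version is uniformly more powerful at the same error rate.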
ERIC Educational Resources Information Center
Malda, Maike; van de Vijver, Fons J. R.; Temane, Q. Michael
2010-01-01
In this study, cross-cultural differences in cognitive test scores are hypothesized to depend on a test's cultural complexity (Cultural Complexity Hypothesis: CCH), here conceptualized as its content familiarity, rather than on its cognitive complexity (Spearman's Hypothesis: SH). The content familiarity of tests assessing short-term memory,…
Maissan, Francois; Pool, Jan; Stutterheim, Eric; Wittink, Harriet; Ostelo, Raymond
2018-06-02
Neck pain is the fourth major cause of disability worldwide, but sufficient evidence regarding treatment is not available. This study is a first exploratory attempt to gain insight into and consensus on the clinical reasoning of experts in patients with non-specific neck pain. First, we aimed to inventory expert opinions regarding the indication for physiotherapy when, other than neck pain, no positive signs and symptoms and no positive diagnostic tests are present. Secondly, we aimed to determine which measurement instruments are being used and when they are used to support and objectify the clinical reasoning process. Finally, we wanted to establish consensus among experts regarding the use of unimodal interventions in patients with non-specific neck pain, i.e. their sequential linear clinical reasoning. A Web-based Delphi study was conducted. Fifteen experts (teachers and researchers) participated. Pain alone was deemed not to be an indication for physiotherapy treatment. PROMs are mainly used for evaluative purposes and physical tests for diagnostic and evaluative purposes. Eighteen different variants of sequential linear clinical reasoning were investigated within our Delphi study. Only 6 out of 18 variants of sequential linear clinical reasoning reached more than 50% consensus. Pain alone is not an indication for physiotherapy. Insight has been obtained into which measurement instruments are used and when they are used. Consensus about sequential linear lines of clinical reasoning was poor. Copyright © 2018 Elsevier Ltd. All rights reserved.
A technique for sequential segmental neuromuscular stimulation with closed loop feedback control.
Zonnevijlle, Erik D H; Abadia, Gustavo Perez; Somia, Naveen N; Kon, Moshe; Barker, John H; Koenig, Steven; Ewert, D L; Stremel, Richard W
2002-01-01
In dynamic myoplasty, dysfunctional muscle is assisted or replaced with skeletal muscle from a donor site. Electrical stimulation is commonly used to train and animate the skeletal muscle to perform its new task. Due to simultaneous tetanic contractions of the entire myoplasty, muscles are deprived of perfusion and fatigue rapidly, causing long-term problems such as excessive scarring and muscle ischemia. Sequential stimulation contracts part of the muscle while other parts rest, thus significantly improving blood perfusion. However, the muscle still fatigues. In this article, we report a test of the feasibility of using closed-loop control to economize the contractions of the sequentially stimulated myoplasty. A simple stimulation algorithm was developed and tested on a sequentially stimulated neo-sphincter designed from a canine gracilis muscle. Pressure generated in the lumen of the myoplasty neo-sphincter was used as feedback to regulate the stimulation signal via three control parameters, thereby optimizing the performance of the myoplasty. Additionally, we investigated and compared the efficiency of amplitude and frequency modulation techniques. Closed-loop feedback enabled us to maintain target pressures within 10% deviation using amplitude modulation and optimized control parameters (correction frequency = 4 Hz, correction threshold = 4%, and transition time = 0.3 s). The large-scale stimulation/feedback setup was unfit for chronic experimentation, but can be used as a blueprint for a small-scale version to unveil the theoretical benefits of closed-loop control in chronic experimentation.
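The control loop described, a correction applied whenever the measured pressure strays beyond a threshold, can be caricatured with a toy plant model. The linear pressure response, the fatigue rate, and the controller gain below are invented for illustration; the actual study modulated amplitude or frequency of a stimulation signal against a living muscle:

```python
def simulate(target, n_steps=200, correction_threshold=0.04):
    """Toy closed loop: luminal pressure responds linearly to stimulation
    amplitude while muscle gain fades slowly (fatigue); the controller
    nudges the amplitude whenever the relative error exceeds the
    threshold (a dead band, like the 4% correction threshold reported)."""
    gain, amplitude = 1.0, 1.0
    history = []
    for _ in range(n_steps):
        gain *= 0.999                             # slow fatigue
        pressure = gain * amplitude
        error = (target - pressure) / target
        if abs(error) > correction_threshold:
            amplitude += 0.5 * error * amplitude  # proportional correction
        history.append(pressure)
    return history

hist = simulate(target=2.0)
print(hist[-1])  # settles near the target despite fading gain
```

The dead band keeps the loop from chasing noise, while the proportional correction tracks the slowly fading gain, which is the mechanism behind holding the target pressure within the 10% deviation reported.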
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2015-08-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
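The simplest flavor of such a screening, one model run per parameter on top of a baseline run, can be sketched as follows. The paper's fully automated method is sequential and more elaborate; this one-at-a-time version, with an invented toy model, is only meant to convey how informative parameters are separated from inert ones at a cost linear in the number of parameters:

```python
def screen(model, defaults, deltas, threshold):
    """One-at-a-time screening sketch: perturb each parameter from its
    default and keep those whose effect on the output exceeds the
    threshold. Cost: one baseline run plus one run per parameter."""
    baseline = model(defaults)
    informative = []
    for i, d in enumerate(deltas):
        perturbed = list(defaults)
        perturbed[i] += d
        if abs(model(perturbed) - baseline) > threshold:
            informative.append(i)
    return informative

# Toy model with 5 parameters, of which only x0 and x1 matter.
model = lambda x: 3.0 * x[0] + x[1] ** 2 + 1e-9 * x[4]
print(screen(model, [1.0] * 5, [0.5] * 5, threshold=0.01))  # [0, 1]
```

Restricting subsequent calibration or Sobol analysis to the indices returned here is what produces the large savings in model evaluations reported in the abstract.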