Estimating the Proportion of True Null Hypotheses Using the Pattern of Observed p-values
Tong, Tiejun; Feng, Zeny; Hilton, Julia S.; Zhao, Hongyu
2013-01-01
Estimating the proportion of true null hypotheses, π0, has attracted much attention in the recent statistical literature. Besides its apparent relevance for a set of specific scientific hypotheses, an accurate estimate of this parameter is key for many multiple testing procedures. Most existing methods for estimating π0 in the literature are motivated from the independence assumption of test statistics, which is often not true in reality. Simulations indicate that most existing estimators in the presence of the dependence among test statistics can be poor, mainly due to the increase of variation in these estimators. In this paper, we propose several data-driven methods for estimating π0 by incorporating the distribution pattern of the observed p-values as a practical approach to address potential dependence among test statistics. Specifically, we use a linear fit to give a data-driven estimate for the proportion of true-null p-values in (λ, 1] over the whole range [0, 1] instead of using the expected proportion at 1 − λ. We find that the proposed estimators may substantially decrease the variance of the estimated true null proportion and thus improve the overall performance. PMID:24078762
Estimating the Proportion of True Null Hypotheses Using the Pattern of Observed p-values.
Tong, Tiejun; Feng, Zeny; Hilton, Julia S; Zhao, Hongyu
2013-01-01
Estimating the proportion of true null hypotheses, π 0 , has attracted much attention in the recent statistical literature. Besides its apparent relevance for a set of specific scientific hypotheses, an accurate estimate of this parameter is key for many multiple testing procedures. Most existing methods for estimating π 0 in the literature are motivated from the independence assumption of test statistics, which is often not true in reality. Simulations indicate that most existing estimators in the presence of the dependence among test statistics can be poor, mainly due to the increase of variation in these estimators. In this paper, we propose several data-driven methods for estimating π 0 by incorporating the distribution pattern of the observed p -values as a practical approach to address potential dependence among test statistics. Specifically, we use a linear fit to give a data-driven estimate for the proportion of true-null p -values in (λ, 1] over the whole range [0, 1] instead of using the expected proportion at 1 - λ. We find that the proposed estimators may substantially decrease the variance of the estimated true null proportion and thus improve the overall performance.
Statistical power analysis in wildlife research
Steidl, R.J.; Hayes, J.P.
1997-01-01
Statistical power analysis can be used to increase the efficiency of research efforts and to clarify research results. Power analysis is most valuable in the design or planning phases of research efforts. Such prospective (a priori) power analyses can be used to guide research design and to estimate the number of samples necessary to achieve a high probability of detecting biologically significant effects. Retrospective (a posteriori) power analysis has been advocated as a method to increase information about hypothesis tests that were not rejected. However, estimating power for tests of null hypotheses that were not rejected with the effect size observed in the study is incorrect; these power estimates will always be a??0.50 when bias adjusted and have no relation to true power. Therefore, retrospective power estimates based on the observed effect size for hypothesis tests that were not rejected are misleading; retrospective power estimates are only meaningful when based on effect sizes other than the observed effect size, such as those effect sizes hypothesized to be biologically significant. Retrospective power analysis can be used effectively to estimate the number of samples or effect size that would have been necessary for a completed study to have rejected a specific null hypothesis. Simply presenting confidence intervals can provide additional information about null hypotheses that were not rejected, including information about the size of the true effect and whether or not there is adequate evidence to 'accept' a null hypothesis as true. We suggest that (1) statistical power analyses be routinely incorporated into research planning efforts to increase their efficiency, (2) confidence intervals be used in lieu of retrospective power analyses for null hypotheses that were not rejected to assess the likely size of the true effect, (3) minimum biologically significant effect sizes be used for all power analyses, and (4) if retrospective power estimates are to be reported, then the I?-level, effect sizes, and sample sizes used in calculations must also be reported.
Estimating the proportion of true null hypotheses when the statistics are discrete.
Dialsingh, Isaac; Austin, Stefanie R; Altman, Naomi S
2015-07-15
In high-dimensional testing problems π0, the proportion of null hypotheses that are true is an important parameter. For discrete test statistics, the P values come from a discrete distribution with finite support and the null distribution may depend on an ancillary statistic such as a table margin that varies among the test statistics. Methods for estimating π0 developed for continuous test statistics, which depend on a uniform or identical null distribution of P values, may not perform well when applied to discrete testing problems. This article introduces a number of π0 estimators, the regression and 'T' methods that perform well with discrete test statistics and also assesses how well methods developed for or adapted from continuous tests perform with discrete tests. We demonstrate the usefulness of these estimators in the analysis of high-throughput biological RNA-seq and single-nucleotide polymorphism data. implemented in R. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Chen, Xiongzhi; Doerge, Rebecca W; Heyse, Joseph F
2018-05-11
We consider multiple testing with false discovery rate (FDR) control when p values have discrete and heterogeneous null distributions. We propose a new estimator of the proportion of true null hypotheses and demonstrate that it is less upwardly biased than Storey's estimator and two other estimators. The new estimator induces two adaptive procedures, that is, an adaptive Benjamini-Hochberg (BH) procedure and an adaptive Benjamini-Hochberg-Heyse (BHH) procedure. We prove that the adaptive BH (aBH) procedure is conservative nonasymptotically. Through simulation studies, we show that these procedures are usually more powerful than their nonadaptive counterparts and that the adaptive BHH procedure is usually more powerful than the aBH procedure and a procedure based on randomized p-value. The adaptive procedures are applied to a study of HIV vaccine efficacy, where they identify more differentially polymorphic positions than the BH procedure at the same FDR level. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
P value and the theory of hypothesis testing: an explanation for new researchers.
Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël
2010-03-01
In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability to obtain an effect equal to or more extreme than the one observed presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
Huang, Peng; Ou, Ai-hua; Piantadosi, Steven; Tan, Ming
2014-11-01
We discuss the problem of properly defining treatment superiority through the specification of hypotheses in clinical trials. The need to precisely define the notion of superiority in a one-sided hypothesis test problem has been well recognized by many authors. Ideally designed null and alternative hypotheses should correspond to a partition of all possible scenarios of underlying true probability models P={P(ω):ω∈Ω} such that the alternative hypothesis Ha={P(ω):ω∈Ωa} can be inferred upon the rejection of null hypothesis Ho={P(ω):ω∈Ω(o)} However, in many cases, tests are carried out and recommendations are made without a precise definition of superiority or a specification of alternative hypothesis. Moreover, in some applications, the union of probability models specified by the chosen null and alternative hypothesis does not constitute a completed model collection P (i.e., H(o)∪H(a) is smaller than P). This not only imposes a strong non-validated assumption of the underlying true models, but also leads to different superiority claims depending on which test is used instead of scientific plausibility. Different ways to partition P fro testing treatment superiority often have different implications on sample size, power, and significance in both efficacy and comparative effectiveness trial design. Such differences are often overlooked. We provide a theoretical framework for evaluating the statistical properties of different specification of superiority in typical hypothesis testing. This can help investigators to select proper hypotheses for treatment comparison inclinical trial design. Copyright © 2014 Elsevier Inc. All rights reserved.
The Importance of Teaching Power in Statistical Hypothesis Testing
ERIC Educational Resources Information Center
Olinsky, Alan; Schumacher, Phyllis; Quinn, John
2012-01-01
In this paper, we discuss the importance of teaching power considerations in statistical hypothesis testing. Statistical power analysis determines the ability of a study to detect a meaningful effect size, where the effect size is the difference between the hypothesized value of the population parameter under the null hypothesis and the true value…
Wilkinson, Michael
2014-03-01
Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
An omnibus test for the global null hypothesis.
Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja
2018-01-01
Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses, but in testing whether none of the hypotheses is false. There are several possibilities how to test the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g. Bonferroni or Simes test). However, usually there is no a priori knowledge on the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R-package called omnibus.
Projecting adverse event incidence rates using empirical Bayes methodology.
Ma, Guoguang Julie; Ganju, Jitendra; Huang, Jing
2016-08-01
Although there is considerable interest in adverse events observed in clinical trials, projecting adverse event incidence rates in an extended period can be of interest when the trial duration is limited compared to clinical practice. A naïve method for making projections might involve modeling the observed rates into the future for each adverse event. However, such an approach overlooks the information that can be borrowed across all the adverse event data. We propose a method that weights each projection using a shrinkage factor; the adverse event-specific shrinkage is a probability, based on empirical Bayes methodology, estimated from all the adverse event data, reflecting evidence in support of the null or non-null hypotheses. Also proposed is a technique to estimate the proportion of true nulls, called the common area under the density curves, which is a critical step in arriving at the shrinkage factor. The performance of the method is evaluated by projecting from interim data and then comparing the projected results with observed results. The method is illustrated on two data sets. © The Author(s) 2013.
Dudoit, Sandrine; Gilbert, Houston N.; van der Laan, Mark J.
2014-01-01
Summary This article proposes resampling-based empirical Bayes multiple testing procedures for controlling a broad class of Type I error rates, defined as generalized tail probability (gTP) error rates, gTP(q, g) = Pr(g(Vn, Sn) > q), and generalized expected value (gEV) error rates, gEV(g) = E[g(Vn, Sn)], for arbitrary functions g(Vn, Sn) of the numbers of false positives Vn and true positives Sn. Of particular interest are error rates based on the proportion g(Vn, Sn) = Vn/(Vn + Sn) of Type I errors among the rejected hypotheses, such as the false discovery rate (FDR), FDR = E[Vn/(Vn + Sn)]. The proposed procedures offer several advantages over existing methods. They provide Type I error control for general data generating distributions, with arbitrary dependence structures among variables. Gains in power are achieved by deriving rejection regions based on guessed sets of true null hypotheses and null test statistics randomly sampled from joint distributions that account for the dependence structure of the data. The Type I error and power properties of an FDR-controlling version of the resampling-based empirical Bayes approach are investigated and compared to those of widely-used FDR-controlling linear step-up procedures in a simulation study. The Type I error and power trade-off achieved by the empirical Bayes procedures under a variety of testing scenarios allows this approach to be competitive with or outperform the Storey and Tibshirani (2003) linear step-up procedure, as an alternative to the classical Benjamini and Hochberg (1995) procedure. PMID:18932138
Bias and variance reduction in estimating the proportion of true-null hypotheses
Cheng, Yebin; Gao, Dexiang; Tong, Tiejun
2015-01-01
When testing a large number of hypotheses, estimating the proportion of true nulls, denoted by \\documentclass[12pt]{minimal} \\usepackage{amsmath} \\usepackage{wasysym} \\usepackage{amsfonts} \\usepackage{amssymb} \\usepackage{amsbsy} \\usepackage{upgreek} \\usepackage{mathrsfs} \\setlength{\\oddsidemargin}{-69pt} \\begin{document} }{}$\\pi _0$\\end{document}, becomes increasingly important. This quantity has many applications in practice. For instance, a reliable estimate of \\documentclass[12pt]{minimal} \\usepackage{amsmath} \\usepackage{wasysym} \\usepackage{amsfonts} \\usepackage{amssymb} \\usepackage{amsbsy} \\usepackage{upgreek} \\usepackage{mathrsfs} \\setlength{\\oddsidemargin}{-69pt} \\begin{document} }{}$\\pi _0$\\end{document} can eliminate the conservative bias of the Benjamini–Hochberg procedure on controlling the false discovery rate. It is known that most methods in the literature for estimating \\documentclass[12pt]{minimal} \\usepackage{amsmath} \\usepackage{wasysym} \\usepackage{amsfonts} \\usepackage{amssymb} \\usepackage{amsbsy} \\usepackage{upgreek} \\usepackage{mathrsfs} \\setlength{\\oddsidemargin}{-69pt} \\begin{document} }{}$\\pi _0$\\end{document} are conservative. Recently, some attempts have been paid to reduce such estimation bias. Nevertheless, they are either over bias corrected or suffering from an unacceptably large estimation variance. In this paper, we propose a new method for estimating \\documentclass[12pt]{minimal} \\usepackage{amsmath} \\usepackage{wasysym} \\usepackage{amsfonts} \\usepackage{amssymb} \\usepackage{amsbsy} \\usepackage{upgreek} \\usepackage{mathrsfs} \\setlength{\\oddsidemargin}{-69pt} \\begin{document} }{}$\\pi _0$\\end{document} that aims to reduce the bias and variance of the estimation simultaneously. To achieve this, we first utilize the probability density functions of false-null \\documentclass[12pt]{minimal} \\usepackage{amsmath} \\usepackage{wasysym} \\usepackage{amsfonts} \\usepackage{amssymb} \\usepackage{amsbsy} \\usepackage{upgreek} \\usepackage{mathrsfs} \\setlength{\\oddsidemargin}{-69pt} \\begin{document} }{}$p$\\end{document}-values and then propose a novel algorithm to estimate the quantity of \\documentclass[12pt]{minimal} \\usepackage{amsmath} \\usepackage{wasysym} \\usepackage{amsfonts} \\usepackage{amssymb} \\usepackage{amsbsy} \\usepackage{upgreek} \\usepackage{mathrsfs} \\setlength{\\oddsidemargin}{-69pt} \\begin{document} }{}$\\pi _0$\\end{document}. The statistical behavior of the proposed estimator is also investigated. Finally, we carry out extensive simulation studies and several real data analysis to evaluate the performance of the proposed estimator. Both simulated and real data demonstrate that the proposed method may improve the existing literature significantly. PMID:24963010
Juswardy, Budi; Xiao, Feng; Alameh, Kamal
2009-03-16
This paper proposes a novel Opto-VLSI-based tunable true-time delay generation unit for adaptively steering the nulls of microwave phased array antennas. Arbitrary single or multiple true-time delays can simultaneously be synthesized for each antenna element by slicing an RF-modulated broadband optical source and routing specific sliced wavebands through an Opto-VLSI processor to a high-dispersion fiber. Experimental results are presented, which demonstrate the principle of the true-time delay unit through the generation of 5 arbitrary true-time delays of up to 2.5 ns each. (c) 2009 Optical Society of America
Employee Attitudes of the Organizational Culture: Assessment of a TQM implementation Process
1990-09-01
68 Analysis of Hypotheses..............70 Null Hypotheses Hola -Holi .............. 71 T-test...............................71 Summary of...Analyses for Hola -Holi 72 Null Hypotheses Ho2a-Ho2i .............. 73 One-way Analysis of Variance ......... 73 Summary of Analyses for Ho2a-Ho2i .... 77...supervisory and non- supervisory organizational members with regard to their attitudes about the organization’s quality culture. Hola . There will be no
The Importance of Proving the Null
Gallistel, C. R.
2010-01-01
Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? PMID:19348549
Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf
2015-03-01
We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: They inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results or more extreme results if the null hypothesis of no effect was true. Calculations of the more informative PPV and NPV require a priori estimate of the probability (R). The present document discusses challenges of estimating R.
In the Beginning-There Is the Introduction-and Your Study Hypothesis.
Vetter, Thomas R; Mascha, Edward J
2017-05-01
Writing a manuscript for a medical journal is very akin to writing a newspaper article-albeit a scholarly one. Like any journalist, you have a story to tell. You need to tell your story in a way that is easy to follow and makes a compelling case to the reader. Although recommended since the beginning of the 20th century, the conventional Introduction-Methods-Results-And-Discussion (IMRAD) scientific reporting structure has only been the standard since the 1980s. The Introduction should be focused and succinct in communicating the significance, background, rationale, study aims or objectives, and the primary (and secondary, if appropriate) study hypotheses. Hypothesis testing involves posing both a null and an alternative hypothesis. The null hypothesis proposes that no difference or association exists on the outcome variable of interest between the interventions or groups being compared. The alternative hypothesis is the opposite of the null hypothesis and thus typically proposes that a difference in the population does exist between the groups being compared on the parameter of interest. Most investigators seek to reject the null hypothesis because of their expectation that the studied intervention does result in a difference between the study groups or that the association of interest does exist. Therefore, in most clinical and basic science studies and manuscripts, the alternative hypothesis is stated, not the null hypothesis. Also, in the Introduction, the alternative hypothesis is typically stated in the direction of interest, or the expected direction. However, when assessing the association of interest, researchers typically look in both directions (ie, favoring 1 group or the other) by conducting a 2-tailed statistical test because the true direction of the effect is typically not known, and either direction would be important to report.
Bias in error estimation when using cross-validation for model selection.
Varma, Sudhir; Simon, Richard
2006-02-23
Cross-validation (CV) is an effective method for estimating the prediction error of a classifier. Some recent articles have proposed methods for optimizing classifiers by choosing classifier parameter values that minimize the CV error estimate. We have evaluated the validity of using the CV error estimate of the optimized classifier as an estimate of the true error expected on independent data. We used CV to optimize the classification parameters for two kinds of classifiers; Shrunken Centroids and Support Vector Machines (SVM). Random training datasets were created, with no difference in the distribution of the features between the two classes. Using these "null" datasets, we selected classifier parameter values that minimized the CV error estimate. 10-fold CV was used for Shrunken Centroids while Leave-One-Out-CV (LOOCV) was used for the SVM. Independent test data was created to estimate the true error. With "null" and "non null" (with differential expression between the classes) data, we also tested a nested CV procedure, where an inner CV loop is used to perform the tuning of the parameters while an outer CV is used to compute an estimate of the error. The CV error estimate for the classifier with the optimal parameters was found to be a substantially biased estimate of the true error that the classifier would incur on independent data. Even though there is no real difference between the two classes for the "null" datasets, the CV error estimate for the Shrunken Centroid with the optimal parameters was less than 30% on 18.5% of simulated training data-sets. For SVM with optimal parameters the estimated error rate was less than 30% on 38% of "null" data-sets. Performance of the optimized classifiers on the independent test set was no better than chance. The nested CV procedure reduces the bias considerably and gives an estimate of the error that is very close to that obtained on the independent testing set for both Shrunken Centroids and SVM classifiers for "null" and "non-null" data distributions. We show that using CV to compute an error estimate for a classifier that has itself been tuned using CV gives a significantly biased estimate of the true error. Proper use of CV for estimating true error of a classifier developed using a well defined algorithm requires that all steps of the algorithm, including classifier parameter tuning, be repeated in each CV loop. A nested CV procedure provides an almost unbiased estimate of the true error.
The importance of proving the null.
Gallistel, C R
2009-04-01
Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? (c) 2009 APA, all rights reserved
The Importance of Proving the Null
ERIC Educational Resources Information Center
Gallistel, C. R.
2009-01-01
Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is…
Developing the research hypothesis.
Toledo, Alexander H; Flikkema, Robert; Toledo-Pereyra, Luis H
2011-01-01
The research hypothesis is needed for a sound and well-developed research study. The research hypothesis contributes to the solution of the research problem. Types of research hypotheses include inductive and deductive, directional and non-directional, and null and alternative hypotheses. Rejecting the null hypothesis and accepting the alternative hypothesis is the basis for building a good research study. This work reviews the most important aspects of organizing and establishing an efficient and complete hypothesis.
Précis of statistical significance: rationale, validity, and utility.
Chow, S L
1998-04-01
The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.
Explorations in statistics: hypothesis tests and P values.
Curran-Everett, Douglas
2009-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of Explorations in Statistics delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what we observe in the experiment to what we expect to see if the null hypothesis is true. The P value associated with the magnitude of that test statistic answers this question: if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got? Although statisticians continue to stress the limitations of hypothesis tests, there are two realities we must acknowledge: hypothesis tests are ingrained within science, and the simple test of a null hypothesis can be useful. As a result, it behooves us to explore the notions of hypothesis tests, test statistics, and P values.
Conditioning Military Women for Optimal Performance: Effects of Contraceptive Use
1997-10-01
female soldier with three months of uncomplicated birth control . This contraceptive technique is worthy of study because it is used by an ever...of birth control (EU-OV) B. time: pre-training/heat acclimation post-training/heat acclimation 10 Null Hypotheses A. Null Hypotheses Associated with...IgG = immunoglobulin G birth control = eight subjects using either oral contraceptive (n = 7) or Depo Provera (n = 1) combined no birth control = eumenorrheic
The ranking probability approach and its usage in design and analysis of large-scale studies.
Kuo, Chia-Ling; Zaykin, Dmitri
2013-01-01
In experiments with many statistical tests there is need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal [Formula: see text]-level such as 0.05 is adjusted by the number of tests, [Formula: see text], i.e., as 0.05/[Formula: see text]. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed [Formula: see text]-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability [Formula: see text] is controlled, defined as the probability of making at least [Formula: see text] correct rejections while rejecting hypotheses with [Formula: see text] smallest P-values. The two approaches are statistically related. Probability that the smallest P-value is a true signal (i.e., [Formula: see text]) is equal to the power at the level [Formula: see text], to an very good excellent approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when [Formula: see text] is very large and [Formula: see text] is small. Precision is largely restored when three values with the respective abundances are used instead of a single typical effect size value.
Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene
2015-05-01
In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.
Concerns regarding a call for pluralism of information theory and hypothesis testing
Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.
2007-01-01
1. Stephens et al . (2005) argue for `pluralism? in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. `no difference?) is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al . (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj| data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data |H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
Some controversial multiple testing problems in regulatory applications.
Hung, H M James; Wang, Sue-Jane
2009-01-01
Multiple testing problems in regulatory applications are often more challenging than the problems of handling a set of mathematical symbols representing multiple null hypotheses under testing. In the union-intersection setting, it is important to define a family of null hypotheses relevant to the clinical questions at issue. The distinction between primary endpoint and secondary endpoint needs to be considered properly in different clinical applications. Without proper consideration, the widely used sequential gate keeping strategies often impose too many logical restrictions to make sense, particularly to deal with the problem of testing multiple doses and multiple endpoints, the problem of testing a composite endpoint and its component endpoints, and the problem of testing superiority and noninferiority in the presence of multiple endpoints. Partitioning the null hypotheses involved in closed testing into clinical relevant orderings or sets can be a viable alternative to resolving the illogical problems requiring more attention from clinical trialists in defining the clinical hypotheses or clinical question(s) at the design stage. In the intersection-union setting there is little room for alleviating the stringency of the requirement that each endpoint must meet the same intended alpha level, unless the parameter space under the null hypothesis can be substantially restricted. Such restriction often requires insurmountable justification and usually cannot be supported by the internal data. Thus, a possible remedial approach to alleviate the possible conservatism as a result of this requirement is a group-sequential design strategy that starts with a conservative sample size planning and then utilizes an alpha spending function to possibly reach the conclusion early.
Recognition of VLSI Module Isomorphism
1990-03-01
forthforth->next; 6.5 else{ prev4=prev4->next; forth=forth->next; if (header-. nenI ->tai==third){ header-.nevrI->tail=prev3; prev3->next=NULL; end...end=TRUE; if (header-. nenI ->head=third){ header-.newn->head=third->next; I if((third!=prev3)&&(finished!=TRUE)){ prev3->next=prev3->next->next; third
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on nontesting aspects of classical frequentists methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
ERIC Educational Resources Information Center
Tryon, Warren W.; Lewis, Charles
2008-01-01
Evidence of group matching frequently takes the form of a nonsignificant test of statistical difference. Theoretical hypotheses of no difference are also tested in this way. These practices are flawed in that null hypothesis statistical testing provides evidence against the null hypothesis and failing to reject H[subscript 0] is not evidence…
How to talk about protein‐level false discovery rates in shotgun proteomics
The, Matthew; Tasnim, Ayesha
2016-01-01
A frequently sought output from a shotgun proteomics experiment is a list of proteins that we believe to have been present in the analyzed sample before proteolytic digestion. The standard technique to control for errors in such lists is to enforce a preset threshold for the false discovery rate (FDR). Many consider protein‐level FDRs a difficult and vague concept, as the measurement entities, spectra, are manifestations of peptides and not proteins. Here, we argue that this confusion is unnecessary and provide a framework on how to think about protein‐level FDRs, starting from its basic principle: the null hypothesis. Specifically, we point out that two competing null hypotheses are used concurrently in today's protein inference methods, which has gone unnoticed by many. Using simulations of a shotgun proteomics experiment, we show how confusing one null hypothesis for the other can lead to serious discrepancies in the FDR. Furthermore, we demonstrate how the same simulations can be used to verify FDR estimates of protein inference methods. In particular, we show that, for a simple protein inference method, decoy models can be used to accurately estimate protein‐level FDRs for both competing null hypotheses. PMID:27503675
A toy Penrose inequality and its proof
NASA Astrophysics Data System (ADS)
Bengtsson, Ingemar; Jakobsson, Emma
2016-12-01
We formulate and prove a toy version of the Penrose inequality. The formulation mimics the original Penrose inequality in which the scenario is the following: a shell of null dust collapses in Minkowski space and a marginally trapped surface forms on it. Through a series of arguments relying on established assumptions, an inequality relating the area of this surface to the total energy of the shell is formulated. Then a further reformulation turns the inequality into a statement relating the area and the outer null expansion of a class of surfaces in Minkowski space itself. The inequality has been proven to hold true in many special cases, but there is no proof in general. In the toy version here presented, an analogous inequality in (2 + 1)-dimensional anti-de Sitter space turns out to hold true.
Bayes Factor Approaches for Testing Interval Null Hypotheses
ERIC Educational Resources Information Center
Morey, Richard D.; Rouder, Jeffrey N.
2011-01-01
Psychological theories are statements of constraint. The role of hypothesis testing in psychology is to test whether specific theoretical constraints hold in data. Bayesian statistics is well suited to the task of finding supporting evidence for constraint, because it allows for comparing evidence for 2 hypotheses against each another. One issue…
Significance levels for studies with correlated test statistics.
Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S
2008-07-01
When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
Bayes factor and posterior probability: Complementary statistical evidence to p-value.
Lin, Ruitao; Yin, Guosheng
2015-09-01
As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although the smaller the p-value, the more significant the statistical test, it is difficult to perceive the p-value in a probability scale and quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strong the data support the null. We make a comparison of the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value only provides one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
A two-hypothesis approach to establishing a life detection/biohazard protocol for planetary samples
NASA Astrophysics Data System (ADS)
Conley, Catharine; Steele, Andrew
2016-07-01
The COSPAR policy on performing a biohazard assessment on samples brought from Mars to Earth is framed in the context of a concern for false-positive results. However, as noted during the 2012 Workshop for Life Detection in Samples from Mars (ref. Kminek et al., 2014), a more significant concern for planetary samples brought to Earth is false-negative results, because an undetected biohazard could increase risk to the Earth. This is the reason that stringent contamination control must be a high priority for all Category V Restricted Earth Return missions. A useful conceptual framework for addressing these concerns involves two complementary 'null' hypotheses: testing both of them, together, would allow statistical and community confidence to be developed regarding one or the other conclusion. As noted above, false negatives are of primary concern for safety of the Earth, so the 'Earth Safety null hypothesis' -- that must be disproved to assure low risk to the Earth from samples introduced by Category V Restricted Earth Return missions -- is 'There is native life in these samples.' False positives are of primary concern for Astrobiology, so the 'Astrobiology null hypothesis' -- that must be disproved in order to demonstrate the existence of extraterrestrial life is 'There is no life in these samples.' The presence of Earth contamination would render both of these hypotheses more difficult to disprove. Both these hypotheses can be tested following a strict science protocol; analyse, interprete, test the hypotheses and repeat. The science measurements undertaken are then done in an iterative fashion that responds to discovery with both hypotheses testable from interpretation of the scientific data. This is a robust, community involved activity that ensures maximum science return with minimal sample use.
Szucs, Denes; Ioannidis, John P A
2017-03-01
We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64-1.46) for nominally statistically significant results and D = 0.24 (0.11-0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement through the past half-century. This is so because sample sizes have remained small. Assuming similar true effect sizes in both disciplines, power was lower in cognitive neuroscience than in psychology. Journal impact factors negatively correlated with power. Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature. In light of our findings, the recently reported low replication success in psychology is realistic, and worse performance may be expected for cognitive neuroscience.
Hypothesis testing and earthquake prediction.
Jackson, D D
1996-04-30
Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earth-quakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
Hypothesis testing and earthquake prediction.
Jackson, D D
1996-01-01
Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earth-quakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663
How to talk about protein-level false discovery rates in shotgun proteomics.
The, Matthew; Tasnim, Ayesha; Käll, Lukas
2016-09-01
A frequently sought output from a shotgun proteomics experiment is a list of proteins that we believe to have been present in the analyzed sample before proteolytic digestion. The standard technique to control for errors in such lists is to enforce a preset threshold for the false discovery rate (FDR). Many consider protein-level FDRs a difficult and vague concept, as the measurement entities, spectra, are manifestations of peptides and not proteins. Here, we argue that this confusion is unnecessary and provide a framework on how to think about protein-level FDRs, starting from its basic principle: the null hypothesis. Specifically, we point out that two competing null hypotheses are used concurrently in today's protein inference methods, which has gone unnoticed by many. Using simulations of a shotgun proteomics experiment, we show how confusing one null hypothesis for the other can lead to serious discrepancies in the FDR. Furthermore, we demonstrate how the same simulations can be used to verify FDR estimates of protein inference methods. In particular, we show that, for a simple protein inference method, decoy models can be used to accurately estimate protein-level FDRs for both competing null hypotheses. © 2016 The Authors. Proteomics Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Social Development in Six-Year-Old Identical and Fraternal Twins.
ERIC Educational Resources Information Center
Schave, Barbara; And Others
Four null hypotheses were formulated to test for relationships between pairs of identical and fraternal twins and their parents on measures of locus of control. Two additional hypotheses were formulated to test for differences between mean scores of identical and fraternal twins and scores of their parents on these same constructs. Twenty pairs of…
Testing Small Variance Priors Using Prior-Posterior Predictive p Values.
Hoijtink, Herbert; van de Schoot, Rens
2017-04-03
Muthén and Asparouhov (2012) propose to evaluate model fit in structural equation models based on approximate (using small variance priors) instead of exact equality of (combinations of) parameters to zero. This is an important development that adequately addresses Cohen's (1994) The Earth is Round (p < .05), which stresses that point null-hypotheses are so precise that small and irrelevant differences from the null-hypothesis may lead to their rejection. It is tempting to evaluate small variance priors using readily available approaches like the posterior predictive p value and the DIC. However, as will be shown, both are not suited for the evaluation of models based on small variance priors. In this article, a well behaving alternative, the prior-posterior predictive p value, will be introduced. It will be shown that it is consistent, the distributions under the null and alternative hypotheses will be elaborated, and it will be applied to testing whether the difference between 2 means and the size of a correlation are relevantly different from zero. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Compensation for PKMζ in long-term potentiation and spatial long-term memory in mutant mice.
Tsokas, Panayiotis; Hsieh, Changchi; Yao, Yudong; Lesburguères, Edith; Wallace, Emma Jane Claire; Tcherepanov, Andrew; Jothianandan, Desingarao; Hartley, Benjamin Rush; Pan, Ling; Rivard, Bruno; Farese, Robert V; Sajan, Mini P; Bergold, Peter John; Hernández, Alejandro Iván; Cottrell, James E; Shouval, Harel Z; Fenton, André Antonio; Sacktor, Todd Charlton
2016-05-17
PKMζ is a persistently active PKC isoform proposed to maintain late-LTP and long-term memory. But late-LTP and memory are maintained without PKMζ in PKMζ-null mice. Two hypotheses can account for these findings. First, PKMζ is unimportant for LTP or memory. Second, PKMζ is essential for late-LTP and long-term memory in wild-type mice, and PKMζ-null mice recruit compensatory mechanisms. We find that whereas PKMζ persistently increases in LTP maintenance in wild-type mice, PKCι/λ, a gene-product closely related to PKMζ, persistently increases in LTP maintenance in PKMζ-null mice. Using a pharmacogenetic approach, we find PKMζ-antisense in hippocampus blocks late-LTP and spatial long-term memory in wild-type mice, but not in PKMζ-null mice without the target mRNA. Conversely, a PKCι/λ-antagonist disrupts late-LTP and spatial memory in PKMζ-null mice but not in wild-type mice. Thus, whereas PKMζ is essential for wild-type LTP and long-term memory, persistent PKCι/λ activation compensates for PKMζ loss in PKMζ-null mice.
Null but not void: considerations for hypothesis testing.
Shaw, Pamela A; Proschan, Michael A
2013-01-30
Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.
Unicorns do exist: a tutorial on "proving" the null hypothesis.
Streiner, David L
2003-12-01
Introductory statistics classes teach us that we can never prove the null hypothesis; all we can do is reject or fail to reject it. However, there are times when it is necessary to try to prove the nonexistence of a difference between groups. This most often happens within the context of comparing a new treatment against an established one and showing that the new intervention is not inferior to the standard. This article first outlines the logic of "noninferiority" testing by differentiating between the null hypothesis (that which we are trying to nullify) and the "nill" hypothesis (there is no difference), reversing the role of the null and alternate hypotheses, and defining an interval within which groups are said to be equivalent. We then work through an example and show how to calculate sample sizes for noninferiority studies.
Compensation for PKMζ in long-term potentiation and spatial long-term memory in mutant mice
Tsokas, Panayiotis; Hsieh, Changchi; Yao, Yudong; Lesburguères, Edith; Wallace, Emma Jane Claire; Tcherepanov, Andrew; Jothianandan, Desingarao; Hartley, Benjamin Rush; Pan, Ling; Rivard, Bruno; Farese, Robert V; Sajan, Mini P; Bergold, Peter John; Hernández, Alejandro Iván; Cottrell, James E; Shouval, Harel Z; Fenton, André Antonio; Sacktor, Todd Charlton
2016-01-01
PKMζ is a persistently active PKC isoform proposed to maintain late-LTP and long-term memory. But late-LTP and memory are maintained without PKMζ in PKMζ-null mice. Two hypotheses can account for these findings. First, PKMζ is unimportant for LTP or memory. Second, PKMζ is essential for late-LTP and long-term memory in wild-type mice, and PKMζ-null mice recruit compensatory mechanisms. We find that whereas PKMζ persistently increases in LTP maintenance in wild-type mice, PKCι/λ, a gene-product closely related to PKMζ, persistently increases in LTP maintenance in PKMζ-null mice. Using a pharmacogenetic approach, we find PKMζ-antisense in hippocampus blocks late-LTP and spatial long-term memory in wild-type mice, but not in PKMζ-null mice without the target mRNA. Conversely, a PKCι/λ-antagonist disrupts late-LTP and spatial memory in PKMζ-null mice but not in wild-type mice. Thus, whereas PKMζ is essential for wild-type LTP and long-term memory, persistent PKCι/λ activation compensates for PKMζ loss in PKMζ-null mice. DOI: http://dx.doi.org/10.7554/eLife.14846.001 PMID:27187150
Roberts, Sarah C M
2011-01-01
This systematic review focuses on research about macro-level gender equality and violence against women (VAW) and identifies conceptually and theoretically driven hypotheses as well as lessons relevant for alcohol research. Hypotheses include: amelioration--increased equality decreases VAW; backlash--increased equality increases VAW; and convergence--increased equality reduces the gender gap; and hypotheses that distinguish between relative and absolute status, with relative status comparing men's and women's status and absolute status measuring women's status without regard to men. Systematic review of studies published through June 2009 identified through PubMed and Web of Science, as well as citing and cited articles. A total of 30 studies are included. Of 85 findings examining amelioration/backlash, 25% support amelioration, 22% backlash; and 53% are null. Of 13 findings examining convergence, 31% support and 23% are inconsistent with convergence; 46% are null. Neither the existence nor the direction of the equality and VAW relationship can be assumed. This suggests that the relationship between macro-level gender equality and alcohol should also not be assumed, but rather investigated through research.
Gaffney, P. M.; Scott, T. M.; Koehn, R. K.; Diehl, W. J.
1990-01-01
Allozyme surveys of marine invertebrates commonly report heterozygote deficiencies, a correlation between multiple locus heterozygosity and size, or both. Hypotheses advanced to account for these phenomena include inbreeding, null alleles, selection, spatial or temporal Wahlund effects, aneuploidy and molecular imprinting. Previous studies have been unable to clearly distinguish among these alternative hypotheses. This report analyzes a large data set (1906 individuals, 15 allozyme loci) from a single field collection of the coot clam Mulinia lateralis and demonstrates (1) significant heterozygote deficiencies at 13 of 15 loci, (2) a correlation between the magnitude of heterozygote deficiency at a locus and the effect of heterozygosity at that locus on shell length, and (3) a distribution of multilocus heterozygosity which deviates from that predicted by observed single-locus heterozygosities. A critical examination of the abovementioned hypotheses as sources of these findings rules out inbreeding, null alleles, aneuploidy, population mixing and imprinting as sole causes. The pooling of larval subpopulations subjected to varying degrees of selection, aneuploidy or imprinting could account for the patterns observed in this study. PMID:2311919
Szucs, Denes; Ioannidis, John P. A.
2017-01-01
We have empirically assessed the distribution of published effect sizes and estimated power by analyzing 26,841 statistical records from 3,801 cognitive neuroscience and psychology papers published recently. The reported median effect size was D = 0.93 (interquartile range: 0.64–1.46) for nominally statistically significant results and D = 0.24 (0.11–0.42) for nonsignificant results. Median power to detect small, medium, and large effects was 0.12, 0.44, and 0.73, reflecting no improvement through the past half-century. This is so because sample sizes have remained small. Assuming similar true effect sizes in both disciplines, power was lower in cognitive neuroscience than in psychology. Journal impact factors negatively correlated with power. Assuming a realistic range of prior probabilities for null hypotheses, false report probability is likely to exceed 50% for the whole literature. In light of our findings, the recently reported low replication success in psychology is realistic, and worse performance may be expected for cognitive neuroscience. PMID:28253258
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P < 0.001), 9.1% were moderately significant (P ≥ 0.001 to < 0.01), 11.7% were weakly significant (P ≥ 0.01 to < 0.05), and 53.2% were nonsignificant (P ≥ 0.05). We noted three irregularities: (1) high proportion of P-values <0.001, especially in observational studies, (2) excess of P-values equal to 1, and (3) about twice as many P-values less than 0.05 compared with those more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
"Trust the researchers": flying in the face of evidence.
Noble, John H
2017-01-01
There are always rival hypotheses to explain away the one that is posited as the most likely to be true. Context and Occam's razor - the principle that among competing hypotheses, the one with the fewest assumptions should be selected - ultimately point to which hypothesis is the most likely to be true.
Bayesian models based on test statistics for multiple hypothesis testing problems.
Ji, Yuan; Lu, Yiling; Mills, Gordon B
2008-04-01
We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
One-way ANOVA based on interval information
NASA Astrophysics Data System (ADS)
Hesamian, Gholamreza
2016-08-01
This paper deals with extending the one-way analysis of variance (ANOVA) to the case where the observed data are represented by closed intervals rather than real numbers. In this approach, first a notion of interval random variable is introduced. Especially, a normal distribution with interval parameters is introduced to investigate hypotheses about the equality of interval means or test the homogeneity of interval variances assumption. Moreover, the least significant difference (LSD method) for investigating multiple comparison of interval means is developed when the null hypothesis about the equality of means is rejected. Then, at a given interval significance level, an index is applied to compare the interval test statistic and the related interval critical value as a criterion to accept or reject the null interval hypothesis of interest. Finally, the method of decision-making leads to some degrees to accept or reject the interval hypotheses. An applied example will be used to show the performance of this method.
Solomon, Gary S; Haase, Richard F; Kuhn, Andrew
2013-02-01
Sports neuropsychology has emerged as a specialty area within the field of clinical neuropsychology. The role of the sports neuropsychologist, rooted in baseline and post-concussion testing, has evolved to include other clinical domains, including the clinical assessment of potential draft picks. There is no published information on the neurocognitive characteristics of these draft picks. We sought to determine whether elite NFL draft picks differed from NFL roster athletes on neurocognitive (ImPACT) and biopsychosocial characteristics, and given that no published data exists for this population, adopted null hypotheses. Null hypotheses were rejected for two of the four ImPACT scores, as elite draft picks scored higher on measures of visual motor speed and reaction time than roster NFL athletes. Subtle but distinct neurocognitive differences are noted when comparing elite NFL draft picks with norms from a cumulative roster of a single NFL team.
ERIC Educational Resources Information Center
Case, Catherine; Whitaker, Douglas
2016-01-01
In the criminal justice system, defendants accused of a crime are presumed innocent until proven guilty. Statistical inference in any context is built on an analogous principle: The null hypothesis--often a hypothesis of "no difference" or "no effect"--is presumed true unless there is sufficient evidence against it. In this…
Statistical significance versus clinical relevance.
van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G
2017-04-01
In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the times upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
Bayesian evaluation of effect size after replicating an original study
van Aert, Robbie C. M.; van Assen, Marcel A. L. M.
2017-01-01
The vast majority of published results in the literature is statistically significant, which raises concerns about their reliability. The Reproducibility Project Psychology (RPP) and Experimental Economics Replication Project (EE-RP) both replicated a large number of published studies in psychology and economics. The original study and replication were statistically significant in 36.1% in RPP and 68.8% in EE-RP suggesting many null effects among the replicated studies. However, evidence in favor of the null hypothesis cannot be examined with null hypothesis significance testing. We developed a Bayesian meta-analysis method called snapshot hybrid that is easy to use and understand and quantifies the amount of evidence in favor of a zero, small, medium and large effect. The method computes posterior model probabilities for a zero, small, medium, and large effect and adjusts for publication bias by taking into account that the original study is statistically significant. We first analytically approximate the methods performance, and demonstrate the necessity to control for the original study’s significance to enable the accumulation of evidence for a true zero effect. Then we applied the method to the data of RPP and EE-RP, showing that the underlying effect sizes of the included studies in EE-RP are generally larger than in RPP, but that the sample sizes of especially the included studies in RPP are often too small to draw definite conclusions about the true effect size. We also illustrate how snapshot hybrid can be used to determine the required sample size of the replication akin to power analysis in null hypothesis significance testing and present an easy to use web application (https://rvanaert.shinyapps.io/snapshot/) and R code for applying the method. PMID:28388646
Goovaerts, Pierre; Jacquez, Geoffrey M
2004-01-01
Background Complete Spatial Randomness (CSR) is the null hypothesis employed by many statistical tests for spatial pattern, such as local cluster or boundary analysis. CSR is however not a relevant null hypothesis for highly complex and organized systems such as those encountered in the environmental and health sciences in which underlying spatial pattern is present. This paper presents a geostatistical approach to filter the noise caused by spatially varying population size and to generate spatially correlated neutral models that account for regional background obtained by geostatistical smoothing of observed mortality rates. These neutral models were used in conjunction with the local Moran statistics to identify spatial clusters and outliers in the geographical distribution of male and female lung cancer in Nassau, Queens, and Suffolk counties, New York, USA. Results We developed a typology of neutral models that progressively relaxes the assumptions of null hypotheses, allowing for the presence of spatial autocorrelation, non-uniform risk, and incorporation of spatially heterogeneous population sizes. Incorporation of spatial autocorrelation led to fewer significant ZIP codes than found in previous studies, confirming earlier claims that CSR can lead to over-identification of the number of significant spatial clusters or outliers. Accounting for population size through geostatistical filtering increased the size of clusters while removing most of the spatial outliers. Integration of regional background into the neutral models yielded substantially different spatial clusters and outliers, leading to the identification of ZIP codes where SMR values significantly depart from their regional background. Conclusion The approach presented in this paper enables researchers to assess geographic relationships using appropriate null hypotheses that account for the background variation extant in real-world systems. In particular, this new methodology allows one to identify geographic pattern above and beyond background variation. The implementation of this approach in spatial statistical software will facilitate the detection of spatial disparities in mortality rates, establishing the rationale for targeted cancer control interventions, including consideration of health services needs, and resource allocation for screening and diagnostic testing. It will allow researchers to systematically evaluate how sensitive their results are to assumptions implicit under alternative null hypotheses. PMID:15272930
ERIC Educational Resources Information Center
Stallings, William M.
In the educational research literature alpha, the a priori level of significance, and p, the a posteriori probability of obtaining a test statistic of at least a certain value when the null hypothesis is true, are often confused. Explanations for this confusion are offered. Paradoxically, alpha retains a prominent place in textbook discussions of…
To P or Not to P: Backing Bayesian Statistics.
Buchinsky, Farrel J; Chadha, Neil K
2017-12-01
In biomedical research, it is imperative to differentiate chance variation from truth before we generalize what we see in a sample of subjects to the wider population. For decades, we have relied on null hypothesis significance testing, where we calculate P values for our data to decide whether to reject a null hypothesis. This methodology is subject to substantial misinterpretation and errant conclusions. Instead of working backward by calculating the probability of our data if the null hypothesis were true, Bayesian statistics allow us instead to work forward, calculating the probability of our hypothesis given the available data. This methodology gives us a mathematical means of incorporating our "prior probabilities" from previous study data (if any) to produce new "posterior probabilities." Bayesian statistics tell us how confidently we should believe what we believe. It is time to embrace and encourage their use in our otolaryngology research.
Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC
Templeton, Alan R.
2009-01-01
Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error that creates pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but convergences of the approximations used in NCPA are well defined whereas those in ABC are not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of tested hypothesis is known in NCPA, but not for ABC. As a consequence, the “probabilities” generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. PMID:19192182
Red hair is the null phenotype of MC1R.
Beaumont, Kimberley A; Shekar, Sri N; Cook, Anthony L; Duffy, David L; Sturm, Richard A
2008-08-01
The Melanocortin-1 Receptor (MC1R) is a G-protein coupled receptor, which is responsible for production of the darker eumelanin pigment and the tanning response. The MC1R gene has many polymorphisms, some of which have been linked to variation in pigmentation phenotypes within human populations. In particular, the p.D84E, p.R151C, p.R160W and p.D294 H alleles have been strongly associated with red hair, fair skin and increased skin cancer risk. These red hair colour (RHC) variants are relatively well described and are thought to result in altered receptor function, while still retaining varying levels of signaling ability in vitro. The mouse Mc1r null phenotype is yellow fur colour, the p.R151C, p.R160W and p.D294 H alleles were able to partially rescue this phenotype, leading to the question of what the true null phenotype of MC1R would be in humans. Due to the rarity of MC1R null alleles in human populations, they have only been found in the heterozygous state until now. We report here the first case of a homozygous MC1R null individual, phenotypic analysis indicates that red hair and fair skin is found in the absence of MC1R function.
Roberts, Sarah C.M.
2011-01-01
Aims: This systematic review focuses on research about macro-level gender equality and violence against women (VAW) and identifies conceptually and theoretically driven hypotheses as well as lessons relevant for alcohol research. Hypotheses include: amelioration—increased equality decreases VAW; backlash—increased equality increases VAW; and convergence—increased equality reduces the gender gap; and hypotheses that distinguish between relative and absolute status, with relative status comparing men's and women's status and absolute status measuring women's status without regard to men. Methods: Systematic review of studies published through June 2009 identified through PubMed and Web of Science, as well as citing and cited articles. Results: A total of 30 studies are included. Of 85 findings examining amelioration/backlash, 25% support amelioration, 22% backlash; and 53% are null. Of 13 findings examining convergence, 31% support and 23% are inconsistent with convergence; 46% are null. Conclusion: Neither the existence nor the direction of the equality and VAW relationship can be assumed. This suggests that the relationship between macro-level gender equality and alcohol should also not be assumed, but rather investigated through research. PMID:21239417
How Often Is p[subscript rep] Close to the True Replication Probability?
ERIC Educational Resources Information Center
Trafimow, David; MacDonald, Justin A.; Rice, Stephen; Clason, Dennis L.
2010-01-01
Largely due to dissatisfaction with the standard null hypothesis significance testing procedure, researchers have begun to consider alternatives. For example, Killeen (2005a) has argued that researchers should calculate p[subscript rep] that is purported to indicate the probability that, if the experiment in question were replicated, the obtained…
Absence of Wip1 partially rescues Atm deficiency phenotypes in mice
Darlington, Yolanda; Nguyen, Thuy-Ai; Moon, Sung-Hwan; Herron, Alan; Rao, Pulivarthi; Zhu, Chengming; Lu, Xiongbin; Donehower, Lawrence A.
2011-01-01
Wildtype p53-Induced Phosphatase 1 (WIP1) is a serine/threonine phosphatase that dephosphorylates proteins in the ataxia telangiectasia mutated (ATM)-initiated DNA damage response pathway. WIP1 may play a homeostatic role in ATM signaling by returning the cell to a normal pre-stress state following completion of DNA repair. To better understand the effects of WIP1 on ATM signaling, we crossed Atm-deficient mice to Wip1-deficient mice and characterized phenotypes of the double knockout progeny. We hypothesized that the absence of Wip1 might rescue Atm deficiency phenotypes. Atm null mice, like ATM-deficient humans with the inherited syndrome ataxia telangiectasia, exhibit radiation sensitivity, fertility defects, and are T-cell lymphoma prone. Most double knockout mice were largely protected from lymphoma development and had a greatly extended lifespan compared to Atm null mice. Double knockout mice had increased p53 and H2AX phosphorylation and p21 expression compared to their Atm null counterparts, indicating enhanced p53 and DNA damage responses. Additionally, double knockout splenocytes displayed reduced chromosomal instability compared to Atm null mice. Finally, doubly null mice were partially rescued from infertility defects observed in Atm null mice. These results indicate that inhibition of WIP1 may represent a useful strategy for cancer treatment in general and A-T patients in particular. PMID:21765465
McArtor, Daniel B.; Lubke, Gitta H.; Bergeman, C. S.
2017-01-01
Person-centered methods are useful for studying individual differences in terms of (dis)similarities between response profiles on multivariate outcomes. Multivariate distance matrix regression (MDMR) tests the significance of associations of response profile (dis)similarities and a set of predictors using permutation tests. This paper extends MDMR by deriving and empirically validating the asymptotic null distribution of its test statistic, and by proposing an effect size for individual outcome variables, which is shown to recover true associations. These extensions alleviate the computational burden of permutation tests currently used in MDMR and render more informative results, thus making MDMR accessible to new research domains. PMID:27738957
McArtor, Daniel B; Lubke, Gitta H; Bergeman, C S
2017-12-01
Person-centered methods are useful for studying individual differences in terms of (dis)similarities between response profiles on multivariate outcomes. Multivariate distance matrix regression (MDMR) tests the significance of associations of response profile (dis)similarities and a set of predictors using permutation tests. This paper extends MDMR by deriving and empirically validating the asymptotic null distribution of its test statistic, and by proposing an effect size for individual outcome variables, which is shown to recover true associations. These extensions alleviate the computational burden of permutation tests currently used in MDMR and render more informative results, thus making MDMR accessible to new research domains.
DETECTING UNSPECIFIED STRUCTURE IN LOW-COUNT IMAGES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Nathan M.; Dyk, David A. van; Kashyap, Vinay L.
Unexpected structure in images of astronomical sources often presents itself upon visual inspection of the image, but such apparent structure may either correspond to true features in the source or be due to noise in the data. This paper presents a method for testing whether inferred structure in an image with Poisson noise represents a significant departure from a baseline (null) model of the image. To infer image structure, we conduct a Bayesian analysis of a full model that uses a multiscale component to allow flexible departures from the posited null model. As a test statistic, we use a tailmore » probability of the posterior distribution under the full model. This choice of test statistic allows us to estimate a computationally efficient upper bound on a p-value that enables us to draw strong conclusions even when there are limited computational resources that can be devoted to simulations under the null model. We demonstrate the statistical performance of our method on simulated images. Applying our method to an X-ray image of the quasar 0730+257, we find significant evidence against the null model of a single point source and uniform background, lending support to the claim of an X-ray jet.« less
Cyclin A2 promotes DNA repair in the brain during both development and aging.
Gygli, Patrick E; Chang, Joshua C; Gokozan, Hamza N; Catacutan, Fay P; Schmidt, Theresa A; Kaya, Behiye; Goksel, Mustafa; Baig, Faisal S; Chen, Shannon; Griveau, Amelie; Michowski, Wojciech; Wong, Michael; Palanichamy, Kamalakannan; Sicinski, Piotr; Nelson, Randy J; Czeisler, Catherine; Otero, José J
2016-07-01
Various stem cell niches of the brain have differential requirements for Cyclin A2. Cyclin A2 loss results in marked cerebellar dysmorphia, whereas forebrain growth is retarded during early embryonic development yet achieves normal size at birth. To understand the differential requirements of distinct brain regions for Cyclin A2, we utilized neuroanatomical, transgenic mouse, and mathematical modeling techniques to generate testable hypotheses that provide insight into how Cyclin A2 loss results in compensatory forebrain growth during late embryonic development. Using unbiased measurements of the forebrain stem cell niche, we parameterized a mathematical model whereby logistic growth instructs progenitor cells as to the cell-types of their progeny. Our data was consistent with prior findings that progenitors proliferate along an auto-inhibitory growth curve. The growth retardation inCCNA2-null brains corresponded to cell cycle lengthening, imposing a developmental delay. We hypothesized that Cyclin A2 regulates DNA repair and that CCNA2-null progenitors thus experienced lengthened cell cycle. We demonstrate that CCNA2-null progenitors suffer abnormal DNA repair, and implicate Cyclin A2 in double-strand break repair. Cyclin A2's DNA repair functions are conserved among cell lines, neural progenitors, and hippocampal neurons. We further demonstrate that neuronal CCNA2 ablation results in learning and memory deficits in aged mice.
Pronoun Preferences of Children in a Language without Typical Third-Person Pronouns
ERIC Educational Resources Information Center
Iraola Azpiroz, Maialen; Santesteban, Mikel; Sorace, Antonella; Ezeizabarrena, Maria-José
2017-01-01
This study presents comprehension data from 6-7-and 8-10-year-old children as well as adults on the acceptability of null vs overt anaphoric forms (the demonstrative "hura" "that" and the quasipronoun bera "(s)he, him-/herself") in Basque, a language without true third-person pronouns. In an acceptability judgement…
Building Intuitions about Statistical Inference Based on Resampling
ERIC Educational Resources Information Center
Watson, Jane; Chance, Beth
2012-01-01
Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on…
Dougherty, Michael R; Hamovitz, Toby; Tidwell, Joe W
2016-02-01
A recent meta-analysis by Au et al. Psychonomic Bulletin & Review, 22, 366-377, (2015) reviewed the n-back training paradigm for working memory (WM) and evaluated whether (when aggregating across existing studies) there was evidence that gains obtained for training tasks transferred to gains in fluid intelligence (Gf). Their results revealed an overall effect size of g = 0.24 for the effect of n-back training on Gf. We reexamine the data through a Bayesian lens, to evaluate the relative strength of the evidence for the alternative versus null hypotheses, contingent on the type of control condition used. We find that studies using a noncontact (passive) control group strongly favor the alternative hypothesis that training leads to transfer but that studies using active-control groups show modest evidence in favor of the null. We discuss these findings in the context of placebo effects.
Parsons, Brendon A; Marney, Luke C; Siegler, W Christopher; Hoggard, Jamin C; Wright, Bob W; Synovec, Robert E
2015-04-07
Comprehensive two-dimensional (2D) gas chromatography coupled with time-of-flight mass spectrometry (GC × GC-TOFMS) is a versatile instrumental platform capable of collecting highly informative, yet highly complex, chemical data for a variety of samples. Fisher-ratio (F-ratio) analysis applied to the supervised comparison of sample classes algorithmically reduces complex GC × GC-TOFMS data sets to find class distinguishing chemical features. F-ratio analysis, using a tile-based algorithm, significantly reduces the adverse effects of chromatographic misalignment and spurious covariance of the detected signal, enhancing the discovery of true positives while simultaneously reducing the likelihood of detecting false positives. Herein, we report a study using tile-based F-ratio analysis whereby four non-native analytes were spiked into diesel fuel at several concentrations ranging from 0 to 100 ppm. Spike level comparisons were performed in two regimes: comparing the spiked samples to the nonspiked fuel matrix and to each other at relative concentration factors of two. Redundant hits were algorithmically removed by refocusing the tiled results onto the original high resolution pixel level data. To objectively limit the tile-based F-ratio results to only features which are statistically likely to be true positives, we developed a combinatorial technique using null class comparisons, called null distribution analysis, by which we determined a statistically defensible F-ratio cutoff for the analysis of the hit list. After applying null distribution analysis, spiked analytes were reliably discovered at ∼1 to ∼10 ppm (∼5 to ∼50 pg using a 200:1 split), depending upon the degree of mass spectral selectivity and 2D chromatographic resolution, with minimal occurrence of false positives. To place the relevance of this work among other methods in this field, results are compared to those for pixel and peak table-based approaches.
SANABRIA, FEDERICO; KILLEEN, PETER R.
2008-01-01
Despite being under challenge for the past 50 years, null hypothesis significance testing (NHST) remains dominant in the scientific field for want of viable alternatives. NHST, along with its significance level p, is inadequate for most of the uses to which it is put, a flaw that is of particular interest to educational practitioners who too often must use it to sanctify their research. In this article, we review the failure of NHST and propose prep, the probability of replicating an effect, as a more useful statistic for evaluating research and aiding practical decision making. PMID:19122766
Peripheral Frequency of CD4+ CD28− Cells in Acute Ischemic Stroke
Tuttolomondo, Antonino; Pecoraro, Rosaria; Casuccio, Alessandra; Di Raimondo, Domenico; Buttà, Carmelo; Clemente, Giuseppe; Corte, Vittoriano della; Guggino, Giuliana; Arnao, Valentina; Maida, Carlo; Simonetta, Irene; Maugeri, Rosario; Squatrito, Rosario; Pinto, Antonio
2015-01-01
Abstract CD4+ CD28− T cells also called CD28 null cells have been reported as increased in the clinical setting of acute coronary syndrome. Only 2 studies previously analyzed peripheral frequency of CD28 null cells in subjects with acute ischemic stroke but, to our knowledge, peripheral frequency of CD28 null cells in each TOAST subtype of ischemic stroke has never been evaluated. We hypothesized that CD4+ cells and, in particular, the CD28 null cell subset could show a different degree of peripheral percentage in subjects with acute ischemic stroke in relation to clinical subtype and severity of ischemic stroke. The aim of our study was to analyze peripheral frequency of CD28 null cells in subjects with acute ischemic stroke in relation to TOAST diagnostic subtype, and to evaluate their relationship with scores of clinical severity of acute ischemic stroke, and their predictive role in the diagnosis of acute ischemic stroke and diagnostic subtype We enrolled 98 consecutive subjects admitted to our recruitment wards with a diagnosis of ischemic stroke. As controls we enrolled 66 hospitalized patients without a diagnosis of acute ischemic stroke. Peripheral frequency of CD4+ and CD28 null cells has been evaluated with a FACS Calibur flow cytometer. Subjects with acute ischemic stroke had a significantly higher peripheral frequency of CD4+ cells and CD28 null cells compared to control subjects without acute ischemic stroke. Subjects with cardioembolic stroke had a significantly higher peripheral frequency of CD4+ cells and CD28 null cells compared to subjects with other TOAST subtypes. We observed a significant relationship between CD28 null cells peripheral percentage and Scandinavian Stroke Scale and NIHSS scores. ROC curve analysis showed that CD28 null cell percentage may be useful to differentiate between stroke subtypes. These findings seem suggest a possible role for a T-cell component also in acute ischemic stroke clinical setting showing a different peripheral frequency of CD28 null cells in relation of each TOAST subtype of stroke. PMID:25997053
Effects of Gum Chewing on Appetite and Digestion
2013-05-28
The Null Hypothesis is That Food Rheology Will Have no Effect on These Indices.; The Alternate Hypothesis is That Increased Mechanical Stimulation Will Result in Stronger Satiation/Satiety and Reduced Energy Intake.; Further, it is Hypothesized That the Effects of Mastication Will be Less Evident in Obese Compared to Lean Individuals.
Shi, Haolun; Yin, Guosheng
2018-02-21
Simon's two-stage design is one of the most commonly used methods in phase II clinical trials with binary endpoints. The design tests the null hypothesis that the response rate is less than an uninteresting level, versus the alternative hypothesis that the response rate is greater than a desirable target level. From a Bayesian perspective, we compute the posterior probabilities of the null and alternative hypotheses given that a promising result is declared in Simon's design. Our study reveals that because the frequentist hypothesis testing framework places its focus on the null hypothesis, a potentially efficacious treatment identified by rejecting the null under Simon's design could have only less than 10% posterior probability of attaining the desirable target level. Due to the indifference region between the null and alternative, rejecting the null does not necessarily mean that the drug achieves the desirable response level. To clarify such ambiguity, we propose a Bayesian enhancement two-stage (BET) design, which guarantees a high posterior probability of the response rate reaching the target level, while allowing for early termination and sample size saving in case that the drug's response rate is smaller than the clinically uninteresting level. Moreover, the BET design can be naturally adapted to accommodate survival endpoints. We conduct extensive simulation studies to examine the empirical performance of our design and present two trial examples as applications. © 2018, The International Biometric Society.
A Hands-On Exercise Improves Understanding of the Standard Error of the Mean
ERIC Educational Resources Information Center
Ryan, Robert S.
2006-01-01
One of the most difficult concepts for statistics students is the standard error of the mean. To improve understanding of this concept, 1 group of students used a hands-on procedure to sample from small populations representing either a true or false null hypothesis. The distribution of 120 sample means (n = 3) from each population had standard…
Weinheimer-Haus, Eileen M.; Mirza, Rita E.; Koh, Timothy J.
2015-01-01
The Nod-like receptor protein (NLRP)-3 inflammasome/IL-1β pathway is involved in the pathogenesis of various inflammatory skin diseases, but its biological role in wound healing remains to be elucidated. Since inflammation is typically thought to impede healing, we hypothesized that loss of NLRP-3 activity would result in a downregulated inflammatory response and accelerated wound healing. NLRP-3 null mice, caspase-1 null mice and C57Bl/6 wild type control mice (WT) received four 8 mm excisional cutaneous wounds; inflammation and healing were assessed during the early stage of wound healing. Consistent with our hypothesis, wounds from NLRP-3 null and caspase-1 null mice contained lower levels of the pro-inflammatory cytokines IL-1β and TNF-α compared to WT mice and had reduced neutrophil and macrophage accumulation. Contrary to our hypothesis, re-epithelialization, granulation tissue formation, and angiogenesis were delayed in NLRP-3 null mice and caspase-1 null mice compared to WT mice, indicating that NLRP-3 signaling is important for early events in wound healing. Topical treatment of excisional wounds with recombinant IL-1β partially restored granulation tissue formation in wounds of NLRP-3 null mice, confirming the importance of NLRP-3-dependent IL-1β production during early wound healing. Despite the improvement in healing, angiogenesis and levels of the pro-angiogenic growth factor VEGF were further reduced in IL-1β treated wounds, suggesting that IL-1β has a negative effect on angiogenesis and that NLRP-3 promotes angiogenesis in an IL-1β-independent manner. These findings indicate that the NLRP-3 inflammasome contributes to the early inflammatory phase following skin wounding and is important for efficient healing. PMID:25793779
van Reenen, Mari; Westerhuis, Johan A; Reinecke, Carolus J; Venter, J Hendrik
2017-02-02
ERp is a variable selection and classification method for metabolomics data. ERp uses minimized classification error rates, based on data from a control and experimental group, to test the null hypothesis of no difference between the distributions of variables over the two groups. If the associated p-values are significant they indicate discriminatory variables (i.e. informative metabolites). The p-values are calculated assuming a common continuous strictly increasing cumulative distribution under the null hypothesis. This assumption is violated when zero-valued observations can occur with positive probability, a characteristic of GC-MS metabolomics data, disqualifying ERp in this context. This paper extends ERp to address two sources of zero-valued observations: (i) zeros reflecting the complete absence of a metabolite from a sample (true zeros); and (ii) zeros reflecting a measurement below the detection limit. This is achieved by allowing the null cumulative distribution function to take the form of a mixture between a jump at zero and a continuous strictly increasing function. The extended ERp approach is referred to as XERp. XERp is no longer non-parametric, but its null distributions depend only on one parameter, the true proportion of zeros. Under the null hypothesis this parameter can be estimated by the proportion of zeros in the available data. XERp is shown to perform well with regard to bias and power. To demonstrate the utility of XERp, it is applied to GC-MS data from a metabolomics study on tuberculosis meningitis in infants and children. We find that XERp is able to provide an informative shortlist of discriminatory variables, while attaining satisfactory classification accuracy for new subjects in a leave-one-out cross-validation context. XERp takes into account the distributional structure of data with a probability mass at zero without requiring any knowledge of the detection limit of the metabolomics platform. XERp is able to identify variables that discriminate between two groups by simultaneously extracting information from the difference in the proportion of zeros and shifts in the distributions of the non-zero observations. XERp uses simple rules to classify new subjects and a weight pair to adjust for unequal sample sizes or sensitivity and specificity requirements.
Facebook, Twitter Activities Sites, Location and Students' Interest in Learning
ERIC Educational Resources Information Center
Igbo, J. N.; Ezenwaji, Ifeyinwa; Ajuziogu, Christiana U.
2018-01-01
This study was carried out to ascertain the influence of social networking sites activities (twitter and Facebook) on secondary school students' interest in learning It also considered the impact of these social networking sites activities on location of the students. Two research questions and two null hypotheses guided the study. Mean and…
Effects of Graphic Organiser on Students' Achievement in Algebraic Word Problems
ERIC Educational Resources Information Center
Owolabi, Josiah; Adaramati, Tobiloba Faith
2015-01-01
This study investigated the effects of graphic organiser and gender on students' academic achievement in algebraic word problem. Three research questions and three null hypotheses were used in guiding this study. Quasi experimental research was employed and Non-equivalent pre and post test design was used. The study involved the Senior Secondary…
Anxiety and Self-Concept Among American and Chinese College Students
ERIC Educational Resources Information Center
Paschal, Billy J.; You-Yuh, Kuo
1973-01-01
In this study, 60 pairs of Ss were randomly selected and individually matched on age, sex, grade equivalence, and birth order. The seven null hypotheses dealt with culture, sex, birth order, and their interactions. The main self-rating scales employed were the IPAT Anxiety Scale and the Tennessee Self Concept Scale. (Author/EK)
Determinants of Effective and Ineffective Supervision in Schools: Teachers Perspectives
ERIC Educational Resources Information Center
Oghuvbu, Enamiroro Patrick
2007-01-01
This study identified determinants of effective and ineffective supervision in schools. A forty-two items questionnaire was administered on 1150 teachers used in this study. Two research questions were raised and answered using percentages. Two null hypotheses were formulated and tested using spearman rho and z-test statistics at 0.05 level of…
The Effects of Physical Environment on Children's Behavior in the Classroom.
ERIC Educational Resources Information Center
Gingold, William
No significant difference of student-concrete physical environment interaction occurred with a change in physical environment. A test was made on five null hypotheses related to the change of physical environment and (1) student-concrete physical environment interaction; (2) environmental preference by students; (3) student attending behavior; (4)…
ERIC Educational Resources Information Center
Oleforo, Ngozika A.; Oko, Dominic Edema; Akpan, Eno G.
2013-01-01
Entrepreneurial training programme has to do with acquiring relevant skills in which an individual has to be sensitized, motivated and guided to achieve self-reliance and self employment. The paper examined the relevance of entrepreneurial training programme in the universities to graduates' productivity. Three null hypotheses were formulated. A…
The Same or Not the Same: Equivalence as an Issue in Educational Research
NASA Astrophysics Data System (ADS)
Lewis, Scott E.; Lewis, Jennifer E.
2005-09-01
In educational research, particularly in the sciences, a common research design calls for the establishment of a control and experimental group to determine the effectiveness of an intervention. As part of this design, it is often desirable to illustrate that the two groups were equivalent at the start of the intervention, based on measures such as standardized cognitive tests or student grades in prior courses. In this article we use SAT and ACT scores to illustrate a more robust way of testing equivalence. The method incorporates two one-sided t tests evaluating two null hypotheses, providing a stronger claim for equivalence than the standard method, which often does not address the possible problem of low statistical power. The two null hypotheses are based on the construction of an equivalence interval particular to the data, so the article also provides a rationale for and illustration of a procedure for constructing equivalence intervals. Our consideration of equivalence using this method also underscores the need to include sample sizes, standard deviations, and group means in published quantitative studies.
Testing for nonlinearity in time series: The method of surrogate data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theiler, J.; Galdrikian, B.; Longtin, A.
1991-01-01
We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of surrogate'' data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against themore » null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.« less
The OmpL porin does not modulate redox potential in the periplasmic space of Escherichia coli.
Sardesai, Abhijit A; Genevaux, Pierre; Schwager, Françoise; Ang, Debbie; Georgopoulos, Costa
2003-04-01
The Escherichia coli DsbA protein is the major oxidative catalyst in the periplasm. Dartigalongue et al. (EMBO J., 19, 5980-5988, 2000) reported that null mutations in the ompL gene of E.coli fully suppress all phenotypes associated with dsbA mutants, i.e. sensitivity to the reducing agent dithiothreitol (DTT) and the antibiotic benzylpenicillin, lack of motility, reduced alkaline phosphatase activity and mucoidy. They showed that OmpL is a porin and hypothesized that ompL null mutations exert their suppressive effect by preventing efflux of a putative oxidizing-reducing compound into the medium. We have repeated these experiments using two different ompL null alleles in at least three different E.coli K-12 genetic backgrounds and have failed to reproduce any of the ompL suppressive effects noted above. Also, we show that, contrary to earlier results, ompL null mutations alone do not result in partial DTT sensitivity or partial motility, nor do they appreciably affect bacterial growth rates or block propagation of the male-specific bacteriophage M13. Thus, our findings clearly demonstrate that ompL plays no perceptible role in modulating redox potential in the periplasm of E.coli.
Multi-arm group sequential designs with a simultaneous stopping rule.
Urach, S; Posch, M
2016-12-30
Multi-arm group sequential clinical trials are efficient designs to compare multiple treatments to a control. They allow one to test for treatment effects already in interim analyses and can have a lower average sample number than fixed sample designs. Their operating characteristics depend on the stopping rule: We consider simultaneous stopping, where the whole trial is stopped as soon as for any of the arms the null hypothesis of no treatment effect can be rejected, and separate stopping, where only recruitment to arms for which a significant treatment effect could be demonstrated is stopped, but the other arms are continued. For both stopping rules, the family-wise error rate can be controlled by the closed testing procedure applied to group sequential tests of intersection and elementary hypotheses. The group sequential boundaries for the separate stopping rule also control the family-wise error rate if the simultaneous stopping rule is applied. However, we show that for the simultaneous stopping rule, one can apply improved, less conservative stopping boundaries for local tests of elementary hypotheses. We derive corresponding improved Pocock and O'Brien type boundaries as well as optimized boundaries to maximize the power or average sample number and investigate the operating characteristics and small sample properties of the resulting designs. To control the power to reject at least one null hypothesis, the simultaneous stopping rule requires a lower average sample number than the separate stopping rule. This comes at the cost of a lower power to reject all null hypotheses. Some of this loss in power can be regained by applying the improved stopping boundaries for the simultaneous stopping rule. The procedures are illustrated with clinical trials in systemic sclerosis and narcolepsy. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo
Herckenrath, Daan; Langevin, Christian D.; Doherty, John
2011-01-01
Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of generating calibration-constrained parameter fields approximately doubled. Predictive uncertainty variance computed through the NSMC method was compared with that computed through linear analysis. The results were in good agreement, with the NSMC method estimate showing a slightly smaller range of prediction uncertainty than was calculated by the linear method. Copyright 2011 by the American Geophysical Union.
Organic Anion Transporting Polypeptide 1a1 Null Mice Are Sensitive to Cholestatic Liver Injury
Zhang, Youcai; Csanaky, Iván L.; Cheng, Xingguo; Lehman-McKeeman, Lois D.; Klaassen, Curtis D.
2012-01-01
Organic anion transporting polypeptide 1a1 (Oatp1a1) is predominantly expressed in livers of mice and is thought to transport bile acids (BAs) from blood into liver. Because Oatp1a1 expression is markedly decreased in mice after bile duct ligation (BDL). We hypothesized that Oatp1a1-null mice would be protected against liver injury during BDL-induced cholestasis due largely to reduced hepatic uptake of BAs. To evaluate this hypothesis, BDL surgeries were performed in both male wild-type (WT) and Oatp1a1-null mice. At 24 h after BDL, Oatp1a1-null mice showed higher serum alanine aminotransferase levels and more severe liver injury than WT mice, and all Oatp1a1-null mice died within 4 days after BDL, whereas all WT mice survived. At 24 h after BDL, surprisingly Oatp1a1-null mice had higher total BA concentrations in livers than WT mice, suggesting that loss of Oatp1a1 did not prevent BA accumulation in the liver. In addition, secondary BAs dramatically increased in serum of Oatp1a1-null BDL mice but not in WT BDL mice. Oatp1a1-null BDL mice had similar basolateral BA uptake (Na+-taurocholate cotransporting polypeptide and Oatp1b2) and BA-efflux (multidrug resistance–associated protein [Mrp]-3, Mrp4, and organic solute transporter α/β) transporters, as well as BA-synthetic enzyme (Cyp7a1) in livers as WT BDL mice. Hepatic expression of small heterodimer partner Cyp3a11, Cyp4a14, and Nqo1, which are target genes of farnesoid X receptor, pregnane X receptor, peroxisome proliferator-activated receptor alpha, and NF-E2-related factor 2, respectively, were increased in WT BDL mice but not in Oatp1a1-null BDL mice. These results demonstrate that loss of Oatp1a1 function exacerbates cholestatic liver injury in mice and suggest that Oatp1a1 plays a unique role in liver adaptive responses to obstructive cholestasis. PMID:22461449
Dynamic Pedagogy for Effective Training of Youths in Cell Phone Maintenance
ERIC Educational Resources Information Center
Ogbuanya, T. C.; Jimoh, Bakare
2015-01-01
The study determined dynamic pedagogies for effective training of youths in cell phone maintenance. The study was conducted in Enugu State of Nigeria. Four research questions were developed while four null hypotheses formulated were tested at 0.05 level of significance. A survey research design was adopted for the study. The population for the…
ERIC Educational Resources Information Center
Uchendu, Chika C.; Nwafor, Innocent A.; Nwaneri, Mary G.
2015-01-01
The study investigated marketing strategies and students' enrolment in private secondary schools in Calabar Municipality, Cross River State. One research question was raised and two null hypotheses formulated to guide the study. Thirty two (32) school administrators in 32 private secondary schools in the study area constitute the study population…
Undergraduates' Attitude towards the Use of Social Media for Learning Purposes
ERIC Educational Resources Information Center
Williams, Cheta; Adesope, Rebecca Yinka
2017-01-01
The study investigated Undergraduates' attitude towards the use of social media for learning purposes. It was conducted at the University of Port Harcourt, Rivers State, Nigeria. Two objectives and two null hypotheses was used to investigate the study. The population used were Undergraduate students from three faculties at the University of Port…
AN EXPERIMENTAL STUDY UTILIZING CLOSED-CIRCUIT TELEVISION IN THE TEACHING OF DENTAL TECHNIQUES.
ERIC Educational Resources Information Center
MORRISON, ARTHUR H.
CLOSED CIRCUIT TELEVISION WAS WELL RECEIVED BY DENTISTRY STUDENTS AT NEW YORK UNIVERSITY BUT FAILED TO YIELD SIGNIFICANT GAINS IN ACHIEVEMENT OVER CONVENTIONAL INSTRUCTION. TWENTY-ONE NULL HYPOTHESES WERE TESTED ON 154 MALE SOPHOMORE STUDENTS, WHO WERE DIVIDED INTO GWO GROUPS, HALF BEING INSTRUCTED TO A LARGE EXTENT VIA CCTV, TV CLASS, AND HALF…
ERIC Educational Resources Information Center
Negari, Giti Mousapour; Azizi, Aliye; Arani, Davood Khedmatkar
2018-01-01
The present study attempted to investigate the effects of audio input enhancement on EFL learners' retention of intensifiers. To this end, two research questions were formulated. In order to address these research questions, this study attempted to reject two null hypotheses. Pretest-posttest control group quasi-experimental design was employed to…
Impact of Self-Correction on Extrovert and Introvert Students in EFL Writing Progress
ERIC Educational Resources Information Center
Hajimohammadi, Reza; Makundan, Jayakaran
2011-01-01
To investigate the impact of self-correction method as an alternative to the traditional teacher-correction method, on the one side, and to evaluate the impact of personality traits of Extroversion/Introversion, on the other side, on the writing progress of the pre-intermediate learners three null-hypotheses were proposed. In spite of students…
ERIC Educational Resources Information Center
Loyce, Onyali Chiedozie; Victor, Akinfolarin Akinwale
2017-01-01
This study ascertained the principals' application of instructional leadership practices for secondary school effectiveness in Oyo State. Two research questions guided the study and two null hypotheses were tested. The descriptive survey research design was adopted for the study. The population of the study comprised 8,701 which were made of 969…
Principals' Provision of Incentive for Secondary Schools' Improvement in Oyo State
ERIC Educational Resources Information Center
Onyali, Loyce Chiedozie; Victor, Akinfolarin Akinwale
2017-01-01
This study ascertained the principals' provision of incentive for secondary schools' improvement in Oyo State. Two research questions guided the study and two null hypotheses were tested. The descriptive survey research design was adopted for the study. The population of the study was 8,701 which comprised 969 principals and 7,732 teachers in…
Default Bayes Factors for Model Selection in Regression
ERIC Educational Resources Information Center
Rouder, Jeffrey N.; Morey, Richard D.
2012-01-01
In this article, we present a Bayes factor solution for inference in multiple regression. Bayes factors are principled measures of the relative evidence from data for various models or positions, including models that embed null hypotheses. In this regard, they may be used to state positive evidence for a lack of an effect, which is not possible…
What is too much variation? The null hypothesis in small-area analysis.
Diehr, P; Cain, K; Connell, F; Volinn, E
1990-01-01
A small-area analysis (SAA) in health services research often calculates surgery rates for several small areas, compares the largest rate to the smallest, notes that the difference is large, and attempts to explain this discrepancy as a function of service availability, physician practice styles, or other factors. SAAs are often difficult to interpret because there is little theoretical basis for determining how much variation would be expected under the null hypothesis that all of the small areas have similar underlying surgery rates and that the observed variation is due to chance. We developed a computer program to simulate the distribution of several commonly used descriptive statistics under the null hypothesis, and used it to examine the variability in rates among the counties of the state of Washington. The expected variability when the null hypothesis is true is surprisingly large, and becomes worse for procedures with low incidence, for smaller populations, when there is variability among the populations of the counties, and when readmissions are possible. The characteristics of four descriptive statistics were studied and compared. None was uniformly good, but the chi-square statistic had better performance than the others. When we reanalyzed five journal articles that presented sufficient data, the results were usually statistically significant. Since SAA research today is tending to deal with low-incidence events, smaller populations, and measures where readmissions are possible, more research is needed on the distribution of small-area statistics under the null hypothesis. New standards are proposed for the presentation of SAA results. PMID:2312306
What is too much variation? The null hypothesis in small-area analysis.
Diehr, P; Cain, K; Connell, F; Volinn, E
1990-02-01
A small-area analysis (SAA) in health services research often calculates surgery rates for several small areas, compares the largest rate to the smallest, notes that the difference is large, and attempts to explain this discrepancy as a function of service availability, physician practice styles, or other factors. SAAs are often difficult to interpret because there is little theoretical basis for determining how much variation would be expected under the null hypothesis that all of the small areas have similar underlying surgery rates and that the observed variation is due to chance. We developed a computer program to simulate the distribution of several commonly used descriptive statistics under the null hypothesis, and used it to examine the variability in rates among the counties of the state of Washington. The expected variability when the null hypothesis is true is surprisingly large, and becomes worse for procedures with low incidence, for smaller populations, when there is variability among the populations of the counties, and when readmissions are possible. The characteristics of four descriptive statistics were studied and compared. None was uniformly good, but the chi-square statistic had better performance than the others. When we reanalyzed five journal articles that presented sufficient data, the results were usually statistically significant. Since SAA research today is tending to deal with low-incidence events, smaller populations, and measures where readmissions are possible, more research is needed on the distribution of small-area statistics under the null hypothesis. New standards are proposed for the presentation of SAA results.
Frömke, Cornelia; Hothorn, Ludwig A; Kropf, Siegfried
2008-01-27
In many research areas it is necessary to find differences between treatment groups with several variables. For example, studies of microarray data seek to find a significant difference in location parameters from zero or one for ratios thereof for each variable. However, in some studies a significant deviation of the difference in locations from zero (or 1 in terms of the ratio) is biologically meaningless. A relevant difference or ratio is sought in such cases. This article addresses the use of relevance-shifted tests on ratios for a multivariate parallel two-sample group design. Two empirical procedures are proposed which embed the relevance-shifted test on ratios. As both procedures test a hypothesis for each variable, the resulting multiple testing problem has to be considered. Hence, the procedures include a multiplicity correction. Both procedures are extensions of available procedures for point null hypotheses achieving exact control of the familywise error rate. Whereas the shift of the null hypothesis alone would give straight-forward solutions, the problems that are the reason for the empirical considerations discussed here arise by the fact that the shift is considered in both directions and the whole parameter space in between these two limits has to be accepted as null hypothesis. The first algorithm to be discussed uses a permutation algorithm, and is appropriate for designs with a moderately large number of observations. However, many experiments have limited sample sizes. Then the second procedure might be more appropriate, where multiplicity is corrected according to a concept of data-driven order of hypotheses.
Reduced infarct size in neuroglobin-null mice after experimental stroke in vivo
2012-01-01
Background Neuroglobin is considered to be a novel important pharmacological target in combating stroke and neurodegenerative disorders, although the mechanism by which this protection is accomplished remains an enigma. We hypothesized that if neuroglobin is directly involved in neuroprotection, then permanent cerebral ischemia would lead to larger infarct volumes in neuroglobin-null mice than in wild-type mice. Methods Using neuroglobin-null mice, we estimated the infarct volume 24 hours after permanent middle cerebral artery occlusion using Cavalieri’s Principle, and compared the infarct volume in neuroglobin-null and wild-type mice. Neuroglobin antibody staining was used to examine neuroglobin expression in the infarct area of wild-type mice. Results Infarct volumes 24 hours after permanent middle cerebral artery occlusion were significantly smaller in neuroglobin-null mice than in wild-types (p < 0.01). Neuroglobin immunostaining of the penumbra area revealed no visible up-regulation of neuroglobin protein in ischemic wild-type mice when compared to uninjured wild-type mice. In uninjured wild-type mice, neuroglobin protein was seen throughout cortical layer II and sparsely in layer V. In contrast, no neuroglobin-immunoreactive neurons were observed in the aforementioned layers of the ischemia injured cortical area, or in the surrounding penumbra of ischemic wild-type mice. This suggests no selective sparing of neuroglobin expressing neurons in ischemia. Conclusions Neuroglobin-deficiency resulted in reduced tissue infarction, suggesting that, at least at endogenous expression levels, neuroglobin in itself is non-protective against ischemic injury. PMID:22901501
Interleukin-6 inhibits hepatic growth hormone signaling via upregulation of Cis and Socs-3.
Denson, Lee A; Held, Matthew A; Menon, Ram K; Frank, Stuart J; Parlow, Albert F; Arnold, Dodie L
2003-04-01
Cytokines may cause an acquired growth hormone (GH) resistance in patients with inflammatory diseases. Anabolic effects of GH are mediated through activation of STAT5 transcription factors. We have reported that TNF-alpha suppresses hepatic GH receptor (GHR) gene expression, whereas the cytokine-inducible SH2-containing protein 1 (Cis)/suppressors of cytokine signaling (Socs) genes are upregulated by TNF-alpha and IL-6 and inhibit GH activation of STAT5. However, the relative importance of these mechanisms in inflammatory GH resistance was not known. We hypothesized that IL-6 would prevent GH activation of STAT5 and that this would involve Cis/Socs protein upregulation. GH +/- LPS was administered to TNF receptor 1 (TNFR1) or IL-6 null mice and wild-type (WT) controls. STAT5, STAT3, GHR, Socs 1-3, and Cis phosphorylation and abundance were assessed by using immunoblots, EMSA, and/or real time RT-PCR. TNF-alpha and IL-6 abundance were assessed by using ELISA. GH activated STAT5 in WT and TNFR1 or IL-6 null mice. LPS pretreatment prevented STAT5 activation in WT and TNFR1 null mice; however, STAT5 activation was preserved in IL-6 null mice. GHR abundance did not change with LPS administration. Inhibition of STAT5 activation by LPS was temporally associated with phosphorylation of STAT3 and upregulation of Cis and Socs-3 protein in WT and TNFR1 null mice; STAT3, Cis, and Socs-3 were not induced in IL-6 null mice. IL-6 inhibits hepatic GH signaling by upregulating Cis and Socs-3, which may involve activation of STAT3. Therapies that block IL-6 may enhance GH signaling in inflammatory diseases.
ERIC Educational Resources Information Center
Ogbu, James E.
2015-01-01
This study investigated the influences of inadequate instructional materials and facilities in the teaching and learning of electrical/electronics (E/E) technology education courses. The study was guided by two research questions and two null hypotheses which were tested at 0.05 level of significance. The design employed was descriptive survey…
ERIC Educational Resources Information Center
Okoye, Namdi N. S.
2009-01-01
The study tried to examine the interaction between two independent variables of selective attention and cognitive development on Achievement in Genetics at the Secondary School level. In looking at the problem of this study three null hypotheses were generated for testing at 0.05 level of significance. Factorial Analysis of Variance design with…
A SWOT Analysis of Male and Female Students' Performance in Chemistry: A Comparative Study
ERIC Educational Resources Information Center
Ezeudu, Florence O.; Chiaha, Gertrude-Theresa Uzoamaka; Anazor, Lynda Chioma; Eze, Justina Uzoamaka; Omeke, Faith Chinwe
2015-01-01
The purpose of this study was to do a SWOT analysis and compare performances of male and female students in chemistry. Four research questions and four null hypotheses guided the study. Two boys', two girls' and two coeducational schools involving 1319 males and 1831 females, were selected by a stratified, deliberate sampling technique. A…
The Effects of a Stress Management Course on Counselors-in-Training
ERIC Educational Resources Information Center
Abel, Holly; Abel, Annette; Smith, Robert L.
2012-01-01
The effects of a stress management course on the stress knowledge and coping techniques of 101 graduate students in counseling were examined. Participants, drawn from various racial groups, were typically female (79%) and 21 to 55 years of age. Seven of the 8 null hypotheses were rejected. There were significant differences on 6 of the 7 dependent…
The Impact Literacy Coaches Have on Mississippi's Lower-Performing Schools
ERIC Educational Resources Information Center
Trivelli-Bowen, Barbara
2017-01-01
The purpose of this study was to explore the impact literacy coaches had on Mississippi's lower-performing schools. To guide the study, the researcher developed four research questions and four null hypotheses. The population of this study was derived from a sample of Mississippi students in Grades K-3 who were administered the Early Literacy STAR…
ERIC Educational Resources Information Center
Okpube, Nnaemeka Michael; Anugwo, M. N.
2016-01-01
This study investigated the Card Games and Algebra tic-Tacmatics on Junior Secondary II Students' Achievement in Algebraic Expressions. Three research questions and three null hypotheses guided the study. The study adopted the pre-test, post-test control group design. A total of two hundred and forty (240) Junior Secondary School II students were…
ERIC Educational Resources Information Center
Iruloh, Betty-Ruth Ngozi; Wilson, Chukwu Juliet U.
2017-01-01
The study investigated the relative relationship between family distress and eating disorders among undergraduate students of University of Port Harcourt. The study was guided by three research questions and three null hypotheses to test the tenability of the independent variables on the dependent variable at 0.05 level of significance. The study…
ERIC Educational Resources Information Center
Okolocha, Chimezie Comfort; Nwadiani, Comfort Onaigho
2015-01-01
This study assessed the utilization of ICT resources in teaching among business educators in tertiary institutions in south Nigeria. Two research questions and two null hypotheses guided the study. Descriptive survey research design was adopted for the study. The population and sample for the study comprised all 240 business educators in colleges…
ERIC Educational Resources Information Center
Mumuni, Abosede Anthonia Olufemi; Dike, John Worlu; Uzoma-Nwogu, Azibaolanari
2017-01-01
This study investigated the effects of teaching trajectories on students' understanding of difficult concepts in Biology. Two research questions and two null hypotheses guided the study which was carried out in Obio/Akpor Local Government Area of Rivers State. Two public coeducational schools out of thirteen drawn through purposive sampling…
Return rates of banded granivores in relation to band color and number of bands worn
Jared Verner; Dawn Breese; Kathryn L. Purcell
1998-01-01
We tested the null hypotheses of (1) no effect of band color and (2) no effect of number of bands worn on annual recapture rates of birds on their winter range. Results are reported from four species of granivores-Spotted Towhee (Pipilo maculatus), Goldencrowned Sparrow (Zonotrichia atricapilla) , White-crowned Sparrow (2. leucophrys) , and Darkeyed Junco (Junco...
Delivery Pain Anxiety/Fear Control between Midwives among Women in Cross River State, Nigeria
ERIC Educational Resources Information Center
Oyira, Emilia James; Mgbekem, Mary; Osuchukwu, Easther Chukwudi; Affiong, Ekpenyong Onoyom; Lukpata, Felicia E.; Ojong-Alasia, Mary Manyo
2016-01-01
Objective: To examine background of midwives the effectiveness in delivery pain and anxiety/fear control of expectant mothers in Nigeria. Methods: Two null hypotheses were formulated. The survey design with sample of 360 post-natal women was selected from a population of 78,814 through the polio immunization registers of selected health center in…
ERIC Educational Resources Information Center
Oyira, Emilia James; Emon, Umoe Duke; Essien, N. C.; Ekpenyong, Affiong Onoyom
2015-01-01
This study sought to investigate western and traditional educational background of midwives with regard to their effectiveness in delivery pain control in Cross River State-Nigeria. To achieve this purpose, two null hypotheses were formulated to guide the investigation. The study adopted the survey design. The sample consisted of 360 post-natal…
ERIC Educational Resources Information Center
Ndem, Joseph; Ogba, Ernest; Egbe, Benjamin
2015-01-01
This study was designed to assess the agricultural engineering knowledge and competencies acquired by the senior secondary students for farm mechanization in technical colleges in Ebonyi state of Nigeria. A survey research design was adopted for the study. Three research questions and two null hypotheses guided the study. The population of the…
Globigerinoides ruber morphotypes in the Gulf of Mexico: a test of null hypothesis
Thirumalai, Kaustubh; Richey, Julie N.; Quinn, Terrence M.; Poore, Richard Z.
2014-01-01
Planktic foraminifer Globigerinoides ruber (G. ruber), due to its abundance and ubiquity in the tropical/subtropical mixed layer, has been the workhorse of paleoceanographic studies investigating past sea-surface conditions on a range of timescales. Recent geochemical work on the two principal white G. ruber (W) morphotypes, sensu stricto (ss) and sensu lato (sl), has hypothesized differences in seasonal preferences or calcification depths, implying that reconstructions using a non-selective mixture of morphotypes could potentially be biased. Here, we test these hypotheses by performing stable isotope and abundance measurements on the two morphotypes in sediment trap, core-top, and downcore samples from the northern Gulf of Mexico. As a test of null hypothesis, we perform the same analyses on couplets of G. ruber (W) specimens with attributes intermediate to the holotypic ss and sl morphologies. We find no systematic or significant offsets in coeval ss-sl δ18O, and δ13C. These offsets are no larger than those in the intermediate pairs. Coupling our results with foraminiferal statistical model INFAUNAL, we find that contrary to previous work elsewhere, there is no evidence for discrepancies in ss-sl calcifying depth habitat or seasonality in the Gulf of Mexico.
Collins, Carol M.; Ellis, Joseph A.
2017-01-01
ABSTRACT Mutations in the gene encoding emerin cause Emery–Dreifuss muscular dystrophy (EDMD). Emerin is an integral inner nuclear membrane protein and a component of the nuclear lamina. EDMD is characterized by skeletal muscle wasting, cardiac conduction defects and tendon contractures. The failure to regenerate skeletal muscle is predicted to contribute to the skeletal muscle pathology of EDMD. We hypothesize that muscle regeneration defects are caused by impaired muscle stem cell differentiation. Myogenic progenitors derived from emerin-null mice were used to confirm their impaired differentiation and analyze selected myogenic molecular pathways. Emerin-null progenitors were delayed in their cell cycle exit, had decreased myosin heavy chain (MyHC) expression and formed fewer myotubes. Emerin binds to and activates histone deacetylase 3 (HDAC3). Here, we show that theophylline, an HDAC3-specific activator, improved myotube formation in emerin-null cells. Addition of the HDAC3-specific inhibitor RGFP966 blocked myotube formation and MyHC expression in wild-type and emerin-null myogenic progenitors, but did not affect cell cycle exit. Downregulation of emerin was previously shown to affect the p38 MAPK and ERK/MAPK pathways in C2C12 myoblast differentiation. Using a pure population of myogenic progenitors completely lacking emerin expression, we show that these pathways are also disrupted. ERK inhibition improved MyHC expression in emerin-null cells, but failed to rescue myotube formation or cell cycle exit. Inhibition of p38 MAPK prevented differentiation in both wild-type and emerin-null progenitors. These results show that each of these molecular pathways specifically regulates a particular stage of myogenic differentiation in an emerin-dependent manner. Thus, pharmacological targeting of multiple pathways acting at specific differentiation stages may be a better therapeutic approach in the future to rescue muscle regeneration in vivo. PMID:28188262
Statistical modeling, detection, and segmentation of stains in digitized fabric images
NASA Astrophysics Data System (ADS)
Gururajan, Arunkumar; Sari-Sarraf, Hamed; Hequet, Eric F.
2007-02-01
This paper will describe a novel and automated system based on a computer vision approach, for objective evaluation of stain release on cotton fabrics. Digitized color images of the stained fabrics are obtained, and the pixel values in the color and intensity planes of these images are probabilistically modeled as a Gaussian Mixture Model (GMM). Stain detection is posed as a decision theoretic problem, where the null hypothesis corresponds to absence of a stain. The null hypothesis and the alternate hypothesis mathematically translate into a first order GMM and a second order GMM respectively. The parameters of the GMM are estimated using a modified Expectation-Maximization (EM) algorithm. Minimum Description Length (MDL) is then used as the test statistic to decide the verity of the null hypothesis. The stain is then segmented by a decision rule based on the probability map generated by the EM algorithm. The proposed approach was tested on a dataset of 48 fabric images soiled with stains of ketchup, corn oil, mustard, ragu sauce, revlon makeup and grape juice. The decision theoretic part of the algorithm produced a correct detection rate (true positive) of 93% and a false alarm rate of 5% on these set of images.
Dedeoglu, Burç; Meijers, Ruud W. J.; Klepper, Mariska; Hesselink, Dennis A.; Baan, Carla C.; Litjens, Nicolle H. R.; Betjes, Michiel G. H.
2016-01-01
Background End-stage renal disease patients have a dysfunctional, prematurely aged peripheral T-cell system. Here we hypothesized that the degree of premature T-cell ageing before kidney transplantation predicts the risk for early acute allograft rejection (EAR). Methods 222 living donor kidney transplant recipients were prospectively analyzed. EAR was defined as biopsy proven acute allograft rejection within 3 months after kidney transplantation. The differentiation status of circulating T cells, the relative telomere length and the number of CD31+ naive T cells were determined as T-cell ageing parameters. Results Of the 222 patients analyzed, 30 (14%) developed an EAR. The donor age and the historical panel reactive antibody score were significantly higher (p = 0.024 and p = 0.039 respectively) and the number of related donor kidney transplantation was significantly lower (p = 0.018) in the EAR group. EAR-patients showed lower CD4+CD28null T-cell numbers (p<0.01) and the same trend was observed for CD8+CD28null T-cell numbers (p = 0.08). No differences regarding the other ageing parameters were found. A multivariate Cox regression analysis showed that higher CD4+CD28null T-cell numbers was associated with a lower risk for EAR (HR: 0.65, p = 0.028). In vitro, a significant lower percentage of alloreactive T cells was observed within CD28null T cells (p<0.001). Conclusion Immunological ageing-related expansion of highly differentiated CD28null T cells is associated with a lower risk for EAR. PMID:26950734
Dependence of paranodal junctional gap width on transverse bands.
Rosenbluth, Jack; Petzold, Chris; Peles, Elior
2012-08-15
Mouse mutants with paranodal junctional (PNJ) defects display variable degrees of neurological impairment. In this study we compare control paranodes with those from three mouse mutants that differ with respect to a conspicuous PNJ component, the transverse bands (TBs). We hypothesize that TBs link the apposed junctional membranes together at a fixed distance and thereby determine the width of the junctional gap, which may in turn determine the extent to which nodal action currents can be short-circuited underneath the myelin sheath. Electron micrographs of aldehyde-fixed control PNJs, in which TBs are abundant, show a consistent junctional gap of ∼3.5 nm. In Caspr-null PNJs, which lack TBs entirely, the gap is wider (∼6-7 nm) and more variable. In CST-null PNJs, which have only occasional TBs, the mean PNJ gap width is comparable to that in Caspr-null mice. In the shaking mutant, in contrast, which has approximately 60% of the normal complement of TBs, mean PNJ gap width is not significantly different from that in controls. Correspondingly, shaking mice are much less impaired neurologically than either Caspr-null or CST-null mice. We conclude that in the absence or gross diminution of TBs, mean PNJ gap width increases significantly and suggest that this difference could underlie some of the neurological impairment seen in those mutants. Surprisingly, even in the absence of TBs, paranodes are to some extent maintained in their usual form, implying that in addition to TBs, other factors govern the formation and maintenance of overall paranodal structure. Copyright © 2012 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Oboegbulem, Angie; Ugwu, Rita N.
2013-01-01
This study aimed at identifying the role of ICT (information and communication technology) in school administration and the extent of its application by secondary school principals in administration. To guide this study, two research questions were answered and two null hypotheses were tested. The design of the study was a descriptive survey…
ERIC Educational Resources Information Center
Sule, Mary Anike; Eyiene, Ameh; Egbai, Mercy E.
2015-01-01
The study investigated the relationship between instructional supervisory practices and teachers' role effectiveness in public secondary schools in Calabar South Local Government Area of Cross River State. Two null hypotheses were formulated to guide the study. Ex-post facto research design was adopted for the study. The population of the study…
ERIC Educational Resources Information Center
Audu, Amos; Ali, Domiya G.; Pur, Hamsatu J.
2017-01-01
The study investigated the effect of group counselling on attitude of senior secondary school students' towards schooling in Federal Government College, Maiduguri, Borno State, Nigeria. Two objectives were stated and two null hypotheses were formulated and tested at 0.05 level of significance. Experimental design was used for the study. The target…
ERIC Educational Resources Information Center
GOODWIN, WILLIAM L.; AND OTHERS
NULL HYPOTHESES WERE TESTED TO DETERMINE THE DIFFERENTIAL EFFECTS OF (1) EXPERIMENTAL ATMOSPHERE AND ABSENCE OF SAME, (2) NOTICE OF TEST (10 SCHOOL DAYS) AND NO NOTICE (1 SCHOOL DAY), (3) TEACHER ADMINISTRATION AND OUTSIDE ADMINISTRATION OF TESTS, AND (4) TEACHER SCORING AND OUTSIDE SCORING OF TESTS. SIXTH-GRADE CLASSES (N=64), EACH FROM A…
ERIC Educational Resources Information Center
Ogbe, Joseph O.
2010-01-01
The purpose of this study was to stimulate action to address and identify maternal, child and community needs towards the improvement in health of pregnant women, children and communities. Four null hypotheses were generated from the research questions while multiple regression analysis was used to analyse the data. The study found that household…
ERIC Educational Resources Information Center
George, I. N.; Sakirudeen, Abisola Oladeni; Sunday, Adam Happiness
2017-01-01
This study was carried out to investigate Effective Classroom Management and Students' Academic Performance in Secondary schools in Uyo Local Government Area. Four research questions and four null hypotheses were formulated to guide the study. The survey design was adopted for the study. The population of 2044 Senior Secondary School One (SS1)…
ERIC Educational Resources Information Center
Oyeoku, E. K.; Meziobi, D.; Ezegbe, N. B.; Obikwelu, C. L.
2013-01-01
The main purpose of the study was to evolve modalities for preventing domestic violence against women in Nsukka education zone. Three research questions and two null hypotheses guided the study. The sample comprised 150 urban women and 450 rural women in Nsukka education zone. A 20-item questionnaire was developed, validated, and administered to…
ERIC Educational Resources Information Center
Li, Libo; Bentler, Peter M.
2011-01-01
MacCallum, Browne, and Cai (2006) proposed a new framework for evaluation and power analysis of small differences between nested structural equation models (SEMs). In their framework, the null and alternative hypotheses for testing a small difference in fit and its related power analyses were defined by some chosen root-mean-square error of…
ERIC Educational Resources Information Center
Ajobiewe, Theo; Ayena, Olugbenga O.
2012-01-01
The main objective of this study was to find out problems of people with visual impairment in rehabilitation centers in Nigeria. The sample consisted of 600 participants. A Questionnaire was used to collect data for this study. Two null hypotheses were formulated and tested at the 0.05 alpha-level of significance. Data were analyzed using t-test…
Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos
2018-01-01
When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.
Mubayi, Anuj; Castillo-Chavez, Carlos
2018-01-01
Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115
Stookey, Jodi J. D.
2016-01-01
Drinking water has heterogeneous effects on energy intake (EI), energy expenditure (EE), fat oxidation (FO) and weight change in randomized controlled trials (RCTs) involving adults and/or children. The aim of this qualitative review of RCTs was to identify conditions associated with negative, null and beneficial effects of drinking water on EI, EE, FO and weight, to generate hypotheses about ways to optimize drinking water interventions for weight management. RCT conditions that are associated with negative or null effects of drinking water on EI, EE and/or FO in the short term are associated with negative or null effects on weight over the longer term. RCT conditions that are associated with lower EI, increased EE and/or increased FO in the short term are associated with less weight gain or greater weight loss over time. Drinking water instead of caloric beverages decreases EI when food intake is ad libitum. Drinking water increases EE in metabolically-inflexible, obese individuals. Drinking water increases FO when blood carbohydrate and/or insulin concentrations are not elevated and when it is consumed instead of caloric beverages or in volumes that alter hydration status. Further research is needed to confirm the observed associations and to determine if/what specific conditions optimize drinking water interventions for weight management. PMID:26729162
Stookey, Jodi J D
2016-01-02
Drinking water has heterogeneous effects on energy intake (EI), energy expenditure (EE), fat oxidation (FO) and weight change in randomized controlled trials (RCTs) involving adults and/or children. The aim of this qualitative review of RCTs was to identify conditions associated with negative, null and beneficial effects of drinking water on EI, EE, FO and weight, to generate hypotheses about ways to optimize drinking water interventions for weight management. RCT conditions that are associated with negative or null effects of drinking water on EI, EE and/or FO in the short term are associated with negative or null effects on weight over the longer term. RCT conditions that are associated with lower EI, increased EE and/or increased FO in the short term are associated with less weight gain or greater weight loss over time. Drinking water instead of caloric beverages decreases EI when food intake is ad libitum. Drinking water increases EE in metabolically-inflexible, obese individuals. Drinking water increases FO when blood carbohydrate and/or insulin concentrations are not elevated and when it is consumed instead of caloric beverages or in volumes that alter hydration status. Further research is needed to confirm the observed associations and to determine if/what specific conditions optimize drinking water interventions for weight management.
ERIC Educational Resources Information Center
Usen, Onodiong Mfreke
2016-01-01
The study examined the relationship between teachers' utilization of school facilities and academic achievement of student nurses in Human Biology in schools of Nursing in Akwa Ibom State. Four (4) specific objectives, four (4) research questions and four (4) null hypotheses were formulated to guide the study. Ex-post facto survey design was…
ERIC Educational Resources Information Center
Oluwuo, S. O.; Enefaa, Bestman Briggs Anthonia
2016-01-01
The study investigated the application of education information management support tools in the promotion of teaching/learning and management of students' performance in federal universities in the South-South zone of Nigeria. Two research questions and two null hypotheses guided the study. The study adopted a descriptive survey design. The…
ERIC Educational Resources Information Center
Adekola, G.; Egbo, Nwoye Charles
2016-01-01
This study examined the influence of traditions and customs on community development in Nkanu West and Nkanu East Local Government Areas of Enugu State. The study was carried out with three objectives and three null hypotheses. The research adopted descriptive survey design with a population of 2,125 members of community Based Organizations in the…
ERIC Educational Resources Information Center
Onukwufor, Jonathan N.; Chukwu, Mercy Anwuri
2017-01-01
The study was conducted to find out the relationship between parenting styles and secondary students' drug addiction among adolescents in secondary schools in Obio-Akpor Local Government Area (L.G.A.) of Rivers State Nigeria. The study was guided by three research questions and similar number of null hypotheses. The study adopted a correlation…
Task-induced Changes in Idiopathic Infantile Nystagmus Vary with Gaze.
Salehi Fadardi, Marzieh; Bathke, Arne C; Harrar, Solomon W; Abel, Larry Allen
2017-05-01
Investigations of infantile nystagmus syndrome (INS) at center or at the null position have reported that INS worsens when visual demand is combined with internal states, e.g. stress. Visual function and INS parameters such as foveation time, frequency, amplitude, and intensity can also be influenced by gaze position. We hypothesized that increases from baseline in visual demand and mental load would affect INS parameters at the null position differently than at other gaze positions. Eleven participants with idiopathic INS were asked to determine the direction of Tumbling-E targets, whose visual demand was varied through changes in size and contrast, using a staircase procedure. Targets appeared between ±25° in 5° steps. The task was repeated with both mental arithmetic and time restriction to impose higher mental load, confirmed through subjective ratings and concurrent physiological measurements. Within-subject comparisons were limited to the null and 15° away from it. No significant main effects of task on any INS parameters were found. At both locations, high mental load worsened task performance metrics, i.e. lowest contrast (P = .001) and smallest optotype size reached (P = .012). There was a significant interaction between mental load and gaze position for foveation time (P = .02) and for the smallest optotype reached (P = .028). The increase in threshold optotype size from the low to high mental load was greater at the null than away from it. During high visual demand, foveation time significantly decreased from baseline at the null as compared to away from it (mean difference ± SE: 14.19 ± 0.7 msec; P = .010). Under high visual demand, the effects of increased mental load on foveation time and visual task performance differed at the null as compared to 15° away from it. Assessment of these effects could be valuable when evaluating INS clinically and when considering its impact on patients' daily activities.
Coffee, R. Lane; Williamson, Ashley J.; Adkins, Christopher M.; Gray, Marisa C.; Page, Terry L.; Broadie, Kendal
2012-01-01
Fragile X syndrome (FXS), caused by loss of the Fragile X Mental Retardation 1 (FMR1) gene product (FMRP), is the most common heritable cause of intellectual disability and autism spectrum disorders. It has been long hypothesized that the phosphorylation of serine 500 (S500) in human FMRP controls its function as an RNA-binding translational repressor. To test this hypothesis in vivo, we employed neuronally targeted expression of three human FMR1 transgenes, including wild-type (hFMR1), dephosphomimetic (S500A-hFMR1) and phosphomimetic (S500D-hFMR1), in the Drosophila FXS disease model to investigate phosphorylation requirements. At the molecular level, dfmr1 null mutants exhibit elevated brain protein levels due to loss of translational repressor activity. This defect is rescued for an individual target protein and across the population of brain proteins by the phosphomimetic, whereas the dephosphomimetic phenocopies the null condition. At the cellular level, dfmr1 null synapse architecture exhibits increased area, branching and bouton number. The phosphomimetic fully rescues these synaptogenesis defects, whereas the dephosphomimetic provides no rescue. The presence of Futsch-positive (microtubule-associated protein 1B) supernumerary microtubule loops is elevated in dfmr1 null synapses. The human phosphomimetic restores normal Futsch loops, whereas the dephosphomimetic provides no activity. At the behavioral level, dfmr1 null mutants exhibit strongly impaired olfactory associative learning. The human phosphomimetic targeted only to the brain-learning center restores normal learning ability, whereas the dephosphomimetic provides absolutely no rescue. We conclude that human FMRP S500 phosphorylation is necessary for its in vivo function as a neuronal translational repressor and regulator of synaptic architecture, and for the manifestation of FMRP-dependent learning behavior. PMID:22080836
The Inhibitory G Protein α-Subunit, Gαz, Promotes Type 1 Diabetes-Like Pathophysiology in NOD Mice.
Fenske, Rachel J; Cadena, Mark T; Harenda, Quincy E; Wienkes, Haley N; Carbajal, Kathryn; Schaid, Michael D; Laundre, Erin; Brill, Allison L; Truchan, Nathan A; Brar, Harpreet; Wisinski, Jaclyn; Cai, Jinjin; Graham, Timothy E; Engin, Feyza; Kimple, Michelle E
2017-06-01
The α-subunit of the heterotrimeric Gz protein, Gαz, promotes β-cell death and inhibits β-cell replication when pancreatic islets are challenged by stressors. Thus, we hypothesized that loss of Gαz protein would preserve functional β-cell mass in the nonobese diabetic (NOD) model, protecting from overt diabetes. We saw that protection from diabetes was robust and durable up to 35 weeks of age in Gαz knockout mice. By 17 weeks of age, Gαz-null NOD mice had significantly higher diabetes-free survival than wild-type littermates. Islets from these mice had reduced markers of proinflammatory immune cell infiltration on both the histological and transcript levels and secreted more insulin in response to glucose. Further analyses of pancreas sections revealed significantly fewer terminal deoxynucleotidyltransferase-mediated dUTP nick end labeling (TUNEL)-positive β-cells in Gαz-null islets despite similar immune infiltration in control mice. Islets from Gαz-null mice also exhibited a higher percentage of Ki-67-positive β-cells, a measure of proliferation, even in the presence of immune infiltration. Finally, β-cell-specific Gαz-null mice phenocopy whole-body Gαz-null mice in their protection from developing hyperglycemia after streptozotocin administration, supporting a β-cell-centric role for Gαz in diabetes pathophysiology. We propose that Gαz plays a key role in β-cell signaling that becomes dysfunctional in the type 1 diabetes setting, accelerating the death of β-cells, which promotes further accumulation of immune cells in the pancreatic islets, and inhibiting a restorative proliferative response. Copyright © 2017 Endocrine Society.
NASA Astrophysics Data System (ADS)
Straka, Mika J.; Caldarelli, Guido; Squartini, Tiziano; Saracco, Fabio
2018-04-01
Bipartite networks provide an insightful representation of many systems, ranging from mutualistic networks of species interactions to investment networks in finance. The analyses of their topological structures have revealed the ubiquitous presence of properties which seem to characterize many—apparently different—systems. Nestedness, for example, has been observed in biological plant-pollinator as well as in country-product exportation networks. Due to the interdisciplinary character of complex networks, tools developed in one field, for example ecology, can greatly enrich other areas of research, such as economy and finance, and vice versa. With this in mind, we briefly review several entropy-based bipartite null models that have been recently proposed and discuss their application to real-world systems. The focus on these models is motivated by the fact that they show three very desirable features: analytical character, general applicability, and versatility. In this respect, entropy-based methods have been proven to perform satisfactorily both in providing benchmarks for testing evidence-based null hypotheses and in reconstructing unknown network configurations from partial information. Furthermore, entropy-based models have been successfully employed to analyze ecological as well as economic systems. As an example, the application of entropy-based null models has detected early-warning signals, both in economic and financial systems, of the 2007-2008 world crisis. Moreover, they have revealed a statistically-significant export specialization phenomenon of country export baskets in international trade, a result that seems to reconcile Ricardo's hypothesis in classical economics with recent findings on the (empirical) diversification industrial production at the national level. Finally, these null models have shown that the information contained in the nestedness is already accounted for by the degree sequence of the corresponding graphs.
Using potential performance theory to test five hypotheses about meta-attribution.
Trafimow, David; Hunt, Gayle; Rice, Stephen; Geels, Kasha
2011-01-01
Based on I. Kant's (1991) distinction between perfect and imperfect duties and the attribution literature pertaining to that distinction, the authors proposed and tested 5 hypotheses about meta-attribution. More specifically, violations of perfect duties have been shown to arouse both more negative affect and stronger correspondent inferences than do violations of imperfect duties (e.g., D. Trafimow, I. K. Bromgard, K. A. Finlay, & T. Ketelaar, 2005). But when it comes to making meta-attributions-that is, guessing the attributions others would make-is the affect differential an advantage or a disadvantage? In addition to the null hypothesis of no effect, the authors proposed and tested additional hypotheses about how negative affect might increase or decrease the effectiveness of people's meta-attribution strategies and how even if there is no effect on strategy effectiveness, negative affect could increase or decrease the consistencies with which these strategies could be used.
Pridemore, William Alex; Freilich, Joshua D
2007-12-01
Since Roe v. Wade, most states have passed laws either restricting or further protecting reproductive rights. During a wave of anti-abortion violence in the early 1990s, several states also enacted legislation protecting abortion clinics, staff, and patients. One hypothesis drawn from the theoretical literature predicts that these laws provide a deterrent effect and thus fewer anti-abortion crimes in states that protect clinics and reproductive rights. An alternative hypothesis drawn from the literature expects a backlash effect from radical members of the movement and thus more crimes in states with protective legislation. We tested these competing hypotheses by taking advantage of unique data sets that gauge the strength of laws protecting clinics and reproductive rights and that provide self-report victimization data from clinics. Employing logistic regression and controlling for several potential covariates, we found null effects and thus no support for either hypothesis. The null findings were consistent across a number of different types of victimization. Our discussion contextualizes these results in terms of previous research on crimes against abortion providers, discusses alternative explanations for the null findings, and considers the implications for future policy development and research.
Filipiak, Katarzyna; Klein, Daniel; Roy, Anuradha
2017-01-01
The problem of testing the separability of a covariance matrix against an unstructured variance-covariance matrix is studied in the context of multivariate repeated measures data using Rao's score test (RST). The RST statistic is developed with the first component of the separable structure as a first-order autoregressive (AR(1)) correlation matrix or an unstructured (UN) covariance matrix under the assumption of multivariate normality. It is shown that the distribution of the RST statistic under the null hypothesis of any separability does not depend on the true values of the mean or the unstructured components of the separable structure. A significant advantage of the RST is that it can be performed for small samples, even smaller than the dimension of the data, where the likelihood ratio test (LRT) cannot be used, and it outperforms the standard LRT in a number of contexts. Monte Carlo simulations are then used to study the comparative behavior of the null distribution of the RST statistic, as well as that of the LRT statistic, in terms of sample size considerations, and for the estimation of the empirical percentiles. Our findings are compared with existing results where the first component of the separable structure is a compound symmetry (CS) correlation matrix. It is also shown by simulations that the empirical null distribution of the RST statistic converges faster than the empirical null distribution of the LRT statistic to the limiting χ 2 distribution. The tests are implemented on a real dataset from medical studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Trust in Leadership DEOCS 4.1 Construct Validity Summary
2017-08-01
Item Corrected Item- Total Correlation Cronbach’s Alpha if Item Deleted Four-point Scale Items I can depend on my immediate supervisor to meet...1974) were used to assess the fit between the data and the factor. The BTS hypothesizes that the correlation matrix is an identity matrix. The...to reject the null hypothesis that the correlation matrix is an identity, and to conclude that the factor analysis is an appropriate method to
ERIC Educational Resources Information Center
Ebele, Uju F.; Olofu, Paul A.
2017-01-01
The study focused on enhancing the standard of teaching and learning in the 21st century via qualitative school-based supervision in secondary schools in Abuja municipal area council. To guide the study, two null hypotheses were formulated. A descriptive survey research design was adopted. The sample of the study constituted of 270 secondary…
Map LineUps: Effects of spatial structure on graphical inference.
Beecham, Roger; Dykes, Jason; Meulemans, Wouter; Slingsby, Aidan; Turkay, Cagatay; Wood, Jo
2017-01-01
Fundamental to the effective use of visualization as an analytic and descriptive tool is the assurance that presenting data visually provides the capability of making inferences from what we see. This paper explores two related approaches to quantifying the confidence we may have in making visual inferences from mapped geospatial data. We adapt Wickham et al.'s 'Visual Line-up' method as a direct analogy with Null Hypothesis Significance Testing (NHST) and propose a new approach for generating more credible spatial null hypotheses. Rather than using as a spatial null hypothesis the unrealistic assumption of complete spatial randomness, we propose spatially autocorrelated simulations as alternative nulls. We conduct a set of crowdsourced experiments (n=361) to determine the just noticeable difference (JND) between pairs of choropleth maps of geographic units controlling for spatial autocorrelation (Moran's I statistic) and geometric configuration (variance in spatial unit area). Results indicate that people's abilities to perceive differences in spatial autocorrelation vary with baseline autocorrelation structure and the geometric configuration of geographic units. These results allow us, for the first time, to construct a visual equivalent of statistical power for geospatial data. Our JND results add to those provided in recent years by Klippel et al. (2011), Harrison et al. (2014) and Kay & Heer (2015) for correlation visualization. Importantly, they provide an empirical basis for an improved construction of visual line-ups for maps and the development of theory to inform geospatial tests of graphical inference.
Detecting Multifractal Properties in Asset Returns:
NASA Astrophysics Data System (ADS)
Lux, Thomas
It has become popular recently to apply the multifractal formalism of statistical physics (scaling analysis of structure functions and f(α) singularity spectrum analysis) to financial data. The outcome of such studies is a nonlinear shape of the structure function and a nontrivial behavior of the spectrum. Eventually, this literature has moved from basic data analysis to estimation of particular variants of multifractal models for asset returns via fitting of the empirical τ(q) and f(α) functions. Here, we reinvestigate earlier claims of multifractality using four long time series of important financial markets. Taking the recently proposed multifractal models of asset returns as our starting point, we show that the typical "scaling estimators" used in the physics literature are unable to distinguish between spurious and "true" multiscaling of financial data. Designing explicit tests for multiscaling, we can in no case reject the null hypothesis that the apparent curvature of both the scaling function and the Hölder spectrum are spuriously generated by the particular fat-tailed distribution of financial data. Given the well-known overwhelming evidence in favor of different degrees of long-term dependence in the powers of returns, we interpret this inability to reject the null hypothesis of multiscaling as a lack of discriminatory power of the standard approach rather than as a true rejection of multiscaling. However, the complete "failure" of the multifractal apparatus in this setting also raises the question whether results in other areas (like geophysics) suffer from similar shortcomings of the traditional methodology.
van Assen, Marcel A L M; van Aert, Robbie C M; Nuijten, Michèle B; Wicherts, Jelte M
2014-01-01
De Winter and Happee examined whether science based on selective publishing of significant results may be effective in accurate estimation of population effects, and whether this is even more effective than a science in which all results are published (i.e., a science without publication bias). Based on their simulation study they concluded that "selective publishing yields a more accurate meta-analytic estimation of the true effect than publishing everything, (and that) publishing nonreplicable results while placing null results in the file drawer can be beneficial for the scientific collective" (p.4). Using their scenario with a small to medium population effect size, we show that publishing everything is more effective for the scientific collective than selective publishing of significant results. Additionally, we examined a scenario with a null effect, which provides a more dramatic illustration of the superiority of publishing everything over selective publishing. Publishing everything is more effective than only reporting significant outcomes.
van Assen, Marcel A. L. M.; van Aert, Robbie C. M.; Nuijten, Michèle B.; Wicherts, Jelte M.
2014-01-01
Background De Winter and Happee [1] examined whether science based on selective publishing of significant results may be effective in accurate estimation of population effects, and whether this is even more effective than a science in which all results are published (i.e., a science without publication bias). Based on their simulation study they concluded that “selective publishing yields a more accurate meta-analytic estimation of the true effect than publishing everything, (and that) publishing nonreplicable results while placing null results in the file drawer can be beneficial for the scientific collective” (p.4). Methods and Findings Using their scenario with a small to medium population effect size, we show that publishing everything is more effective for the scientific collective than selective publishing of significant results. Additionally, we examined a scenario with a null effect, which provides a more dramatic illustration of the superiority of publishing everything over selective publishing. Conclusion Publishing everything is more effective than only reporting significant outcomes. PMID:24465448
Testing for Polytomies in Phylogenetic Species Trees Using Quartet Frequencies.
Sayyari, Erfan; Mirarab, Siavash
2018-02-28
Phylogenetic species trees typically represent the speciation history as a bifurcating tree. Speciation events that simultaneously create more than two descendants, thereby creating polytomies in the phylogeny, are possible. Moreover, the inability to resolve relationships is often shown as a (soft) polytomy. Both types of polytomies have been traditionally studied in the context of gene tree reconstruction from sequence data. However, polytomies in the species tree cannot be detected or ruled out without considering gene tree discordance. In this paper, we describe a statistical test based on properties of the multi-species coalescent model to test the null hypothesis that a branch in an estimated species tree should be replaced by a polytomy. On both simulated and biological datasets, we show that the null hypothesis is rejected for all but the shortest branches, and in most cases, it is retained for true polytomies. The test, available as part of the Accurate Species TRee ALgorithm (ASTRAL) package, can help systematists decide whether their datasets are sufficient to resolve specific relationships of interest.
Reinterpreting maximum entropy in ecology: a null hypothesis constrained by ecological mechanism.
O'Dwyer, James P; Rominger, Andrew; Xiao, Xiao
2017-07-01
Simplified mechanistic models in ecology have been criticised for the fact that a good fit to data does not imply the mechanism is true: pattern does not equal process. In parallel, the maximum entropy principle (MaxEnt) has been applied in ecology to make predictions constrained by just a handful of state variables, like total abundance or species richness. But an outstanding question remains: what principle tells us which state variables to constrain? Here we attempt to solve both problems simultaneously, by translating a given set of mechanisms into the state variables to be used in MaxEnt, and then using this MaxEnt theory as a null model against which to compare mechanistic predictions. In particular, we identify the sufficient statistics needed to parametrise a given mechanistic model from data and use them as MaxEnt constraints. Our approach isolates exactly what mechanism is telling us over and above the state variables alone. © 2017 John Wiley & Sons Ltd/CNRS.
Testing for Polytomies in Phylogenetic Species Trees Using Quartet Frequencies
Sayyari, Erfan
2018-01-01
Phylogenetic species trees typically represent the speciation history as a bifurcating tree. Speciation events that simultaneously create more than two descendants, thereby creating polytomies in the phylogeny, are possible. Moreover, the inability to resolve relationships is often shown as a (soft) polytomy. Both types of polytomies have been traditionally studied in the context of gene tree reconstruction from sequence data. However, polytomies in the species tree cannot be detected or ruled out without considering gene tree discordance. In this paper, we describe a statistical test based on properties of the multi-species coalescent model to test the null hypothesis that a branch in an estimated species tree should be replaced by a polytomy. On both simulated and biological datasets, we show that the null hypothesis is rejected for all but the shortest branches, and in most cases, it is retained for true polytomies. The test, available as part of the Accurate Species TRee ALgorithm (ASTRAL) package, can help systematists decide whether their datasets are sufficient to resolve specific relationships of interest. PMID:29495636
Beyond statistical inference: A decision theory for science
KILLEEN, PETER R.
2008-01-01
Traditional null hypothesis significance testing does not yield the probability of the null or its alternative and, therefore, cannot logically ground scientific decisions. The decision theory proposed here calculates the expected utility of an effect on the basis of (1) the probability of replicating it and (2) a utility function on its size. It takes significance tests—which place all value on the replicability of an effect and none on its magnitude—as a special case, one in which the cost of a false positive is revealed to be an order of magnitude greater than the value of a true positive. More realistic utility functions credit both replicability and effect size, integrating them for a single index of merit. The analysis incorporates opportunity cost and is consistent with alternate measures of effect size, such as r2 and information transmission, and with Bayesian model selection criteria. An alternate formulation is functionally equivalent to the formal theory, transparent, and easy to compute. PMID:17201351
Beyond statistical inference: a decision theory for science.
Killeen, Peter R
2006-08-01
Traditional null hypothesis significance testing does not yield the probability of the null or its alternative and, therefore, cannot logically ground scientific decisions. The decision theory proposed here calculates the expected utility of an effect on the basis of (1) the probability of replicating it and (2) a utility function on its size. It takes significance tests--which place all value on the replicability of an effect and none on its magnitude--as a special case, one in which the cost of a false positive is revealed to be an order of magnitude greater than the value of a true positive. More realistic utility functions credit both replicability and effect size, integrating them for a single index of merit. The analysis incorporates opportunity cost and is consistent with alternate measures of effect size, such as r2 and information transmission, and with Bayesian model selection criteria. An alternate formulation is functionally equivalent to the formal theory, transparent, and easy to compute.
p-Curve and p-Hacking in Observational Research.
Bruns, Stephan B; Ioannidis, John P A
2016-01-01
The p-curve, the distribution of statistically significant p-values of published studies, has been used to make inferences on the proportion of true effects and on the presence of p-hacking in the published literature. We analyze the p-curve for observational research in the presence of p-hacking. We show by means of simulations that even with minimal omitted-variable bias (e.g., unaccounted confounding) p-curves based on true effects and p-curves based on null-effects with p-hacking cannot be reliably distinguished. We also demonstrate this problem using as practical example the evaluation of the effect of malaria prevalence on economic growth between 1960 and 1996. These findings call recent studies into question that use the p-curve to infer that most published research findings are based on true effects in the medical literature and in a wide range of disciplines. p-values in observational research may need to be empirically calibrated to be interpretable with respect to the commonly used significance threshold of 0.05. Violations of randomization in experimental studies may also result in situations where the use of p-curves is similarly unreliable.
Tests of Hypotheses Arising In the Correlated Random Coefficient Model*
Heckman, James J.; Schmierer, Daniel
2010-01-01
This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model. PMID:21170148
2011-06-03
Permutationalmultivariate analysis of variance ( PerMANOVA ; McArdle and Anderson, 2001) was used to test hypotheses regard- ing regions and invasion level...for the differences due to invasion level after removing any differences due to regions, soil texture, and habitat. The null distribution for PerMANOVA ...soil neigh- borhoods, PerMANOVA tests were carried out separately for each site. We did not use a stratified randomization scheme for these tests, under
An exploratory analysis of the relationship between ambient ...
Background: Associations between ozone (O3) and fine particulate matter (PM2.5) concentrations and birth outcomes have been previously demonstrated. We perform an exploratory analysis of O3 and PM2.5 concentrations during early pregnancy and multiple types of birth defects. Methods: Data on births were obtained from the Texas Birth Defects Registry and the National Birth Defects Prevention Study (NBDPS) in Texas. Air pollution concentrations were determined using a Bayesian hierarchical model that combined modeled air pollution concentrations with air monitoring data to create bias-corrected concentrations and matched to residential address at birth. Average air pollution concentrations during the first trimester were calculated. Results: The analysis generated hypotheses for future, confirmatory studies; although many of the observed associations between the air pollutants and birth defects were null. The hypotheses are provided by an observed association between O3 and craniosynostosis [adjusted OR 1.28 (95% CI 1.04, 1.58) per 13.3 ppb increase) and observed inverse associations between PM2.5 concentrations and septal heart defects and obstructive heart defects [adjusted ORs 0.79 (95% CI 0.75, 0.82) and 0.88 (95% CI 0.79, 0.97) per 5.0 µg/m3 increase, respectively] in the Texas Birth Defects Registry study. Septal heart defects and ventricular outflow tract obstructions were also examined using the NBDPS but the associations with PM2.5 were null [adj
True self-alienation positively predicts reports of mindwandering.
Vess, Matthew; Leal, Stephanie A; Hoeldtke, Russell T; Schlegel, Rebecca J; Hicks, Joshua A
2016-10-01
Two studies assessed the relationship between feelings of uncertainty about who one truly is (i.e., true self-alienation) and self-reported task-unrelated thoughts (i.e., mindwandering) during performance tasks. Because true self-alienation is conceptualized as the subjective disconnect between conscious awareness and actual experience, we hypothesized that greater feelings of true self-alienation would positively relate to subjective reports of mindwandering. Two convergent studies supported this hypothesis. Moreover, this relationship could not consistently be accounted for by the independent influence of other aspects of authenticity, negative mood, mindfulness, or broad personality dimensions. These findings suggest that individual differences in true self-alienation are reliably associated with subjective reports of mindwandering. The implications of these findings for the true self-alienation construct, the ways that personality relates to mindwandering, and future research directions focused on curtailing mindwandering and improving performance and achievement are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
The Psychological Benefits of Being Authentic on Facebook.
Grieve, Rachel; Watkinson, Jarrah
2016-07-01
Having others acknowledge and validate one's true self is associated with better psychological health. Existing research indicates that an individual's true self may be more readily expressed on Facebook than in person. This study brought together these two premises by investigating for the first time the psychosocial outcomes associated with communicating one's true self on Facebook. Participants (n = 164) completed a personality assessment once as their true self and once as the self they present on Facebook (Facebook self), as well as measures of social connectedness, subjective well-being, depression, anxiety, and stress. Euclidean distances quantified the difference between one's true self and the Facebook self. Hypotheses received partial support. Better coherence between the true self and the Facebook self was associated with better social connectedness and less stress. Two models provided evidence of mediation effects. Findings highlight that authentic self-presentation on Facebook can be associated with positive psychological outcomes.
Brasier, Martin
2015-01-01
In 1673, Martin Lister explored the preservation of ‘St Cuthbert's beads’ plus other fossil crinoid remains from approximately 350 Ma Carboniferous limestone in northern England. He used taphonomic evidence (transport, disarticulation, burial and cementation) to infer an origin as petrified plant remains, in contrast with his views expressed elsewhere that fossil mollusc shells could have formed abiogenically, by ‘plastic forces’ within rock. Lister also observed pentagonal symmetry, now seen as characteristic of living echinoderm skeletons. A postscript from John Ray supports Lister's ‘taphonomic’ observations and accepts the biogenicity of these fossil ‘vegetables’. Ray then concluded with a prophecy, predicting the discovery of comparable living fossils in remote ocean waters. These early discussions compare with current debates about the character of candidate microfossils from the early Earth and Mars. Interesting biomorphs are now tested against the abiogenic null hypotheses, making use of features such as those pioneered by Lister, including evidence for geological context, rules for growth and taphonomy. Advanced techniques now allow us to extend this list of criteria to include the nanoscale mapping of biology-like behaviour patterns plus metabolic pathways. Whereas the science of palaeobiology once began with tests for biogenicity, the same is now true for geobiology and astrobiology. This commentary was written to celebrate the 350th anniversary of the journal Philosophical Transactions of the Royal Society. PMID:25750150
Statistical testing of association between menstruation and migraine.
Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G
2015-02-01
To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are wanted, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To our best knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operator characteristic curve analysis. Quick reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
The Leishmania major BBSome subunit BBS1 is essential for parasite virulence in the mammalian host
Price, Helen P; Paape, Daniel; Hodgkinson, Michael R; Farrant, Katie; Doehl, Johannes; Stark, Meg; Smith, Deborah F
2013-01-01
Summary Bardet–Biedl syndrome (BBS) is a human genetic disorder with a spectrum of symptoms caused by primary cilium dysfunction. The disease is caused by mutations in one of at least 17 identified genes, of which seven encode subunits of the BBSome, a protein complex required for specific trafficking events to and from the primary cilium. The molecular mechanisms associated with BBSome function remain to be fully elucidated. Here, we generated null and complemented mutants of the BBSome subunit BBS1 in the protozoan parasite, Leishmania. In the absence of BBS1, extracellular parasites have no apparent defects in growth, flagellum assembly, motility or differentiation in vitro but there is accumulation of vacuole-like structures close to the flagellar pocket. Infectivity of these parasites for macrophages in vitro is reduced compared with wild-type controls but the null parasites retain the ability to differentiate to the intracellular amastigote stage. However, infectivity of BBS1 null parasites is severely compromised in a BALB/c mouse footpad model. We hypothesize that the absence of BBS1 in Leishmania leads to defects in specific trafficking events that affect parasite persistence in the host. This is the first report of an association between the BBSome complex and pathogen infectivity. PMID:23998526
Prajapati, Saumya; Tao, Jinhui; Ruan, Qichao; De Yoreo, James J.; Moradian-Oldak, Janet
2015-01-01
Reconstruction of enamel-like materials is a central topic of research in dentistry and material sciences. The importance of precise proteolytic mechanisms in amelogenesis to form a hard tissue with more than 95% mineral content has already been reported. A mutation in the Matrix Metalloproteinase-20 (MMP-20) gene results in hypomineralized enamel that is thin, disorganized and breaks from the underlying dentin. We hypothesized that the absence of MMP-20 during amelogenesis results in the occlusion of amelogenin in the enamel hydroxyapatite crystals. We used spectroscopy and electron microscopy techniques to qualitatively and quantitatively analyze occluded proteins within the isolated enamel crystals from MMP-20 null and Wild type (WT) mice. Our results showed that the isolated enamel crystals of MMP-20 null mice had more organic macromolecules occluded inside them than enamel crystals from the WT. The crystal lattice arrangements of MMP-20 null enamel crystals analyzed by High Resolution Transmission Electron Microscopy (HRTEM) were found to be significantly different from those of the WT. Raman studies indicated that the crystallinity of the MMP-20 null enamel crystals was lower than that of the WT. In conclusion, we present a novel functional mechanism of MMP-20, specifically prevention of unwanted organic material entrapped in the forming enamel crystals, which occurs as the result of precise amelogenin cleavage. MMP-20 action guides the growth morphology of the forming hydroxyapatite crystals and enhances their crystallinity. Elucidating such molecular mechanisms can be applied in the design of novel biomaterials for future clinical applications in dental restoration or repair. PMID:26513418
Prajapati, Saumya; Tao, Jinhui; Ruan, Qichao; De Yoreo, James J; Moradian-Oldak, Janet
2016-01-01
Reconstruction of enamel-like materials is a central topic of research in dentistry and material sciences. The importance of precise proteolytic mechanisms in amelogenesis to form a hard tissue with more than 95% mineral content has already been reported. A mutation in the Matrix Metalloproteinase-20 (MMP-20) gene results in hypomineralized enamel that is thin, disorganized and breaks from the underlying dentin. We hypothesized that the absence of MMP-20 during amelogenesis results in the occlusion of amelogenin in the enamel hydroxyapatite crystals. We used spectroscopy and electron microscopy techniques to qualitatively and quantitatively analyze occluded proteins within the isolated enamel crystals from MMP-20 null and Wild type (WT) mice. Our results showed that the isolated enamel crystals of MMP-20 null mice had more organic macromolecules occluded inside them than enamel crystals from the WT. The crystal lattice arrangements of MMP-20 null enamel crystals analyzed by High Resolution Transmission Electron Microscopy (HRTEM) were found to be significantly different from those of the WT. Raman studies indicated that the crystallinity of the MMP-20 null enamel crystals was lower than that of the WT. In conclusion, we present a novel functional mechanism of MMP-20, specifically prevention of unwanted organic material entrapped in the forming enamel crystals, which occurs as the result of precise amelogenin cleavage. MMP-20 action guides the growth morphology of the forming hydroxyapatite crystals and enhances their crystallinity. Elucidating such molecular mechanisms can be applied in the design of novel biomaterials for future clinical applications in dental restoration or repair. Copyright © 2015 Elsevier Ltd. All rights reserved.
Share Market Analysis Using Various Economical Determinants to Predict Decision of Investors
NASA Astrophysics Data System (ADS)
Ghosh, Arijit; Roy, Samrat; Bandyopadhyay, Gautam; Choudhuri, Kripasindhu
2010-10-01
The following paper tries to develop six major hypotheses in Bombay Stock Exchange (BSE) in India. The paper tries to proof the hypothesis by collecting data from the fields on six sectors: oil prices, gold price, Cash Reserve Ratio, food price inflation, call money rate and Dollar price. The research uses these data as indicators to identify relationship and level of influence on Share prices of Bombay Stock Exchange by rejecting and accepting the null hypothesis.
Nonparametric tests for interaction and group differences in a two-way layout.
Fisher, A C; Wallenstein, S
1991-01-01
Nonparametric tests of group differences and interaction across strata are developed in which the null hypotheses for these tests are expressed as functions of rho i = P(X > Y) + 1/2P(X = Y), where X refers to a random observation from one group and Y refers to a random observation from the other group within stratum i. The estimator r of the parameter rho is shown to be a useful way to summarize and examine data for ordinal and continuous data.
Hodge, Greg; Jersmann, Hubertus; Tran, Hai B; Roscioli, Eugene; Holmes, Mark; Reynolds, Paul N; Hodge, Sandra
2015-10-24
Histone acetyltransferases (HAT) and histone deacetylases (HDAC) are enzymes that upregulate and down-regulate pro-inflammatory gene transcription respectively. HDAC2 is required by corticosteroids to switch off activated inflammatory genes and is reduced in lung macrophages in COPD. We have shown that COPD patients have increased steroid resistant CD28null (senescent) pro-inflammatory T and NKT-like peripheral blood cells (particularly CD8+ subsets) and we hypothesized that these changes would be associated with a loss of HDAC2 from these senescent pro-inflammatory lymphocytes. Blood was collected from 10 COPD and 10 aged-matched controls. Intracellular pro-inflammatory cytokines, IFNγ and TNFα, and expression of CD28, HDAC2 and HAT, were determined in lymphocyte subsets in the presence of ± 5 mg/ml theophylline (HDAC2 activator), 10 μM prednisolone and 2.5 ng/ml cyclosporine A (immunosuppressant), using flow cytometry. There was a loss of HDAC2 from CD28null CD8+ T and NKT-like cells in COPD. There was a significant negative correlation between HDAC2 expression and the percentage of CD28null CD8+ T and NKT-like cells producing IFNγ or TNFα in all subjects (eg, COPD: R = -.763, p < 0.001 for T-cell IFNγ). There was a synergistic upregulation of HDAC2 and associated decrease in pro-inflammatory cytokine production in CD28nullCD8+ T and NKT-like cells in the presence of 5 mg/L theophylline + 10(-6) M prednisolone or 2.5 ng/mL cyclosporine A (CsA). Lymphocyte senescence in COPD is associated with loss of HDAC2 in CD28nullCD8+ T and NKT-like cells. Alternative treatment options such as combined theophylline with low-dose CsA, that inhibit these pro-inflammatory cells, may reduce systemic inflammation in COPD.
Hodge, Greg; Jersmann, Hubertus; Tran, Hai B; Holmes, Mark; Reynolds, Paul N; Hodge, Sandra
2015-01-09
Glucocorticoid (GC) resistance is a major barrier in COPD treatment. We have shown increased expression of the drug efflux pump, Pgp1 in cytotoxic/pro-inflammatory lymphocytes in COPD. Loss of lymphocyte co-stimulatory molecule CD28 (lymphocyte senescence) was associated with a further increase in their pro-inflammatory/cytotoxic potential and resistance to GC. We hypothesized that lymphocyte senescence and increased Pgp1 are also associated with down-regulation of the GC receptor (GCR). Blood was collected from 10 COPD and 10 healthy aged-matched controls. Flow cytometry was applied to assess intracellular pro-inflammatory cytokines, CD28, Pgp1, GCR, steroid binding and relative cytoplasm/nuclear GCR by CD28+ and CD28null T, NKT-like cells. GCR localization was confirmed by fluorescent microscopy. COPD was associated with increased numbers of CD28nullCD8+ T and NKT-like cells. Loss of CD28 was associated with an increased percentage of T and NKT-like cells producing IFNγ or TNFα and associated with a loss of GCR and Dex-Fluor staining but unchanged Pgp1. There was a significant loss of GCR in CD8 + CD28null compared with CD8 + CD28+ T and NKT-like cells from both COPD and controls (eg, mean ± SEM 8 ± 3% GCR + CD8 + CD28null T-cells vs 49 ± 5% GCR + CD8 + CD28+ T-cells in COPD). There was a significant negative correlation between GCR expression and IFNγ and TNFα production by T and NKT-like cells(eg, COPD: T-cell IFNγ R = -.615; ) and with FEV1 in COPD (R = -.777). COPD is associated with loss of GCR in senescent CD28null and NKT-like cells suggesting alternative treatment options to GC are required to inhibit these pro-inflammatory/cytotoxic cells.
Hodge, Greg; Roscioli, Eugene; Jersmann, Hubertus; Tran, Hai B; Holmes, Mark; Reynolds, Paul N; Hodge, Sandra
2016-10-21
Corticosteroid resistance is a major barrier to effective treatment of COPD. We have shown that the resistance is associated with decreased expression of glucocorticoid receptor (GCR) by senescent CD28nullCD8+ pro-inflammatory lymphocytes in peripheral blood of COPD patients. GCR must be bound to molecular chaperones heat shock proteins (Hsp) 70 and Hsp90 to acquire a high-affinity steroid binding conformation, and traffic to the nucleus. We hypothesized a loss of Hsp70/90 from these lymphocytes may further contribute to steroid resistance in COPD. Blood was collected from COPD (n = 10) and aged-matched controls (n = 10). To assess response to steroids, cytotoxic mediators, intracellular pro-inflammatory cytokines, CD28, GCR, Hsp70 and Hsp90 were determined in T and NKT-like cells in the presence of ± 10 μM prednisolone and 2.5 ng/mL cyclosporine A (binds to GCR-Hsp70/90 complex) using flow cytometry, western blot and fluorescence microscopy. A loss of expression of Hsp90 and GCR from CD28null CD8+ T and NKT-like cells in COPD was noted (Hsp70 unchanged). Loss of Hsp90 expression correlated with the percentage of CD28null CD8+ T and NKT-like cells producing IFNγ or TNFα in all subjects (eg, COPD: R = -0.763, p = 0.007 for T-cell IFNγ). Up-regulation of Hsp90 and associated decrease in pro-inflammatory cytokine production was found in CD28nullCD8+ T and NKT-like cells in the presence of 10 μM prednisolone and 2.5 ng/mL cyclosporine A. Loss of Hsp90 from cytotoxic/pro-inflammatory CD28nullCD8+ T and NKT-like cells could contribute to steroid resistance in COPD. Combination prednisolone and low-dose cyclosporine A therapy inhibits these pro-inflammatory cells and may reduce systemic inflammation in COPD.
A large scale test of the gaming-enhancement hypothesis.
Przybylski, Andrew K; Wang, John C
2016-01-01
A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis , has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
Rank-based permutation approaches for non-parametric factorial designs.
Umlauft, Maria; Konietschke, Frank; Pauly, Markus
2017-11-01
Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this set-up Wald-type statistics and ANOVA-type statistics are the current state of the art. The first method is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample size, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies foster these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
A more powerful test based on ratio distribution for retention noninferiority hypothesis.
Deng, Ling; Chen, Gang
2013-03-11
Rothmann et al. ( 2003 ) proposed a method for the statistical inference of fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect verse the control effect in the context of a time to event endpoint. One of the major concerns using this method in the design of an NI trial is that with a limited sample size, the power of the study is usually very low. This makes an NI trial not applicable particularly when using time to event endpoint. To improve power, Wang et al. ( 2006 ) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption used in Rothmann's test, that the observed control effect is always positive, that is, the observed hazard ratio for placebo over the control is greater than 1, is no longer necessary. Without assuming equal variance under null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced if using the proposed ratio test for a fraction retention NI hypothesis.
Type-II generalized family-wise error rate formulas with application to sample size determination.
Delorme, Phillipe; de Micheaux, Pierre Lafaye; Liquet, Benoit; Riou, Jérémie
2016-07-20
Multiple endpoints are increasingly used in clinical trials. The significance of some of these clinical trials is established if at least r null hypotheses are rejected among m that are simultaneously tested. The usual approach in multiple hypothesis testing is to control the family-wise error rate, which is defined as the probability that at least one type-I error is made. More recently, the q-generalized family-wise error rate has been introduced to control the probability of making at least q false rejections. For procedures controlling this global type-I error rate, we define a type-II r-generalized family-wise error rate, which is directly related to the r-power defined as the probability of rejecting at least r false null hypotheses. We obtain very general power formulas that can be used to compute the sample size for single-step and step-wise procedures. These are implemented in our R package rPowerSampleSize available on the CRAN, making them directly available to end users. Complexities of the formulas are presented to gain insight into computation time issues. Comparison with Monte Carlo strategy is also presented. We compute sample sizes for two clinical trials involving multiple endpoints: one designed to investigate the effectiveness of a drug against acute heart failure and the other for the immunogenicity of a vaccine strategy against pneumococcus. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.
p-Curve and p-Hacking in Observational Research
Bruns, Stephan B.; Ioannidis, John P. A.
2016-01-01
The p-curve, the distribution of statistically significant p-values of published studies, has been used to make inferences on the proportion of true effects and on the presence of p-hacking in the published literature. We analyze the p-curve for observational research in the presence of p-hacking. We show by means of simulations that even with minimal omitted-variable bias (e.g., unaccounted confounding) p-curves based on true effects and p-curves based on null-effects with p-hacking cannot be reliably distinguished. We also demonstrate this problem using as practical example the evaluation of the effect of malaria prevalence on economic growth between 1960 and 1996. These findings call recent studies into question that use the p-curve to infer that most published research findings are based on true effects in the medical literature and in a wide range of disciplines. p-values in observational research may need to be empirically calibrated to be interpretable with respect to the commonly used significance threshold of 0.05. Violations of randomization in experimental studies may also result in situations where the use of p-curves is similarly unreliable. PMID:26886098
A Closer Look at Data Independence: Comment on “Lies, Damned Lies, and Statistics (in Geology)”
NASA Astrophysics Data System (ADS)
Kravtsov, Sergey; Saunders, Rolando Olivas
2011-02-01
In his Forum (Eos, 90(47), 443, doi:10.1029/2009EO470004, 2009), P. Vermeesch suggests that statistical tests are not fit to interpret long data records. He asserts that for large enough data sets any true null hypothesis will always be rejected. This is certainly not the case! Here we revisit this author's example of weekly distribution of earthquakes and show that statistical results support the commonsense expectation that seismic activity does not depend on weekday (see the online supplement to this Eos issue for details (http://www.agu.org/eos_elec/)).
The earth is flat (p > 0.05): significance thresholds and the crisis of unreplicable research.
Amrhein, Valentin; Korner-Nievergelt, Fränzi; Roth, Tobias
2017-01-01
The widespread use of 'statistical significance' as a license for making a claim of a scientific finding leads to considerable distortion of the scientific process (according to the American Statistical Association). We review why degrading p -values into 'significant' and 'nonsignificant' contributes to making studies irreproducible, or to making them seem irreproducible. A major problem is that we tend to take small p -values at face value, but mistrust results with larger p -values. In either case, p -values tell little about reliability of research, because they are hardly replicable even if an alternative hypothesis is true. Also significance ( p ≤ 0.05) is hardly replicable: at a good statistical power of 80%, two studies will be 'conflicting', meaning that one is significant and the other is not, in one third of the cases if there is a true effect. A replication can therefore not be interpreted as having failed only because it is nonsignificant. Many apparent replication failures may thus reflect faulty judgment based on significance thresholds rather than a crisis of unreplicable research. Reliable conclusions on replicability and practical importance of a finding can only be drawn using cumulative evidence from multiple independent studies. However, applying significance thresholds makes cumulative knowledge unreliable. One reason is that with anything but ideal statistical power, significant effect sizes will be biased upwards. Interpreting inflated significant results while ignoring nonsignificant results will thus lead to wrong conclusions. But current incentives to hunt for significance lead to selective reporting and to publication bias against nonsignificant findings. Data dredging, p -hacking, and publication bias should be addressed by removing fixed significance thresholds. Consistent with the recommendations of the late Ronald Fisher, p -values should be interpreted as graded measures of the strength of evidence against the null hypothesis. Also larger p -values offer some evidence against the null hypothesis, and they cannot be interpreted as supporting the null hypothesis, falsely concluding that 'there is no effect'. Information on possible true effect sizes that are compatible with the data must be obtained from the point estimate, e.g., from a sample average, and from the interval estimate, such as a confidence interval. We review how confusion about interpretation of larger p -values can be traced back to historical disputes among the founders of modern statistics. We further discuss potential arguments against removing significance thresholds, for example that decision rules should rather be more stringent, that sample sizes could decrease, or that p -values should better be completely abandoned. We conclude that whatever method of statistical inference we use, dichotomous threshold thinking must give way to non-automated informed judgment.
The earth is flat (p > 0.05): significance thresholds and the crisis of unreplicable research
Korner-Nievergelt, Fränzi; Roth, Tobias
2017-01-01
The widespread use of ‘statistical significance’ as a license for making a claim of a scientific finding leads to considerable distortion of the scientific process (according to the American Statistical Association). We review why degrading p-values into ‘significant’ and ‘nonsignificant’ contributes to making studies irreproducible, or to making them seem irreproducible. A major problem is that we tend to take small p-values at face value, but mistrust results with larger p-values. In either case, p-values tell little about reliability of research, because they are hardly replicable even if an alternative hypothesis is true. Also significance (p ≤ 0.05) is hardly replicable: at a good statistical power of 80%, two studies will be ‘conflicting’, meaning that one is significant and the other is not, in one third of the cases if there is a true effect. A replication can therefore not be interpreted as having failed only because it is nonsignificant. Many apparent replication failures may thus reflect faulty judgment based on significance thresholds rather than a crisis of unreplicable research. Reliable conclusions on replicability and practical importance of a finding can only be drawn using cumulative evidence from multiple independent studies. However, applying significance thresholds makes cumulative knowledge unreliable. One reason is that with anything but ideal statistical power, significant effect sizes will be biased upwards. Interpreting inflated significant results while ignoring nonsignificant results will thus lead to wrong conclusions. But current incentives to hunt for significance lead to selective reporting and to publication bias against nonsignificant findings. Data dredging, p-hacking, and publication bias should be addressed by removing fixed significance thresholds. Consistent with the recommendations of the late Ronald Fisher, p-values should be interpreted as graded measures of the strength of evidence against the null hypothesis. Also larger p-values offer some evidence against the null hypothesis, and they cannot be interpreted as supporting the null hypothesis, falsely concluding that ‘there is no effect’. Information on possible true effect sizes that are compatible with the data must be obtained from the point estimate, e.g., from a sample average, and from the interval estimate, such as a confidence interval. We review how confusion about interpretation of larger p-values can be traced back to historical disputes among the founders of modern statistics. We further discuss potential arguments against removing significance thresholds, for example that decision rules should rather be more stringent, that sample sizes could decrease, or that p-values should better be completely abandoned. We conclude that whatever method of statistical inference we use, dichotomous threshold thinking must give way to non-automated informed judgment. PMID:28698825
A SIGNIFICANCE TEST FOR THE LASSO1
Lockhart, Richard; Taylor, Jonathan; Tibshirani, Ryan J.; Tibshirani, Robert
2014-01-01
In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic distribution under the null hypothesis (the null being that all truly active variables are contained in the current lasso model). Our proof of this result for the special case of the first predictor to enter the model (i.e., testing for a single significant predictor variable against the global null) requires only weak assumptions on the predictor matrix X. On the other hand, our proof for a general step in the lasso path places further technical assumptions on X and the generative model, but still allows for the important high-dimensional case p > n, and does not necessarily require that the current lasso model achieves perfect recovery of the truly active variables. Of course, for testing the significance of an additional variable between two nested linear models, one typically uses the chi-squared test, comparing the drop in residual sum of squares (RSS) to a χ12 distribution. But when this additional variable is not fixed, and has been chosen adaptively or greedily, this test is no longer appropriate: adaptivity makes the drop in RSS stochastically much larger than χ12 under the null hypothesis. Our analysis explicitly accounts for adaptivity, as it must, since the lasso builds an adaptive sequence of linear models as the tuning parameter λ decreases. In this analysis, shrinkage plays a key role: though additional variables are chosen adaptively, the coefficients of lasso active variables are shrunken due to the l1 penalty. Therefore, the test statistic (which is based on lasso fitted values) is in a sense balanced by these two opposing properties—adaptivity and shrinkage—and its null distribution is tractable and asymptotically Exp(1). PMID:25574062
Fractional parentage analysis and a scale-free reproductive network of brown trout.
Koyano, Hitoshi; Serbezov, Dimitar; Kishino, Hirohisa; Schweder, Tore
2013-11-07
In this study, we developed a method of fractional parentage analysis using microsatellite markers. We propose a method for calculating parentage probability, which considers missing data and genotyping errors due to null alleles and other causes, by regarding observed alleles as realizations of random variables which take values in the set of alleles at the locus and developing a method for simultaneously estimating the true and null allele frequencies of all alleles at each locus. We then applied our proposed method to a large sample collected from a wild population of brown trout (Salmo trutta). On analyzing the data using our method, we found that the reproductive success of brown trout obeyed a power law, indicating that when the parent-offspring relationship is regarded as a link, the reproductive system of brown trout is a scale-free network. Characteristics of the reproductive network of brown trout include individuals with large bodies as hubs in the network and different power exponents of degree distributions between males and females. © 2013 Elsevier Ltd. All rights reserved.
Ding, Shengli; Blue, Randal E.; Morgan, Douglas R.; Lund, Pauline K.
2015-01-01
Background Activatable near-infrared fluorescent (NIRF) probes have been used for ex vivo and in vivo detection of intestinal tumors in animal models. We hypothesized that NIRF probes activatable by cathepsins or MMPs will detect and quantify dextran sulphate sodium (DSS) induced acute colonic inflammation in wild type (WT) mice or chronic colitis in IL-10 null mice ex vivo or in vivo. Methods WT mice given DSS, water controls and IL-10 null mice with chronic colitis were administered probes by retro-orbital injection. FMT2500 LX system imaged fresh and fixed intestine ex vivo and mice in vivo. Inflammation detected by probes was verified by histology and colitis scoring. NIRF signal intensity was quantified using 2D region of interest (ROI) ex vivo or 3D ROI-analysis in vivo. Results Ex vivo, seven probes tested yielded significant higher NIRF signals in colon of DSS treated mice versus controls. A subset of probes was tested in IL-10 null mice and yielded strong ex vivo signals. Ex vivo fluorescence signal with 680 series probes was preserved after formalin fixation. In DSS and IL-10 null models, ex vivo NIRF signal strongly and significantly correlated with colitis scores. In vivo, ProSense680, CatK680FAST and MMPsense680 yielded significantly higher NIRF signals in DSS treated mice than controls but background was high in controls. Conclusion Both cathepsin or MMP-activated NIRF-probes can detect and quantify colonic inflammation ex vivo. ProSense680 yielded the strongest signals in DSS colitis ex vivo and in vivo, but background remains a problem for in vivo quantification of colitis. PMID:24374874
Harrington, S; Reeder, T W
2017-02-01
The binary-state speciation and extinction (BiSSE) model has been used in many instances to identify state-dependent diversification and reconstruct ancestral states. However, recent studies have shown that the standard procedure of comparing the fit of the BiSSE model to constant-rate birth-death models often inappropriately favours the BiSSE model when diversification rates vary in a state-independent fashion. The newly developed HiSSE model enables researchers to identify state-dependent diversification rates while accounting for state-independent diversification at the same time. The HiSSE model also allows researchers to test state-dependent models against appropriate state-independent null models that have the same number of parameters as the state-dependent models being tested. We reanalyse two data sets that originally used BiSSE to reconstruct ancestral states within squamate reptiles and reached surprising conclusions regarding the evolution of toepads within Gekkota and viviparity across Squamata. We used this new method to demonstrate that there are many shifts in diversification rates across squamates. We then fit various HiSSE submodels and null models to the state and phylogenetic data and reconstructed states under these models. We found that there is no single, consistent signal for state-dependent diversification associated with toepads in gekkotans or viviparity across all squamates. Our reconstructions show limited support for the recently proposed hypotheses that toepads evolved multiple times independently in Gekkota and that transitions from viviparity to oviparity are common in Squamata. Our results highlight the importance of considering an adequate pool of models and null models when estimating diversification rate parameters and reconstructing ancestral states. © 2016 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2016 European Society For Evolutionary Biology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, Chi-Jung; Department of Medical Research, China Medical University Hospital, Taichung, Taiwan; Huang, Chao-Yuan
Inter-individual variation in the metabolism of xenobiotics, caused by factors such as cigarette smoking or inorganic arsenic exposure, is hypothesized to be a susceptibility factor for urothelial carcinoma (UC). Therefore, our study aimed to evaluate the role of gene–environment interaction in the carcinogenesis of UC. A hospital-based case–control study was conducted. Urinary arsenic profiles were measured using high-performance liquid chromatography–hydride generator-atomic absorption spectrometry. Genotyping was performed using a polymerase chain reaction-restriction fragment length polymorphism technique. Information about cigarette smoking exposure was acquired from a lifestyle questionnaire. Multivariate logistic regression was applied to estimate the UC risk associated with certain riskmore » factors. We found that UC patients had higher urinary levels of total arsenic, higher percentages of inorganic arsenic (InAs%) and monomethylarsonic acid (MMA%) and lower percentages of dimethylarsinic acid (DMA%) compared to controls. Subjects carrying the GSTM1 null genotype had significantly increased UC risk. However, no association was observed between gene polymorphisms of CYP1A1, EPHX1, SULT1A1 and GSTT1 and UC risk after adjustment for age and sex. Significant gene–environment interactions among urinary arsenic profile, cigarette smoking, and GSTM1 wild/null polymorphism and UC risk were observed after adjustment for potential risk factors. Overall, gene–environment interactions simultaneously played an important role in UC carcinogenesis. In the future, large-scale studies should be conducted using tag-SNPs of xenobiotic-metabolism-related enzymes for gene determination. -- Highlights: ► Subjects with GSTM1 null genotype had significantly increased UC risk. ► UC patients had poor arsenic metabolic ability compared to controls. ► GSTM1 null genotype may modify arsenic related UC risk.« less
Accumulation of True Single Strand Breaks and AP sites in Base Excision Repair Deficient Cells
Luke, April M.; Chastain, Paul D.; Pachkowski, Brian F.; Afonin, Valeriy; Takeda, Shunichi; Kaufman, David G.; Swenberg, James A.; Nakamura, Jun
2010-01-01
Single strand breaks (SSBs) are one of the most frequent DNA lesions caused by endogenous and exogenous agents. The most utilized alkaline-based assays for SSB detection frequently give false positive results due to the presence of alkali-labile sites that are converted to SSBs. Methoxyamine, an acidic O-hydroxylamine, has been utilized to measure DNA damage in cells. However, the neutralization of methoxyamine is required prior to usage. Here we developed a convenient, specific SSB assay using alkaline gel electrophoresis (AGE) coupled with a neutral O-hydroxylamine, O-(tetrahydro-2H-pyran-2-yl)hydroxylamine (OTX). OTX stabilizes abasic sites (AP sites) to prevent their alkaline incision while still allowing for strong alkaline DNA denaturation. DNA from DT40 and isogenic polymerase β null cells exposed to methyl methanesulfonate were applied to the OTX-coupled AGE (OTX-AGE) assay. Time-dependent increases in SSBs were detected in each cell line with more extensive SSB formation in the null cells. These findings were supported by an assay that indirectly detects SSBs through measuring NAD(P)H depletion. An ARP-slot blot assay demonstrated a significant time-dependent increase in AP sites in both cell lines by 1 mM MMS compared to control. Furthermore, the Pol β-null cells displayed greater AP site formation than the parental DT40 cells. OTX use represents a facile approach for assessing SSB formation, whose benefits can also be applied to other established SSB assays. PMID:20851134
Gerlai, R; Adams, B; Fitch, T; Chaney, S; Baez, M
2002-08-01
mGluR8 is a G-protein coupled metabotropic glutamate receptor expressed in the mammalian brain. Members of the mGluR family have been shown to be modulators of neural plasticity and learning and memory. Here we analyze the consequences of a null mutation at the mGluR8 gene locus generated using homologous recombination in embryonic stem cells by comparing the learning performance of the mutants with that of wild type controls in the Morris water maze (MWM) and the context and cue dependent fear conditioning (CFC). Our results revealed robust performance deficits associated with the genetic background, the ICR outbred strain, in both mGluR8 null mutant and the wild type control mice. Mice of this strain origin suffered from impaired vision as compared to CD1 or C57BL/6 mice, a significant impediment in MWM, a visuo-spatial learning task. The CFC task, being less dependent on visual cues, allowed us to reveal subtle performance deficits in the mGluR8 mutants: novelty induced hyperactivity and temporally delayed and blunted responding to shocks and temporally delayed responding to contextual stimuli were detected. The role of mGluR8 as a presynaptic autoreceptor and its contribution to cognitive processes are hypothesized and the utility of gene targeting as compared to pharmacological methods is discussed.
Laffitte, Marie-Claude N.; Leprohon, Philippe; Hainse, Maripier; Légaré, Danielle; Masson, Jean-Yves; Ouellette, Marc
2016-01-01
The parasite Leishmania often relies on gene rearrangements to survive stressful environments. However, safeguarding a minimum level of genome integrity is important for cell survival. We hypothesized that maintenance of genomic integrity in Leishmania would imply a leading role of the MRE11 and RAD50 proteins considering their role in DNA repair, chromosomal organization and protection of chromosomes ends in other organisms. Attempts to generate RAD50 null mutants in a wild-type background failed and we provide evidence that this gene is essential. Remarkably, inactivation of RAD50 was possible in a MRE11 null mutant that we had previously generated, providing good evidence that RAD50 may be dispensable in the absence of MRE11. Inactivation of the MRE11 and RAD50 genes led to a decreased frequency of homologous recombination and analysis of the null mutants by whole genome sequencing revealed several chromosomal translocations. Sequencing of the junction between translocated chromosomes highlighted microhomology sequences at the level of breakpoint regions. Sequencing data also showed a decreased coverage at subtelomeric locations in many chromosomes in the MRE11-/-RAD50-/- parasites. This study demonstrates an MRE11-independent microhomology-mediated end-joining mechanism and a prominent role for MRE11 and RAD50 in the maintenance of genomic integrity. Moreover, we suggest the possible involvement of RAD50 in subtelomeric regions stability. PMID:27314941
NASA Astrophysics Data System (ADS)
Hilburn, Monty D.
Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using available literature for lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three day, on-site review were compared with baseline plans collected from each of the five sites to determine if there analyzed, with focus on three critical areas of lean planning: the number and type of manufacturing cells identified, the number, type, and duration of planned lean and continuous kaizen events, and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data of the baseline and on-site reviews was analyzed with descriptive statistics. ANOVAs and paired-t tests at 95% significance level were conducted on the means of data sets to determine if null hypotheses related to cell, kaizen event, and support resources could be rejected. The results of the research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured, on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans. These factors are critical inputs into the support staffing resources calculation used by the lean planning model. Null hypothesis related to functional support staff resources was rejected for most functional groups, indicating that the baseline site plan inadequately provided for cross-functional staff involvement to support the lean transformation plan. Null hypotheses related to total lean transformation staffing could not be rejected, indicating that while total staffing plans were not significantly different than plans developed during the on-site review and through the use of the lean planning model, the allocation of staffing among various functional groups such as engineering, production, and materials planning was an issue. The on-site review process and simple lean transformation plan developed was determined to be useful in identifying short-comings in lean transformation planning within aerospace manufacturing and assembly sites. It was concluded that the differences uncovered were likely contributing factors affecting the effectiveness of aerospace manufacturing sites' implementation of lean cellular manufacturing.
Using prediction markets to estimate the reproducibility of scientific research.
Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus
2015-12-15
Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
Using prediction markets to estimate the reproducibility of scientific research
Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus
2015-01-01
Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988
Han, Buhm; Kang, Hyun Min; Eskin, Eleazar
2009-01-01
With the development of high-throughput sequencing and genotyping technologies, the number of markers collected in genetic association studies is growing rapidly, increasing the importance of methods for correcting for multiple hypothesis testing. The permutation test is widely considered the gold standard for accurate multiple testing correction, but it is often computationally impractical for these large datasets. Recently, several studies proposed efficient alternative approaches to the permutation test based on the multivariate normal distribution (MVN). However, they cannot accurately correct for multiple testing in genome-wide association studies for two reasons. First, these methods require partitioning of the genome into many disjoint blocks and ignore all correlations between markers from different blocks. Second, the true null distribution of the test statistic often fails to follow the asymptotic distribution at the tails of the distribution. We propose an accurate and efficient method for multiple testing correction in genome-wide association studies—SLIDE. Our method accounts for all correlation within a sliding window and corrects for the departure of the true null distribution of the statistic from the asymptotic distribution. In simulations using the Wellcome Trust Case Control Consortium data, the error rate of SLIDE's corrected p-values is more than 20 times smaller than the error rate of the previous MVN-based methods' corrected p-values, while SLIDE is orders of magnitude faster than the permutation test and other competing methods. We also extend the MVN framework to the problem of estimating the statistical power of an association study with correlated markers and propose an efficient and accurate power estimation method SLIP. SLIP and SLIDE are available at http://slide.cs.ucla.edu. PMID:19381255
2014-01-01
Background Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. Methods Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity and comprehensiveness, a simple five-step procedure was developed. Results For a more valid assessment of results from a randomised clinical trial we propose the following five-steps: (1) report the confidence intervals and the exact P-values; (2) report Bayes factor for the primary outcome, being the ratio of the probability that a given trial result is compatible with a ‘null’ effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance threshold if the trial is stopped early or if interim analyses have been conducted; (4) adjust the confidence intervals and the P-values for multiplicity due to number of outcome comparisons; and (5) assess clinical significance of the trial results. Conclusions If the proposed five-step procedure is followed, this may increase the validity of assessments of intervention effects in randomised clinical trials. PMID:24588900
Temporal variation in survival and recovery rates of lesser scaup: A response
Arnold, Todd W.; Afton, Alan D.; Anteau, Michael J.; Koons, David N.; Nicolai, Chris A.
2017-01-01
We recently analyzed long-term (1951–2011) continental band-recovery data from lesser scaup (Aythya affinis) and demonstrated that harvest rates declined through time, but annual survival rates exhibited no such trends; moreover, annual harvest and survival rates were uncorrelated for all age-sex classes. We therefore concluded that declining fecundity was most likely responsible for recent population declines, rather than changes in harvest or survival. Lindberg et al. (2017) critiqued our conclusions, arguing that we did little more than fail to reject a null hypothesis of compensatory mortality, postulated ecologically unrealistic changes in fecundity, and failed to give sufficient consideration to additive harvest mortality. Herein, we re-summarize our original evidence indicating that harvest has been compensatory, or at most weakly additive, and demonstrate that our analysis had sufficient power to detect strongly additive mortality if it occurred. We further demonstrate that our conclusions were not confounded by population size, band loss, or individual heterogeneity, as suggested by Lindberg et al. (2017), and we provide additional support for our conjecture that low fecundity played a major role in declining scaup populations during 1983–2006. We therefore reiterate our original management recommendations: given low harvest rates and lack of demonstrable effect on scaup survival, harvest regulations could return to more liberal frameworks, and waterfowl biologists should work together to continue banding lesser scaup and use these data to explore alternative hypotheses to identify the true ecological causes of population change, given that it is unlikely to be excessive harvest.
Fordyce, James A
2010-07-23
Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate is the gamma statistic. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages through time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
Does movement influence representations of time and space?
2017-01-01
Embodied cognition posits that abstract conceptual knowledge such as mental representations of time and space are at least partially grounded in sensorimotor experiences. If true, then the execution of whole-body movements should result in modulations of temporal and spatial reference frames. To scrutinize this hypothesis, in two experiments participants either walked forward, backward or stood on a treadmill and responded either to an ambiguous temporal question (Experiment 1) or an ambiguous spatial question (Experiment 2) at the end of the walking manipulation. Results confirmed the ambiguousness of the questions in the control condition. Nevertheless, despite large power, walking forward or backward did not influence the answers or response times to the temporal (Experiment 1) or spatial (Experiment 2) question. A follow-up Experiment 3 indicated that this is also true for walking actively (or passively) in free space (as opposed to a treadmill). We explore possible reasons for the null-finding as concerns the modulation of temporal and spatial reference frames by movements and we critically discuss the methodological and theoretical implications. PMID:28376130
Does movement influence representations of time and space?
Loeffler, Jonna; Raab, Markus; Cañal-Bruland, Rouwen
2017-01-01
Embodied cognition posits that abstract conceptual knowledge such as mental representations of time and space are at least partially grounded in sensorimotor experiences. If true, then the execution of whole-body movements should result in modulations of temporal and spatial reference frames. To scrutinize this hypothesis, in two experiments participants either walked forward, backward or stood on a treadmill and responded either to an ambiguous temporal question (Experiment 1) or an ambiguous spatial question (Experiment 2) at the end of the walking manipulation. Results confirmed the ambiguousness of the questions in the control condition. Nevertheless, despite large power, walking forward or backward did not influence the answers or response times to the temporal (Experiment 1) or spatial (Experiment 2) question. A follow-up Experiment 3 indicated that this is also true for walking actively (or passively) in free space (as opposed to a treadmill). We explore possible reasons for the null-finding as concerns the modulation of temporal and spatial reference frames by movements and we critically discuss the methodological and theoretical implications.
Haider, Husnain Kh; Jiang, Shujia; Idris, Niagara M; Ashraf, Muhammad
2008-11-21
We hypothesized that mesenchymal stem cells (MSCs) overexpressing insulin-like growth factor (IGF)-1 showed improved survival and engraftment in the infarcted heart and promoted stem cell recruitment through paracrine release of stromal cell-derived factor (SDF)-1alpha. Rat bone marrow-derived MSCs were used as nontransduced ((Norm)MSCs) or transduced with adenoviral-null vector ((Null)MSCs) or vector encoding for IGF-1 ((IGF-1)MSCs). (IGF-1)MSCs secreted higher IGF-1 until 12 days of observation (P<0.001 versus (Null)MSCs). Molecular studies revealed activation of phosphoinositide 3-kinase, Akt, and Bcl.xL and inhibition of glycogen synthase kinase 3beta besides release of SDF-1alpha in parallel with IGF-1 expression in (IGF-1)MSCs. For in vivo studies, 70 muL of DMEM without cells (group 1) or containing 1.5x10(6) (Null)MSCs (group 2) or (IGF-1)MSCs (group 3) were implanted intramyocardially in a female rat model of permanent coronary artery occlusion. One week later, immunoblot on rat heart tissue (n=4 per group) showed elevated myocardial IGF-1 and phospho-Akt in group 3 and higher survival of (IGF-1)MSCs (P<0.06 versus (Null)MSCs) (n=6 per group). SDF-1alpha was increased in group 3 animal hearts (20-fold versus group 2), with massive mobilization and homing of ckit(+), MDR1(+), CD31(+), and CD34(+) cells into the infarcted heart. Infarction size was significantly reduced in cell transplanted groups compared with the control. Confocal imaging after immunostaining for myosin heavy chain, actinin, connexin-43, and von Willebrand factor VIII showed extensive angiomyogenesis in the infarcted heart. Indices of left ventricular function, including ejection fraction and fractional shortening, were improved in group 3 as compared with group 1 (P<0.05). In conclusion, the strategy of IGF-1 transgene expression induced massive stem cell mobilization via SDF-1alpha signaling and culminated in extensive angiomyogenesis in the infarcted heart.
Jones, Jennifer C; Kroscher, Kellie A; Dilger, Anna C
2014-03-28
Genes that decline in expression with age and are thought to coordinate growth cessation have been identified in various organs, but their expression in skeletal muscle is unknown. Therefore, our objective was to determine expression of these genes (Ezh2, Gpc3, Mdk, Mest, Mycn, Peg3, and Plagl1) in skeletal muscle from birth to maturity. We hypothesized that expression of these genes would decline with age in skeletal muscle but differ between sexes and between wild type and myostatin null mice. Female and male wild type and myostatin null mice (C57BL/6J background) were sacrificed by carbon dioxide asphyxiation followed by decapitation at d -7, 0, 21, 42, and 70 days of age. Whole bodies at d -7, all muscles from both hind limbs at d 0, and bicep femoris muscle from d 21, 42 and 70 were collected. Gene expression was determined by quantitative real-time PCR. In general, expression of these growth-regulating genes was reduced at d 21 compared with day 0 and d -7. Expression of Gpc3, Mest, and Peg3 was further reduced at d 42 and 70 compared with d 21, however the expression of Mycn increased from d 21 to d 42 and 70. Myostatin null mice, as expected, were heavier with increased biceps femoris weight at d 70. However, with respect to sex and genotype, there were few differences in expression. Expression of Ezh2 was increased at d 70 and expression of Mdk was increased at d 21 in myostatin null mice compared with wild type, but no other genotype effects were present. Expression of Mdk was increased in females compared to males at d 70, but no other sex effects were present. Overall, these data suggest the downregulation of these growth-regulating genes with age might play a role in the coordinated cessation of muscle growth similar to organ growth but likely have a limited role in the differences between sexes or genotypes.
Loss of Desmocollin 3 in Skin Tumor Development and Progression
Chen, Jiangli; O’Shea, Charlene; Fitzpatrick, James E.; Koster, Maranke I.; Koch, Peter J.
2011-01-01
Desmocollin 3 (DSC3) is a desmosomal cadherin that is required for maintaining cell adhesion in the epidermis as demonstrated by the intra-epidermal blistering observed in Dsc3 null skin. Recently, it has been suggested that deregulated expression of DSC3 occurs in certain human tumor types. It is not clear whether DSC3 plays a role in the development or progression of cancers arising in stratified epithelia such as the epidermis. To address this issue, we generated a mouse model in which Dsc3 expression is ablated in K-Ras oncogene-induced skin tumors. Our results demonstrate that loss of Dsc3 leads to an increase in K-Ras induced skin tumors. We hypothesize that acantholysis-induced epidermal hyperplasia in the Dsc3 null epidermis facilitates Ras-induced tumor development. Further, we demonstrate that spontaneous loss of DSC3 expression is a common occurrence during human and mouse skin tumor progression. This loss occurs in tumor cells invading the dermis. Interestingly, other desmosomal proteins are still expressed in tumor cells that lack DSC3, suggesting a specific function of DSC3 loss in tumor progression. While loss of DSC3 on the skin surface leads to epidermal blistering, it does not appear to induce loss of cell-cell adhesion in tumor cells invading the dermis, most likely due to a protection of these cells within the dermis from mechanical stress. We thus hypothesize that DSC3 can contribute to the progression of tumors both by cell adhesion-dependent (skin surface) and likely by cell adhesion-independent (invading tumor cells) mechanisms. PMID:21681825
NASA Astrophysics Data System (ADS)
Landry, M. R.; Taylor, A. G.
2016-02-01
Phytoplankton community structure is shaped both by the bottom-up influences of the physical-chemical environment and by the top-down impacts of food webs. Emergent patterns in the contemporary ocean can thus be "null hypotheses" of future changes assuming that the underlying structuring relationships remain intact but only shift spatially. To provide such a context for the California Current Ecosystem (CCE) and adjacent open-ocean ecosystems, we used a combination of digital epifluorescence microscopy and flow cytometry to investigate variability of phytoplankton biomass, composition and size structure across gradients of ecosystem richness, as represented by total autotrophic carbon (AC). Biomass of large micro-sized (>20 µm) phytoplankton increases as a power function with system richness. Nano-sized cells (2-20 µm) increase at a lower rate at low AC, and level off at high AC. Pico-sized cells (<2-µm) do not clearly dominate at low AC and decline significantly at high AC, neither predicted by competition theory. This study provides several new insights into structural relationships and mechanisms in the CCE: 1) diatoms and dinoflagellates co-dominate the micro-phytoplankton size class throughout the range of system richness; 2) nano-phytoplankton co-dominate biomass in oligotrophic (low AC) waters, suggesting widespread mixotrophy rather than direct competition with pico-phytoplankton for nutrients; and 3) the pico-phytoplankton decline at high AC impacts small eukaryotes as well as photosynthetic bacteria, consistent with a broad stimulation of grazing pressure on all bacterial-sized cells in richer systems. Observed variability in heterotrophic bacteria and nano-flagellate grazers with system richness is consistent with this mechanism.
Wildfire Selectivity for Land Cover Type: Does Size Matter?
Barros, Ana M. G.; Pereira, José M. C.
2014-01-01
Previous research has shown that fires burn certain land cover types disproportionally to their abundance. We used quantile regression to study land cover proneness to fire as a function of fire size, under the hypothesis that they are inversely related, for all land cover types. Using five years of fire perimeters, we estimated conditional quantile functions for lower (avoidance) and upper (preference) quantiles of fire selectivity for five land cover types - annual crops, evergreen oak woodlands, eucalypt forests, pine forests and shrublands. The slope of significant regression quantiles describes the rate of change in fire selectivity (avoidance or preference) as a function of fire size. We used Monte-Carlo methods to randomly permutate fires in order to obtain a distribution of fire selectivity due to chance. This distribution was used to test the null hypotheses that 1) mean fire selectivity does not differ from that obtained by randomly relocating observed fire perimeters; 2) that land cover proneness to fire does not vary with fire size. Our results show that land cover proneness to fire is higher for shrublands and pine forests than for annual crops and evergreen oak woodlands. As fire size increases, selectivity decreases for all land cover types tested. Moreover, the rate of change in selectivity with fire size is higher for preference than for avoidance. Comparison between observed and randomized data led us to reject both null hypotheses tested ( = 0.05) and to conclude it is very unlikely the observed values of fire selectivity and change in selectivity with fire size are due to chance. PMID:24454747
Wason, James M. S.; Mander, Adrian P.
2012-01-01
Two-stage designs are commonly used for Phase II trials. Optimal two-stage designs have the lowest expected sample size for a specific treatment effect, for example, the null value, but can perform poorly if the true treatment effect differs. Here we introduce a design for continuous treatment responses that minimizes the maximum expected sample size across all possible treatment effects. The proposed design performs well for a wider range of treatment effects and so is useful for Phase II trials. We compare the design to a previously used optimal design and show it has superior expected sample size properties. PMID:22651118
A large scale test of the gaming-enhancement hypothesis
Wang, John C.
2016-01-01
A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work. PMID:27896035
Shimoda, Yoshiteru; Ogawa, Yoshikazu; Endo, Hidenori; Watanabe, Mika; Tominaga, Teiji
2012-01-01
Coexistence of brain tumors of different pathologies is rare, and the majority of the cases were related to genetic disorders or secondary tumors occurring after radiotherapy. A 73-year-old man was introduced to the outpatient department suffering from severe nausea and vertigo. Magnetic resonance imaging showed a cystic tumor in the left cerebellar hemisphere and another lesion in the sella turcica. There was no evident family history of von Hippel-Lindau (VHL) disease, and the systemic investigation failed to detect any other tumors or signs of VHL disease. Treatment was performed in two stages, and he was discharged with remaining slight ataxic gait. The diagnoses were cerebellar hemangioblastoma and pituitary null cell adenoma. Additional immunohistochemical investigation using VHL disease gene-related protein in both tumors showed minute granular positive staining in the cytoplasm of stromal cells in the former, and diffuse and strong granular cytoplasmic positive staining in the latter. Further analysis is required to confirm the true implication of the VHL gene mutation, and the possible involvement of VHL gene-related protein in the pathogenesis of these coexisting tumors.
A test for patterns of modularity in sequences of developmental events.
Poe, Steven
2004-08-01
This study presents a statistical test for modularity in the context of relative timing of developmental events. The test assesses whether sets of developmental events show special phylogenetic conservation of rank order. The test statistic is the correlation coefficient of developmental ranks of the N events of the hypothesized module across taxa. The null distribution is obtained by taking correlation coefficients for randomly sampled sets of N events. This test was applied to two datasets, including one where phylogenetic information was taken into account. The events of limb development in two frog species were found to behave as a module.
NASA Astrophysics Data System (ADS)
Mulkerrin, Elizabeth A.
The purpose of this study was to determine the effect of an 11th-grade and 12th-grade zoo-based academic high school experiential science program compared to a same school-district school-based academic high school experiential science program on students' pretest and posttest science, math, and reading achievement, and student perceptions of program relevance, rigor, and relationships. Science coursework delivery site served as the study's independent variable for the two naturally formed groups representing students (n = 18) who completed a zoo-based experiential academic high school science program and students (n = 18) who completed a school-based experiential academic high school science program. Students in the first group, a zoo-based experiential academic high school science program, completed real world, hands-on projects at the zoo while students in the second group, those students who completed a school-based experiential academic high school science program, completed real world, simulated projects in the classroom. These groups comprised the two research arms of the study. Both groups of students were selected from the same school district. The study's two dependent variables were achievement and school climate. Achievement was analyzed using norm-referenced 11th-grade pretest PLAN and 12th-grade posttest ACT test composite scores. Null hypotheses were rejected in the direction of improved test scores for both science program groups---students who completed the zoo-based experiential academic high school science program (p < .001) and students who completed the school-based experiential academic high school science program (p < .001). The posttest-posttest ACT test composite score comparison was not statistically different ( p = .93) indicating program equipoise for students enrolled in both science programs. No overall weighted grade point average score improvement was observed for students in either science group, however, null hypotheses were rejected in the direction of improved science grade point average scores for 11th-grade (p < .01) and 12th-grade (p = .01) students who completed the zoo-based experiential academic high school science program. Null hypotheses were not rejected for between group posttest science grade point average scores and school district criterion reference math and reading test scores. Finally, students who completed the zoo-based experiential academic high school science program had statistically improved pretest-posttest perceptions of program relationship scores (p < .05) and compared to students who completed the school-based experiential academic high school science program had statistically greater posttest perceptions of program relevance (p < .001), perceptions of program rigor (p < .001), and perceptions of program relationships (p < .001).
Toward, Marie A.; Abdala, Ana P.; Knopp, Sharon J.; Paton, Julian F. R.; Bissonnette, John M.
2013-01-01
Mice deficient in the transcription factor methyl-CpG-binding protein 2 (Mecp2), a mouse model of Rett syndrome, display reduced CO2 chemosensitivity, which may contribute to their breathing abnormalities. In addition, patients with Rett syndrome and male mice that are null for Mecp2 show reduced levels of brain serotonin (5-HT). Serotonin is known to play a role in central chemosensitivity, and we hypothesized that increasing the availability of 5-HT in this mouse model would improve their respiratory response to CO2. Here we determined the apnoeic threshold in heterozygous Mecp2-deficient female mice and examined the effects of blocking 5-HT reuptake on the CO2 response in Mecp2-null male mice. Studies were performed in B6.129P2(C)-Mecp2τm1.1Bird null males and heterozygous females. In an in situ preparation, seven of eight Mecp2-deficient heterozygous females showed arrest of phrenic nerve activity when arterial CO2 was lowered to 3%, whereas the wild-types maintained phrenic nerve amplitude at 53 ± 3% of maximal. In vivo plethysmography studies were used to determine CO2 chemosensitivity in null males. These mice were exposed sequentially to 1, 3 and 5% CO2. The percentage increase in minute ventilation in response to increased inspired CO2 was less in Mecp2−/y than in Mecp2+/y mice. Pretreatment with citalopram, a selective 5-HT reuptake inhibitor (2.5 mg kg−1 I.P.), 40 min prior to CO2 exposure, in Mecp2−/y mice resulted in an improvement in CO2 chemosensitivity to wild-type levels. These results suggest that decreased 5-HT in Mecp2-deficient mice reduces CO2 chemosensitivity, and restoring 5-HT levels can reverse this effect. PMID:23180809
Eid, Ashraf A.; Komabayashi, Takashi; Watanabe, Etsuko; Shiraishi, Takanobu; Watanabe, Ikuya
2012-01-01
Introduction Mineral trioxide aggregate (MTA) has been used successfully for perforation repair, vital pulpotomies, and direct pulp capping. However, little is known about the interactions between MTA and glass ionomer cement (GIC) in final restorations. In this study, 2 null hypotheses were tested: (1) GIC placement time does not affect the MTA-GIC structural interface and hardness and (2) moisture does not affect the MTA-GIC structural interface and hardness. Methods Fifty cylinders were half filled with MTA and divided into 5 groups. The other half was filled with resin-modified GIC either immediately after MTA placement or after 1 or 7 days of temporization in the presence or absence of a wet cotton pellet. The specimens were then sectioned, carbon coated, and examined using a scanning electron microscope and an electron probe micro-analyzer (SEM-EPMA) for interfacial adaptation, gap formation, and elemental analysis. The Vickers hardness numbers of the interfacial MTA were recorded 24 hours after GIC placement and 8 days after MTA placement and analyzed using the analysis of variance test. Results Hardness testing 24 hours after GIC placement revealed a significant increase in hardness with an increase of temporization time but not with a change of moisture conditions (P < .05). Hardness testing 8 days after MTA placement indicated no significant differences among groups. SEM-EPMA showed interfacial adaptation to improve with temporization time and moisture. Observed changes were limited to the outermost layer of MTA. The 2 null hypotheses were not rejected. Conclusions GIC can be applied over freshly mixed MTA with minimal effects on the MTA, which seemed to decrease with time. PMID:22794220
Tsubakita, Takashi; Shimazaki, Kazuyo
2016-01-01
To examine the factorial validity of the Maslach Burnout Inventory-Student Survey, using a sample of 2061 Japanese university students majoring in the medical and natural sciences (67.9% male, 31.8% female; Mage = 19.6 years, standard deviation = 1.5). The back-translated scale used unreversed items to assess inefficacy. The inventory's descriptive properties and Cronbach's alphas were calculated using SPSS software. The present authors compared fit indices of the null, one factor, and default three factor models via confirmatory factor analysis with maximum-likelihood estimation using AMOS software, version 21.0. Intercorrelations between exhaustion, cynicism, and inefficacy were relatively higher than in prior studies. Cronbach's alphas were 0.76, 0.85, and 0.78, respectively. Although fit indices of the hypothesized three factor model did not meet the respective criteria, the model demonstrated better fit than did the null and one factor models. The present authors added four paths between error variables within items, but the modified model did not show satisfactory fit. Subsequent analysis revealed that a bi-factor model fit the data better than did the hypothesized or modified three factor models. The Japanese version of the Maslach Burnout Inventory-Student Survey needs minor changes to improve the fit of its three factor model, but the scale as a whole can be used to adequately assess overall academic burnout in Japanese university students. Although the scale was back-translated, two items measuring exhaustion whose expressions overlapped should be modified, and all items measuring inefficacy should be reversed in order to statistically clarify the factorial difference between the scale's three factors. © 2015 The Authors. Japan Journal of Nursing Science © 2015 Japan Academy of Nursing Science.
Phase II design with sequential testing of hypotheses within each stage.
Poulopoulou, Stavroula; Karlis, Dimitris; Yiannoutsos, Constantin T; Dafni, Urania
2014-01-01
The main goal of a Phase II clinical trial is to decide, whether a particular therapeutic regimen is effective enough to warrant further study. The hypothesis tested by Fleming's Phase II design (Fleming, 1982) is [Formula: see text] versus [Formula: see text], with level [Formula: see text] and with a power [Formula: see text] at [Formula: see text], where [Formula: see text] is chosen to represent the response probability achievable with standard treatment and [Formula: see text] is chosen such that the difference [Formula: see text] represents a targeted improvement with the new treatment. This hypothesis creates a misinterpretation mainly among clinicians that rejection of the null hypothesis is tantamount to accepting the alternative, and vice versa. As mentioned by Storer (1992), this introduces ambiguity in the evaluation of type I and II errors and the choice of the appropriate decision at the end of the study. Instead of testing this hypothesis, an alternative class of designs is proposed in which two hypotheses are tested sequentially. The hypothesis [Formula: see text] versus [Formula: see text] is tested first. If this null hypothesis is rejected, the hypothesis [Formula: see text] versus [Formula: see text] is tested next, in order to examine whether the therapy is effective enough to consider further testing in a Phase III study. For the derivation of the proposed design the exact binomial distribution is used to calculate the decision cut-points. The optimal design parameters are chosen, so as to minimize the average sample number (ASN) under specific upper bounds for error levels. The optimal values for the design were found using a simulated annealing method.
Caffrey, James R; Hughes, Barry D; Britto, Joanne M; Landman, Kerry A
2014-01-01
The characteristic six-layered appearance of the neocortex arises from the correct positioning of pyramidal neurons during development and alterations in this process can cause intellectual disabilities and developmental delay. Malformations in cortical development arise when neurons either fail to migrate properly from the germinal zones or fail to cease migration in the correct laminar position within the cortical plate. The Reelin signalling pathway is vital for correct neuronal positioning as loss of Reelin leads to a partially inverted cortex. The precise biological function of Reelin remains controversial and debate surrounds its role as a chemoattractant or stop signal for migrating neurons. To investigate this further we developed an in silico agent-based model of cortical layer formation. Using this model we tested four biologically plausible hypotheses for neuron motility and four biologically plausible hypotheses for the loss of neuron motility (conversion from migration). A matrix of 16 combinations of motility and conversion rules was applied against the known structure of mouse cortical layers in the wild-type cortex, the Reelin-null mutant, the Dab1-null mutant and a conditional Dab1 mutant. Using this approach, many combinations of motility and conversion mechanisms can be rejected. For example, the model does not support Reelin acting as a repelling or as a stopping signal. In contrast, the study lends very strong support to the notion that the glycoprotein Reelin acts as a chemoattractant for neurons. Furthermore, the most viable proposition for the conversion mechanism is one in which conversion is affected by a motile neuron sensing in the near vicinity neurons that have already converted. Therefore, this model helps elucidate the function of Reelin during neuronal migration and cortical development.
Global Well-Posedness of the Incompressible Magnetohydrodynamics
NASA Astrophysics Data System (ADS)
Cai, Yuan; Lei, Zhen
2018-06-01
This paper studies the Cauchy problem of the incompressible magnetohydro dynamic systems with or without viscosity ν. Under the assumption that the initial velocity field and the displacement of the initialmagnetic field froma non-zero constant are sufficiently small in certain weighted Sobolev spaces, the Cauchy problem is shown to be globally well-posed for all ν ≧ 0 and all spaces with dimension n ≧ 2. Such a result holds true uniformly in nonnegative viscosity parameters. The proof is based on the inherent strong null structure of the systems introduced by Lei (Commun Pure Appl Math 69(11):2072-2106, 2016) and the ghost weight technique introduced by Alinhac (Invent Math 145(3):597-618, 2001).
The acute monocytic leukemias: multidisciplinary studies in 45 patients.
Straus, D J; Mertelsmann, R; Koziner, B; McKenzie, S; de Harven, E; Arlin, Z A; Kempin, S; Broxmeyer, H; Moore, M A; Menendez-Botet, C J; Gee, T S; Clarkson, B D
1980-11-01
The clinical and laboratory features of 37 patients with variants of acute monocytic leukemia are described. Three of these 37 patients who had extensive extramedullary leukemic tissue infiltration are examples of true histiocytic "lymphomas." Three additional patients with undifferentiated leukemias, one patient with refractory anemia with excess of blasts, one patient with chronic myelomonocytic leukemia, one patient with B-lymphocyte diffuse "histiocytic" lymphoma and one patient with "null" cell, terminal deoxynucleotidyl transferase-positive lymphoblastic lymphoma had bone marrow cells with monocytic features. Another patient had dual populations of lymphoid and monocytoid leukemic cells. The true monocytic leukemias, acute monocytic leukemia (AMOL) and acute myelomonocytic leukemia (AMMOL), are closely related to acute myelocytic leukemia (AML) morphologically and by their response to chemotherapy. like AML, the leukemic cells from the AMMOL and AMOL patients form leukemic clusters in semisolid media. Cytochemical staining of leukemic cells for nonspecific esterases, presence of Fc receptor on the cell surface, phagocytic ability, low TdT activity, presence of surface "ruffles" and "ridges" on scanning EM, elevations of serum lysozyme, and clinical manifestations of leukemic tissue infiltration are features which accompanied monocytic differentiation in these cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hearin, Andrew P.; Zentner, Andrew R., E-mail: aph15@pitt.edu, E-mail: zentner@pitt.edu
Forthcoming projects such as the Dark Energy Survey, Joint Dark Energy Mission, and the Large Synoptic Survey Telescope, aim to measure weak lensing shear correlations with unprecedented accuracy. Weak lensing observables are sensitive to both the distance-redshift relation and the growth of structure in the Universe. If the cause of accelerated cosmic expansion is dark energy within general relativity, both cosmic distances and structure growth are governed by the properties of dark energy. Consequently, one may use lensing to check for this consistency and test general relativity. After reviewing the phenomenology of such tests, we address a major challenge tomore » such a program. The evolution of the baryonic component of the Universe is highly uncertain and can influence lensing observables, manifesting as modified structure growth for a fixed cosmic distance scale. Using two proposed methods, we show that one could be led to reject the null hypothesis of general relativity when it is the true theory if this uncertainty in baryonic processes is neglected. Recent simulations suggest that we can correct for baryonic effects using a parameterized model in which the halo mass-concentration relation is modified. The correction suffices to render biases small compared to statistical uncertainties. We study the ability of future weak lensing surveys to constrain the internal structures of halos and test the null hypothesis of general relativity simultaneously. Compared to alternative methods which null information from small-scales to mitigate sensitivity to baryonic physics, this internal calibration program should provide limits on deviations from general relativity that are several times more constraining. Specifically, we find that limits on general relativity in the case of internal calibration are degraded by only {approx} 30% or less compared to the case of perfect knowledge of nonlinear structure.« less
Ling, Zhi-Qiang; Wang, Yi; Mukaisho, Kenichi; Hattori, Takanori; Tatsuta, Takeshi; Ge, Ming-Hua; Jin, Li; Mao, Wei-Min; Sugihara, Hiroyuki
2010-06-01
Tests of differentially expressed genes (DEGs) from microarray experiments are based on the null hypothesis that genes that are irrelevant to the phenotype/stimulus are expressed equally in the target and control samples. However, this strict hypothesis is not always true, as there can be several transcriptomic background differences between target and control samples, including different cell/tissue types, different cell cycle stages and different biological donors. These differences lead to increased false positives, which have little biological/medical significance. In this article, we propose a statistical framework to identify DEGs between target and control samples from expression microarray data allowing transcriptomic background differences between these samples by introducing a modified null hypothesis that the gene expression background difference is normally distributed. We use an iterative procedure to perform robust estimation of the null hypothesis and identify DEGs as outliers. We evaluated our method using our own triplicate microarray experiment, followed by validations with reverse transcription-polymerase chain reaction (RT-PCR) and on the MicroArray Quality Control dataset. The evaluations suggest that our technique (i) results in less false positive and false negative results, as measured by the degree of agreement with RT-PCR of the same samples, (ii) can be applied to different microarray platforms and results in better reproducibility as measured by the degree of DEG identification concordance both intra- and inter-platforms and (iii) can be applied efficiently with only a few microarray replicates. Based on these evaluations, we propose that this method not only identifies more reliable and biologically/medically significant DEG, but also reduces the power-cost tradeoff problem in the microarray field. Source code and binaries freely available for download at http://comonca.org.cn/fdca/resources/softwares/deg.zip.
Gauran, Iris Ivy M; Park, Junyong; Lim, Johan; Park, DoHwan; Zylstra, John; Peterson, Thomas; Kann, Maricel; Spouge, John L
2017-09-22
In recent mutation studies, analyses based on protein domain positions are gaining popularity over gene-centric approaches since the latter have limitations in considering the functional context that the position of the mutation provides. This presents a large-scale simultaneous inference problem, with hundreds of hypothesis tests to consider at the same time. This article aims to select significant mutation counts while controlling a given level of Type I error via False Discovery Rate (FDR) procedures. One main assumption is that the mutation counts follow a zero-inflated model in order to account for the true zeros in the count model and the excess zeros. The class of models considered is the Zero-inflated Generalized Poisson (ZIGP) distribution. Furthermore, we assumed that there exists a cut-off value such that smaller counts than this value are generated from the null distribution. We present several data-dependent methods to determine the cut-off value. We also consider a two-stage procedure based on screening process so that the number of mutations exceeding a certain value should be considered as significant mutations. Simulated and protein domain data sets are used to illustrate this procedure in estimation of the empirical null using a mixture of discrete distributions. Overall, while maintaining control of the FDR, the proposed two-stage testing procedure has superior empirical power. 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
Lopez, G H; Morrison, J; Condon, J A; Wilson, B; Martin, J R; Liew, Y-W; Flower, R L; Hyland, C A
2015-10-01
Duffy blood group phenotypes can be predicted by genotyping for single nucleotide polymorphisms (SNPs) responsible for the Fy(a) /Fy(b) polymorphism, for weak Fy(b) antigen, and for the red cell null Fy(a-b-) phenotype. This study correlates Duffy phenotype predictions with serotyping to assess the most reliable procedure for typing. Samples, n = 155 (135 donors and 20 patients), were genotyped by high-resolution melt PCR and by microarray. Samples were in three serology groups: 1) Duffy patterns expected n = 79, 2) weak and equivocal Fy(b) patterns n = 29 and 3) Fy(a-b-) n = 47 (one with anti-Fy3 antibody). Discrepancies were observed for five samples. For two, SNP genotyping predicted weak Fy(b) expression discrepant with Fy(b-) (Group 1 and 3). For three, SNP genotyping predicted Fy(a) , discrepant with Fy(a-b-) (Group 3). DNA sequencing identified silencing mutations in these FY*A alleles. One was a novel FY*A 719delG. One, the sample with the anti-Fy3, was homozygous for a 14-bp deletion (FY*01N.02); a true null. Both the high-resolution melting analysis and SNP microarray assays were concordant and showed genotyping, as well as phenotyping, is essential to ensure 100% accuracy for Duffy blood group assignments. Sequencing is important to resolve phenotype/genotype conflicts which here identified alleles, one novel, that carry silencing mutations. The risk of alloimmunisation may be dependent on this zygosity status. © 2015 International Society of Blood Transfusion.
The Novelty Exploration Bonus and Its Attentional Modulation
ERIC Educational Resources Information Center
Krebs, Ruth M.; Schott, Bjorn H.; Schutze, Hartmut; Duzel, Emrah
2009-01-01
We hypothesized that novel stimuli represent salient learning signals that can motivate "exploration" in search for potential rewards. In computational theories of reinforcement learning, this is referred to as the novelty "exploration bonus" for rewards. If true, stimulus novelty should enhance the reward anticipation signals in brain areas that…
Adaptation in pronoun resolution: Evidence from Brazilian and European Portuguese.
Fernandes, Eunice G; Luegi, Paula; Correa Soares, Eduardo; de la Fuente, Israel; Hemforth, Barbara
2018-04-26
Previous research accounting for pronoun resolution as a problem of probabilistic inference has not explored the phenomenon of adaptation, whereby the processor constantly tracks and adapts, rationally, to changes in a statistical environment. We investigate whether Brazilian (BP) and European Portuguese (EP) speakers adapt to variations in the probability of occurrence of ambiguous overt and null pronouns, in two experiments assessing resolution toward subject and object referents. For each variety (BP, EP), participants were faced with either the same number of null and overt pronouns (equal distribution), or with an environment with fewer overt (than null) pronouns (unequal distribution). We find that the preference for interpreting overt pronouns as referring back to an object referent (object-biased interpretation) is higher when there are fewer overt pronouns (i.e., in the unequal, relative to the equal distribution condition). This is especially the case for BP, a variety with higher prior frequency and smaller object-biased interpretation of overt pronouns, suggesting that participants adapted incrementally and integrated prior statistical knowledge with the knowledge obtained in the experiment. We hypothesize that comprehenders adapted rationally, with the goal of maintaining, across variations in pronoun probability, the likelihood of subject and object referents. Our findings unify insights from research in pronoun resolution and in adaptation, and add to previous studies in both topics: They provide evidence for the influence of pronoun probability in pronoun resolution, and for an adaptation process whereby the language processor not only tracks statistical information, but uses it to make interpretational inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Maranon, Rodrigo; Lima, Roberta; Spradley, Frank T; do Carmo, Jussara M; Zhang, Howei; Smith, Andrew D; Bui, Elizabeth; Thomas, R Lucas; Moulana, Mohadetheh; Hall, John E; Granger, Joey P; Reckelhoff, Jane F
2015-04-15
Women with polycystic ovary syndrome (PCOS) have hyperandrogenemia and increased prevalence of risk factors for cardiovascular disease, including elevated blood pressure. We recently characterized a hyperandrogenemic female rat (HAF) model of PCOS [chronic dihydrotestosterone (DHT) beginning at 4 wk of age] that exhibits similar characteristics as women with PCOS. In the present studies we tested the hypotheses that the elevated blood pressure in HAF rats is mediated in part by sympathetic activation, renal nerves, and melanocortin-4 receptor (MC4R) activation. Adrenergic blockade with terazosin and propranolol or renal denervation reduced mean arterial pressure (MAP by telemetry) in HAF rats but not controls. Hypothalamic MC4R expression was higher in HAF rats than controls, and central nervous system MC4R antagonism with SHU-9119 (1 nmol/h icv) reduced MAP in HAF rats. Taking a genetic approach, MC4R null and wild-type (WT) female rats were treated with DHT or placebo from 5 to 16 wk of age. MC4R null rats were obese and had higher MAP than WT control rats, and while DHT increased MAP in WT controls, DHT failed to further increase MAP in MC4R null rats. These data suggest that increases in MAP with chronic hyperandrogenemia in female rats are due, in part, to activation of the sympathetic nervous system, renal nerves, and MC4R and may provide novel insights into the mechanisms responsible for hypertension in women with hyperandrogenemia such as PCOS. Copyright © 2015 the American Physiological Society.
Stanton, M. Mark; Nelson, Lisa K.; Benediktsson, Hallgrimur; Hollenberg, Morley D.; Buret, Andre G.; Ceri, Howard
2013-01-01
Background. Nonbacterial prostatitis has no established etiology. We hypothesized that proteinase-activated receptor-1 (PAR1) can play a role in prostatitis. We therefore investigated the effects of PAR1 stimulation in the context of a new model of murine nonbacterial prostatitis. Methods. Using a hapten (ethanol-dinitrobenzene sulfonic acid- (DNBS-)) induced prostatitis model with both wild-type and PAR1-null mice, we examined (1) the location of PAR1 in the mouse prostate and (2) the impact of a PAR1-activating peptide (TFLLR-NH2: PAR1-TF) on ethanol-DNBS-induced inflammation. Results. Ethanol-DNBS-induced inflammation was maximal at 2 days. In the tissue, PAR1 was expressed predominantly along the apical acini of prostatic epithelium. Although PAR1-TF on its own did not cause inflammation, its coadministration with ethanol-DNBS reduced all indices of acute prostatitis. Further, PAR1-TF administration doubled the prostatic production of interleukin-10 (IL-10) compared with ethanol-DNBS treatment alone. This enhanced IL-10 was not observed in PAR1-null mice and was not caused by the reverse-sequence receptor-inactive peptide, RLLFT-NH2. Surprisingly, PAR1-TF, also diminished ethanol-DNBS-induced inflammation in PAR1-null mice. Conclusions. PAR1 is expressed in the mouse prostate and its activation by PAR1-TF elicits immunomodulatory effects during ethanol-DNBS-induced prostatitis. However, PAR1-TF also diminishes ethanol-DNBS-induced inflammation via a non-PAR1 mechanism by activating an as-yet unknown receptor. PMID:24459330
Sim1 Neurons Are Sufficient for MC4R-Mediated Sexual Function in Male Mice.
Semple, Erin; Hill, Jennifer W
2018-01-01
Sexual dysfunction is a poorly understood condition that affects up to one-third of men around the world. Existing treatments that target the periphery do not work for all men. Previous studies have shown that central melanocortins, which are released by pro-opiomelanocortin neurons in the arcuate nucleus of the hypothalamus, can lead to male erection and increased libido. Several studies specifically implicate the melanocortin 4 receptor (MC4R) in the central control of sexual function, but the specific neural circuitry involved is unknown. We hypothesized that single-minded homolog 1 (Sim1) neurons play an important role in the melanocortin-mediated regulation of male sexual behavior. To test this hypothesis, we examined the sexual behavior of mice expressing MC4R only on Sim1-positive neurons (tbMC4Rsim1 mice) in comparison with tbMC4R null mice and wild-type controls. In tbMC4Rsim1 mice, MC4R reexpression was found in the medial amygdala and paraventricular nucleus of the hypothalamus. These mice were paired with sexually experienced females, and their sexual function and behavior was scored based on mounting, intromission, and ejaculation. tbMC4R null mice showed a longer latency to mount, a reduced intromission efficiency, and an inability to reach ejaculation. Expression of MC4R only on Sim1 neurons reversed the sexual deficits seen in tbMC4R null mice. This study implicates melanocortin signaling via the MC4R on Sim1 neurons in the central control of male sexual behavior. Copyright © 2018 Endocrine Society.
The dynamic interplay between perceived true self-knowledge and decision satisfaction.
Schlegel, Rebecca J; Hicks, Joshua A; Davis, William E; Hirsch, Kelly A; Smith, Christina M
2013-03-01
The present research used multiple methods to examine the hypothesis that perceived true self-knowledge and decision satisfaction are inextricably linked together by a widely held "true-self-as-guide" lay theory of decision making. Consistent with this proposition, Study 1 found that participants rated using the true self as a guide as more important for achieving personal satisfaction than a variety of other potential decision-making strategies. After establishing the prevalence of this lay theory, the remaining studies then focused on examining the proposed consequent relationship between perceived true self-knowledge and decision satisfaction. Consistent with hypotheses, 2 cross-sectional correlational studies (Studies 2 and 3) found a positive relationship between perceived true self-knowledge and decision satisfaction for different types of major decisions. Study 4 used daily diary methods to demonstrate that fluctuations in perceived true self-knowledge reliably covary with fluctuations in decision satisfaction. Finally, 2 studies directly examined the causal direction of this relationship through experimental manipulation and revealed that the relationship is truly bidirectional. More specifically, Study 5 showed that manipulating perceived knowledge of the true self (but not other self-concepts) directly affects decision satisfaction. Study 6 showed that this effect also works in reverse by manipulating feelings of decision satisfaction, which directly affected perceived knowledge of the true self (but not other self-concepts). Taken together, these studies suggest that people believe the true self should be used as a guide when making major life decisions and that this belief has observable consequences for the self and decision making. PsycINFO Database Record (c) 2013 APA, all rights reserved
2014-01-01
Background CLIC4, a member of the CLIC family of proteins, was recently demonstrated to translocate to the nucleus in differentiating keratinocytes where it potentiates TGFβ-driven gene regulation. Since TGFβ signaling is known to play important roles in the fibrotic response to acute kidney injury, and since CLIC4 is abundantly expressed in kidney, we hypothesized that CLIC4 may play a role in the response to acute kidney injury. Methods Previously described Clic4 null mice were analyzed for the effect of absence of CLIC4 on growth, development and response to kidney injury. Kidney size, glomerular counts and density of peritubular capillaries of matched WT and Clic4 null mice were determined. Cohorts of WT and Clic4 null mice were subjected to the folic acid model of acute kidney injury. Extent of acute injury and long term functional recovery were assessed by plasma blood urea nitrogen (BUN); long term fibrosis/scarring was determined by histochemical assessment of kidney sections and by residual renal mass. Activation of the TGFβ signaling pathway was assessed by semi-quantitative western blots of phosphorylated SMADs 2 and 3. Results CLIC4 is abundantly expressed in the apical pole of renal proximal tubule cells, and in endothelial cells of glomerular and peritubular capillaries. CLIC4 null mice are small, have smaller kidneys with fewer glomeruli and less dense peritubular capillary networks, and have increased proteinuria. The Clic4 null mice show increased susceptibility to folic acid-induced acute kidney injury but no difference in recovery from acute injury, no nuclear redistribution of CLIC4 following injury, and no significant difference in activation of the TGFβ-signaling pathway as reflected in the level of phosphorylation of SMADs 2 and 3. Conclusions Absence of CLIC4 results in morphologic changes consistent with its known role in angiogenesis. These changes may be at least partially responsible for the increased susceptibility to acute kidney injury. However, the absence of CLIC4 has no significant impact on the extent of functional recovery or fibrosis following acute injury, indicating that CLIC4 does not play a major non-redundant role in the TGFβ signaling involved in response to acute kidney injury. PMID:24708746
Wicherts, Jelte M.; Bakker, Marjan; Molenaar, Dylan
2011-01-01
Background The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically. Methods and Findings We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance. Conclusions Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies. PMID:22073203
Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun
2015-10-02
Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-contol experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performances of quantitative analysis. To show a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures false altered proteins discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
Tucker, Kristal R.; Godbey, Steven J.; Thiebaud, Nicolas; Fadool, Debra Ann
2012-01-01
Physiological and nutritional state can modify sensory ability and perception through hormone signaling. Obesity and related metabolic disorders present a chronic imbalance in hormonal signaling that could impact sensory systems. In the olfactory system, external chemical cues are transduced into electrical signals to encode information. It is becoming evident that this system can also detect internal chemical cues in the form of molecules of energy homeostasis and endocrine hormones, whereby neurons of the olfactory system are modulated to change animal behavior towards olfactory cues. We hypothesized that chronic imbalance in hormonal signaling and energy homeostasis due to obesity would thereby disrupt olfactory behaviors in mice. To test this idea, we utilized three mouse models of varying body weight, metabolic hormones, and visceral adiposity – 1) C57BL6/J mice maintained on a condensed-milk based, moderately high-fat diet (MHF) of 32% fat for 6 months as the diet-induced obesity model, 2) an obesity-resistant, lean line of mice due to a gene-targeted deletion of a voltage-dependent potassium channel (Kv1.3-null), and 3) a genetic model of obesity as a result of a gene-targeted deletion of the melanocortin 4 receptor (MC4R-null). Diet-induced obese (DIO) mice failed to find fatty-scented hidden peanut butter cracker, based solely on olfactory cues, any faster than an unscented hidden marble, initially suggesting general anosmia. However, when these DIO mice were challenged to find a sweet-scented hidden chocolate candy, they had no difficulty. Furthermore, DIO mice were able to discriminate between fatty acids that differ by a single double bond and are components of the MHF diet (linoleic and oleic acid) in a habituation-dishabituation paradigm. Obesity-resistant, Kv1.3-null mice exhibited no change in scented object retrieval when placed on the MHF-diet, nor did they perform differently than wild-type mice in parallel habituation-dishabituation paradigms of fatty food-related odor components. Genetically obese, MC4R-null mice successfully found hidden scented objects, but did so more slowly than lean, wild-type mice, in an object-dependent fashion. In habituation-dishabituation trials of general odorants, MC4R-null mice failed to discriminate a novel odor, but were able to distinguish two fatty acids. Object memory recognition tests for short- and long-term memory retention demonstrated that maintenance on the MHF diet did not modify ability to perform these tasks independent of whether mice became obese or were resistant to weight gain (Kv1.3-null), however, the genetically predisposed obese mice (MC4R-null) failed the long-term object memory recognition performed at 24 hours. These results demonstrate that even though both the DIO mice and genetically predisposed obese mice are obese, they vary in the degree to which they exhibit behavioral deficits in odor detection, odor discrimination, and long-term memory. PMID:22995978
A mixture gatekeeping procedure based on the Hommel test for clinical trial applications.
Brechenmacher, Thomas; Xu, Jane; Dmitrienko, Alex; Tamhane, Ajit C
2011-07-01
When conducting clinical trials with hierarchically ordered objectives, it is essential to use multiplicity adjustment methods that control the familywise error rate in the strong sense while taking into account the logical relations among the null hypotheses. This paper proposes a gatekeeping procedure based on the Hommel (1988) test, which offers power advantages compared to other p value-based tests proposed in the literature. A general description of the procedure is given and details are presented on how it can be applied to complex clinical trial designs. Two clinical trial examples are given to illustrate the methodology developed in the paper.
Biomedical perspectives on locomotion in null gravity
NASA Technical Reports Server (NTRS)
Cavanagh, Peter R.
1989-01-01
A number of important features of various locomotor activities are discussed, and approaches to the study of these activities in the context of space flight are suggested. In particular, the magnitude of peak forces and the rates of change of force during terrestrial cycling, walking, and running are compared. It is shown that subtle changes in the conditions and techniques of locomotion can have a major influence on the biomechanical consequences to the skeleton. The various hypotheses that identify locomotor exercise as a countermeasure to bone demineralization during weightlessness deserve to be tested with some degree of biomechanical rigor. Various approaches for achieving such scrutiny are discussed.
Opposing Patterns of Seasonal Change in Functional and Phylogenetic Diversity of Tadpole Assemblages
Strauß, Axel; Guilhaumon, François; Randrianiaina, Roger Daniel; Wollenberg Valero, Katharina C.; Vences, Miguel; Glos, Julian
2016-01-01
Assemblages that are exposed to recurring temporal environmental changes can show changes in their ecological properties. These can be expressed by differences in diversity and assembly rules. Both can be identified using two measures of diversity: functional (FD) and phylogenetic diversity (PD). Frog communities are understudied in this regard, especially during the tadpole life stage. We utilised tadpole assemblages from Madagascan rainforest streams to test predictions of seasonal changes on diversity and assemblage composition and on diversity measures. From the warm-wet to the cool-dry season, species richness (SR) of tadpole assemblages decreased. Also FD and PD decreased, but FD less and PD more than expected by chance. During the dry season, tadpole assemblages were characterised by functional redundancy (among assemblages—with increasing SR), high FD (compared to a null model), and low PD (phylogenetic clustering; compared to a null model). Although mutually contradictory at first glance, these results indicate competition as tadpole community assembly driving force. This is true during the limiting cool-dry season but not during the more suitable warm-wet season. We thereby show that assembly rules can strongly depend on season, that comparing FD and PD can reveal such forces, that FD and PD are not interchangeable, and that conclusions on assembly rules based on FD alone are critical. PMID:27014867
Underpowered samples, false negatives, and unconscious learning.
Vadillo, Miguel A; Konstantinidis, Emmanouil; Shanks, David R
2016-02-01
The scientific community has witnessed growing concern about the high rate of false positives and unreliable results within the psychological literature, but the harmful impact of false negatives has been largely ignored. False negatives are particularly concerning in research areas where demonstrating the absence of an effect is crucial, such as studies of unconscious or implicit processing. Research on implicit processes seeks evidence of above-chance performance on some implicit behavioral measure at the same time as chance-level performance (that is, a null result) on an explicit measure of awareness. A systematic review of 73 studies of contextual cuing, a popular implicit learning paradigm, involving 181 statistical analyses of awareness tests, reveals how underpowered studies can lead to failure to reject a false null hypothesis. Among the studies that reported sufficient information, the meta-analytic effect size across awareness tests was d z = 0.31 (95 % CI 0.24-0.37), showing that participants' learning in these experiments was conscious. The unusually large number of positive results in this literature cannot be explained by selective publication. Instead, our analyses demonstrate that these tests are typically insensitive and underpowered to detect medium to small, but true, effects in awareness tests. These findings challenge a widespread and theoretically important claim about the extent of unconscious human cognition.
Scientific U-Turns: Eight Occasions When Science Changed Its Mind
ERIC Educational Resources Information Center
Sosabowski, Michael Hal; Gard, Paul R.
2017-01-01
The Scientific Method is the series of processes by which hypotheses, ideas and theories are shown to be true beyond a reasonable scientific doubt. Most science "fact" is expressed in terms of probabilities rather than certainties. Thus, by means of statistical calculations, researchers aim to determine whether an observed association…
History and Literature: A Trial Separation.
ERIC Educational Resources Information Center
Westfall, William
1990-01-01
Hypothesizes about the relationship between history and fiction. Discusses the contribution of history to understanding human affairs, stressing that history is true and fiction is not. Asserts that each generation uses the same materials to construct a new reality. States fiction gives only a partial view whereas history sees the world as whole.…
Tongue-Pressure and Hyoid Movement Timing in Healthy Liquid Swallowing
ERIC Educational Resources Information Center
Steele, Catriona; Sasse, Caroline; Bressmann, Tim
2012-01-01
It was hypothesized that tongue-palate pressure generation might directly facilitate hyoid movement in swallowing through the anatomical connections of the extrinsic tongue muscles. If true, non-invasive measures of tongue-palate pressure timing might serve as a proxy measure of hyoid excursion. The timing relationships between events in the…
Ambiguous Tilt and Translation Motion Cues in Astronauts after Space Flight
NASA Technical Reports Server (NTRS)
Clement, G.; Harm, D. L.; Rupert, A. H.; Beaton, K. H.; Wood, S. J.
2008-01-01
Adaptive changes during space flight in how the brain integrates vestibular cues with visual, proprioceptive, and somatosensory information can lead to impaired movement coordination, vertigo, spatial disorientation, and perceptual illusions following transitions between gravity levels. This joint ESA-NASA pre- and post-flight experiment is designed to examine both the physiological basis and operational implications for disorientation and tilt-translation disturbances in astronauts following short-duration space flights. The first specific aim is to examine the effects of stimulus frequency on adaptive changes in eye movements and motion perception during independent tilt and translation motion profiles. Roll motion is provided by a variable radius centrifuge. Pitch motion is provided by NASA's Tilt-Translation Sled in which the resultant gravitoinertial vector remains aligned with the body longitudinal axis during tilt motion (referred to as the Z-axis gravitoinertial or ZAG paradigm). We hypothesize that the adaptation of otolith-mediated responses to these stimuli will have specific frequency characteristics, being greatest in the mid-frequency range where there is a crossover of tilt and translation. The second specific aim is to employ a closed-loop nulling task in which subjects are tasked to use a joystick to null-out tilt motion disturbances on these two devices. The stimuli consist of random steps or sum-of-sinusoids stimuli, including the ZAG profiles on the Tilt-Translation Sled. We hypothesize that the ability to control tilt orientation will be compromised following space flight, with increased control errors corresponding to changes in self-motion perception. The third specific aim is to evaluate how sensory substitution aids can be used to improve manual control performance. During the closed-loop nulling task on both devices, small tactors placed around the torso vibrate according to the actual body tilt angle relative to gravity. We hypothesize that performance on the closed-loop tilt control task will be improved with this tactile display feedback of tilt orientation. The current plans include testing on eight crewmembers following Space Shuttle missions or short stay onboard the International Space Station. Measurements are obtained pre-flight at L-120 (plus or minus 30), L-90 (plus or minus 30), and L-30, (plus or minus 10) days and post-flight at R+0, R+1, R+2 or 3, R+4 or 5, and R+8 days. Pre-and post-flight testing (from R+1 on) is performed in the Neuroscience Laboratory at the NASA Johnson Space Center on both the Tilt-Translation Device and a variable radius centrifuge. A second variable radius centrifuge, provided by DLR for another joint ESA-NASA project, has been installed at the Baseline Data Collection Facility at Kennedy Space Center to collect data immediately after landing. ZAG was initiated with STS-122/1E and the first post-flight testing will take place after STS-123/1JA landing.
Birth order has no effect on intelligence: a reply and extension of previous findings.
Wichman, Aaron L; Rodgers, Joseph Lee; Maccallum, Robert C
2007-09-01
We address points raised by Zajonc and Sulloway, who reject findings showing that birth order has no effect on intelligence. Many objections to findings of null birth-order results seem to stem from a misunderstanding of the difference between study designs where birth order is confounded with true causal influences on intelligence across families and designs that control for some of these influences. We discuss some of the consequences of not appreciating the nature of this difference. When between-family confounds are controlled using appropriate study designs and techniques such as multilevel modeling, birth order is shown not to influence intelligence. We conclude with an empirical investigation of the replicability and generalizability of this approach.
The appearance, motion, and disappearance of three-dimensional magnetic null points
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, Nicholas A., E-mail: namurphy@cfa.harvard.edu; Parnell, Clare E.; Haynes, Andrew L.
2015-10-15
While theoretical models and simulations of magnetic reconnection often assume symmetry such that the magnetic null point when present is co-located with a flow stagnation point, the introduction of asymmetry typically leads to non-ideal flows across the null point. To understand this behavior, we present exact expressions for the motion of three-dimensional linear null points. The most general expression shows that linear null points move in the direction along which the magnetic field and its time derivative are antiparallel. Null point motion in resistive magnetohydrodynamics results from advection by the bulk plasma flow and resistive diffusion of the magnetic field,more » which allows non-ideal flows across topological boundaries. Null point motion is described intrinsically by parameters evaluated locally; however, global dynamics help set the local conditions at the null point. During a bifurcation of a degenerate null point into a null-null pair or the reverse, the instantaneous velocity of separation or convergence of the null-null pair will typically be infinite along the null space of the Jacobian matrix of the magnetic field, but with finite components in the directions orthogonal to the null space. Not all bifurcating null-null pairs are connected by a separator. Furthermore, except under special circumstances, there will not exist a straight line separator connecting a bifurcating null-null pair. The motion of separators cannot be described using solely local parameters because the identification of a particular field line as a separator may change as a result of non-ideal behavior elsewhere along the field line.« less
Raybould, Alan; Macdonald, Phil
2018-01-01
We describe two contrasting methods of comparative environmental risk assessment for genetically modified (GM) crops. Both are science-based, in the sense that they use science to help make decisions, but they differ in the relationship between science and policy. Policy-led comparative risk assessment begins by defining what would be regarded as unacceptable changes when the use a particular GM crop replaces an accepted use of another crop. Hypotheses that these changes will not occur are tested using existing or new data, and corroboration or falsification of the hypotheses is used to inform decision-making. Science-led comparative risk assessment, on the other hand, tends to test null hypotheses of no difference between a GM crop and a comparator. The variables that are compared may have little or no relevance to any previously stated policy objective and hence decision-making tends to be ad hoc in response to possibly spurious statistical significance. We argue that policy-led comparative risk assessment is the far more effective method. With this in mind, we caution that phenotypic profiling of GM crops, particularly with omics methods, is potentially detrimental to risk assessment. PMID:29755975
Vartanian, Oshin; Martindale, Colin; Kwiatkowski, Jonna
2003-05-01
This study was an investigation of the relationship between potential creativityas measured by fluency scores on the Alternate Uses Testand performance on Wasons 246 task. As hypothesized, participants who were successful in discovering the rule had significantly higher fluency scores. Successful participants also generated higher frequencies of confirmatory and disconfirmatory hypotheses, but a multiple regression analysis using the stepwise method revealed that the frequency of generating disconfirmatory hypotheses and fluency scores were the only two significant factors in task outcome. The results also supported earlier studies where disconfirmation was shown to play a more important role in the later stages of hypothesis testing. This was especially true of successful participants, who employed a higher frequency of disconfirmatory hypotheses after receiving feedback on the first announcement. These results imply that successful participants benefited from the provision of feedback on the first announcement by switching to a more successful strategy in the hypothesis-testing sequence.
Influence of Vibrotactile Feedback on Controlling Tilt Motion After Spaceflight
NASA Technical Reports Server (NTRS)
Wood, S. J.; Rupert, A. H.; Vanya, R. D.; Esteves, J. T.; Clement, G.
2011-01-01
We hypothesize that adaptive changes in how inertial cues from the vestibular system are integrated with other sensory information leads to perceptual disturbances and impaired manual control following transitions between gravity environments. The primary goals of this ongoing post-flight investigation are to quantify decrements in manual control of tilt motion following short-duration spaceflight and to evaluate vibrotactile feedback of tilt as a sensorimotor countermeasure. METHODS. Data is currently being collected on 9 astronaut subjects during 3 preflight sessions and during the first 8 days after Shuttle landings. Variable radius centrifugation (216 deg/s, <20 cm radius) in a darkened room is utilized to elicit otolith reflexes in the lateral plane without concordant canal or visual cues. A Tilt-Translation Sled (TTS) is capable of synchronizing pitch tilt with fore-aft translation to align the resultant gravitoinertial vector with the longitudinal body axis, thereby eliciting canal reflexes without concordant otolith or visual cues. A simple 4 tactor system was implemented to provide feedback when tilt position exceeded predetermined levels in either device. Closed-loop nulling tasks are performed during random tilt steps or sum-of-sines (TTS only) with and without vibrotactile feedback of chair position. RESULTS. On landing day the manual control performance without vibrotactile feedback was reduced by >30% based on the gain or the amount of tilt disturbance successfully nulled. Manual control performance tended to return to baseline levels within 1-2 days following landing. Root-mean-square position error and tilt velocity were significantly reduced with vibrotactile feedback. CONCLUSIONS. These preliminary results are consistent with our hypothesis that adaptive changes in vestibular processing corresponds to reduced manual control performance following G-transitions. A simple vibrotactile prosthesis improves the ability to null out tilt motion within a limited range of motion disturbances.
Miyasaka, Yuki; Suzuki, Sari; Ohshiba, Yasuhiro; Watanabe, Kei; Sagara, Yoshihiko; Yasuda, Shumpei P; Matsuoka, Kunie; Shitara, Hiroshi; Yonekawa, Hiromichi; Kominami, Ryo; Kikkawa, Yoshiaki
2013-01-01
The waltzer (v) mouse mutant harbors a mutation in Cadherin 23 (Cdh23) and is a model for Usher syndrome type 1D, which is characterized by congenital deafness, vestibular dysfunction, and prepubertal onset of progressive retinitis pigmentosa. In mice, functionally null Cdh23 mutations affect stereociliary morphogenesis and the polarity of both cochlear and vestibular hair cells. In contrast, the murine Cdh23(ahl) allele, which harbors a hypomorphic mutation, causes an increase in susceptibility to age-related hearing loss in many inbred strains. We produced congenic mice by crossing mice carrying the v niigata (Cdh23(v-ngt)) null allele with mice carrying the hypomorphic Cdh23(ahl) allele on the C57BL/6J background, and we then analyzed the animals' balance and hearing phenotypes. Although the Cdh23(v-ngt/ahl) compound heterozygous mice exhibited normal vestibular function, their hearing ability was abnormal: the mice exhibited higher thresholds of auditory brainstem response (ABR) and rapid age-dependent elevation of ABR thresholds compared with Cdh23(ahl/ahl) homozygous mice. We found that the stereocilia developed normally but were progressively disrupted in Cdh23(v-ngt/ahl) mice. In hair cells, CDH23 localizes to the tip links of stereocilia, which are thought to gate the mechanoelectrical transduction channels in hair cells. We hypothesize that the reduction of Cdh23 gene dosage in Cdh23(v-ngt/ahl) mice leads to the degeneration of stereocilia, which consequently reduces tip link tension. These findings indicate that CDH23 plays an important role in the maintenance of tip links during the aging process.
Zeng, Qi; Fu, Juan; Korrer, Michael; Gorbounov, Mikhail; Murray, Peter J; Pardoll, Drew; Masica, David L; Kim, Young J
2018-05-01
Immunosuppressive myeloid-derived suppressive cells (MDSCs) are characterized by their phenotypic and functional heterogeneity. To better define their T cell-independent functions within the tumor, sorted monocytic CD14 + CD11b + HLA-DR low/- MDSCs (mMDSC) from squamous cell carcinoma patients showed upregulated caspase-1 activity, which was associated with increased IL1β and IL18 expression. In vitro studies demonstrated that mMDSCs promoted caspase-1-dependent proliferation of multiple squamous carcinoma cell lines in both human and murine systems. In vivo , growth rates of B16, MOC1, and Panc02 were significantly blunted in chimeric mice adoptively transferred with caspase-1 null bone marrow cells under T cell-depleted conditions. Adoptive transfer of wild-type Gr-1 + CD11b + MDSCs from tumor-bearing mice reversed this antitumor response, whereas caspase-1 inhibiting thalidomide-treated MDSCs phenocopied the antitumor response found in caspase-1 null mice. We further hypothesized that MDSC caspase-1 activity could promote tumor-intrinsic MyD88-dependent carcinogenesis. In mice with wild-type caspase-1, MyD88-silenced tumors displayed reduced growth rate, but in chimeric mice with caspase-1 null bone marrow cells, MyD88-silenced tumors did not display differential tumor growth rate. When we queried the TCGA database, we found that caspase-1 expression is correlated with overall survival in squamous cell carcinoma patients. Taken together, our findings demonstrated that caspase-1 in MDSCs is a direct T cell-independent mediator of tumor proliferation. Cancer Immunol Res; 6(5); 566-77. ©2018 AACR . ©2018 American Association for Cancer Research.
The frequentist implications of optional stopping on Bayesian hypothesis tests.
Sanborn, Adam N; Hills, Thomas T
2014-04-01
Null hypothesis significance testing (NHST) is the most commonly used statistical methodology in psychology. The probability of achieving a value as extreme or more extreme than the statistic obtained from the data is evaluated, and if it is low enough, the null hypothesis is rejected. However, because common experimental practice often clashes with the assumptions underlying NHST, these calculated probabilities are often incorrect. Most commonly, experimenters use tests that assume that sample sizes are fixed in advance of data collection but then use the data to determine when to stop; in the limit, experimenters can use data monitoring to guarantee that the null hypothesis will be rejected. Bayesian hypothesis testing (BHT) provides a solution to these ills because the stopping rule used is irrelevant to the calculation of a Bayes factor. In addition, there are strong mathematical guarantees on the frequentist properties of BHT that are comforting for researchers concerned that stopping rules could influence the Bayes factors produced. Here, we show that these guaranteed bounds have limited scope and often do not apply in psychological research. Specifically, we quantitatively demonstrate the impact of optional stopping on the resulting Bayes factors in two common situations: (1) when the truth is a combination of the hypotheses, such as in a heterogeneous population, and (2) when a hypothesis is composite-taking multiple parameter values-such as the alternative hypothesis in a t-test. We found that, for these situations, while the Bayesian interpretation remains correct regardless of the stopping rule used, the choice of stopping rule can, in some situations, greatly increase the chance of experimenters finding evidence in the direction they desire. We suggest ways to control these frequentist implications of stopping rules on BHT.
Ciccacci, Cinzia; Latini, Andrea; Politi, Cristina; Mancinelli, Sandro; Marazzi, Maria C; Novelli, Giuseppe; Palombi, Leonardo; Borgiani, Paola
2017-10-01
Nevirapine (NVP) is used in developing countries as first-line treatment of HIV infection. Unfortunately, its use is associated with common serious adverse drug reactions, such as liver toxicity and the most severe and rare Stevens-Johnson syndrome (SJS) and toxic epidermal necrolysis (TEN). GSTT1 and GSTM1 genes code for enzymes involved in the metabolism of a wide range of drugs. We hypothesized that this gene variability could be implicated in NVP adverse reactions. We analyzed the GSTM1 and GSTT1 null genotypes by multiplex PCR in a population of 181 patients from Mozambique, treated with NVP. A case/control association study was performed. We also counted the number of risk alleles in SJS/TEN patients and in controls, including the GSTM1 null genotype and four previously identified risk alleles in CYP2B6, HCP5, and TRAF3IP2 genes. Among patients, 27 had developed SJS/TEN and 76 had developed hepatotoxicity during the treatment. The GSTM1 null genotype was more frequent in the cases with SJS/TEN than in the controls (OR = 2.94, P = 0.027). This association is also observed when other risk factors are taken into account, by a multivariate analysis (P = 0.024 and OR = 3.58). The risk allele counting analysis revealed a significantly higher risk for SJS/TEN in patients carrying three or four risk alleles. Moreover, all subjects with five or six risk alleles developed SJS/TEN, while subjects without any risk alleles were present only in the control group. We observed an association between GSTM1 and SJS/TEN susceptibility. Moreover, GSTM1 contributes to the definition of a genetic risk profile for SJS/TEN susceptibility.
Outlier Removal and the Relation with Reporting Errors and Quality of Psychological Research
Bakker, Marjan; Wicherts, Jelte M.
2014-01-01
Background The removal of outliers to acquire a significant result is a questionable research practice that appears to be commonly used in psychology. In this study, we investigated whether the removal of outliers in psychology papers is related to weaker evidence (against the null hypothesis of no effect), a higher prevalence of reporting errors, and smaller sample sizes in these papers compared to papers in the same journals that did not report the exclusion of outliers from the analyses. Methods and Findings We retrieved a total of 2667 statistical results of null hypothesis significance tests from 153 articles in main psychology journals, and compared results from articles in which outliers were removed (N = 92) with results from articles that reported no exclusion of outliers (N = 61). We preregistered our hypotheses and methods and analyzed the data at the level of articles. Results show no significant difference between the two types of articles in median p value, sample sizes, or prevalence of all reporting errors, large reporting errors, and reporting errors that concerned the statistical significance. However, we did find a discrepancy between the reported degrees of freedom of t tests and the reported sample size in 41% of articles that did not report removal of any data values. This suggests common failure to report data exclusions (or missingness) in psychological articles. Conclusions We failed to find that the removal of outliers from the analysis in psychological articles was related to weaker evidence (against the null hypothesis of no effect), sample size, or the prevalence of errors. However, our control sample might be contaminated due to nondisclosure of excluded values in articles that did not report exclusion of outliers. Results therefore highlight the importance of more transparent reporting of statistical analyses. PMID:25072606
McDuffie, Marcia; Maybee, Nelly A.; Keller, Susanna R.; Stevens, Brian K.; Garmey, James C.; Morris, Margaret A.; Kropf, Elizabeth; Rival, Claudia; Ma, Kaiwen; Carter, Jeffrey D.; Tersey, Sarah A.; Nunemaker, Craig S.; Nadler, Jerry L.
2010-01-01
OBJECTIVE 12/15-lipoxygenase (12/15-LO), one of a family of fatty acid oxidoreductase enzymes, reacts with polyenoic fatty acids to produce proinflammatory lipids. 12/15-LO is expressed in macrophages and pancreatic β-cells. It enhances interleukin 12 production by macrophages, and several of its products induce apoptosis of β-cells at nanomolar concentrations in vitro. We had previously demonstrated a role for 12/15-LO in β-cell damage in the streptozotocin model of diabetes. Since the gene encoding 12/15-LO (gene designation Alox15) lies within the Idd4 diabetes susceptibility interval in NOD mice, we hypothesized that 12/15-LO is also a key regulator of diabetes susceptibility in the NOD mouse. RESEARCH DESIGN AND METHODS We developed NOD mice carrying an inactivated 12/15-LO locus (NOD-Alox15null) using a “speed congenic” protocol, and the mice were monitored for development of insulitis and diabetes. RESULTS NOD mice deficient in 12/15-LO develop diabetes at a markedly reduced rate compared with NOD mice (2.5 vs. >60% in females by 30 weeks). Nondiabetic female NOD-Alox15null mice demonstrate improved glucose tolerance, as well as significantly reduced severity of insulitis and improved β-cell mass, when compared with age-matched nondiabetic NOD females. Disease resistance is associated with decreased numbers of islet-infiltrating activated macrophages at 4 weeks of age in NOD-Alox15null mice, preceding the development of insulitis. Subsequently, islet-associated infiltrates are characterized by decreased numbers of CD4+ T cells and increased Foxp3+ cells. CONCLUSIONS These results suggest an important role for 12/15-LO in conferring susceptibility to autoimmune diabetes in NOD mice through its effects on macrophage recruitment or activation. PMID:17940120
Davis, Hayley; Lewis, Annabelle; Spencer-Dene, Bradley; Tateossian, Hilda; Stamp, Gordon; Behrens, Axel; Tomlinson, Ian
2011-01-01
FBXW7 is the substrate recognition component of a SCF-type E3 ubiquitin ligase. It has multiple targets such as Notch1, c-Jun, and cyclin E that function in critical developmental and signalling pathways. Mutations in FBXW7 are often found in many types of cancer. In most cases, these mutations do not inactivate the protein, but are mono-allelic missense changes at specific arginine resides involved in substrate binding. We have hypothesized that FBXW7 mutations are selected in cancers for reasons other than haploinsufficiency or full loss-of-function. Given that the existing mutant Fbxw7 mice carry null alleles, we created a mouse model carrying one of the commonly occurring point mutations (Fbxw7) in the WD40 substrate recognition domain of Fbxw7. Mice heterozygous for this mutation apparently developed normally in utero, died perinatally due to a defect in lung development, and in some cases showed cleft palate and eyelid fusion defects. By comparison, Fbxw7+/− mice were viable and developed normally. Fbxw7−/− animals died of vascular abnormalities at E10.5. We screened known FBXW7 targets for changes in the lungs of the Fbxw7R482Q/+ mice and found Tgif1 and Klf5 to be up-regulated. Fbxw7 alleles are not functionally equivalent to heterozygous or homozygous null alleles, and we propose that they are selected in tumourigenesis because they cause a selective or partial loss of FBXW7 function. Copyright © 2011 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd. PMID:21503901
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boadas-Vaello, Pere; Jover, Eric; Diez-Padrisa, Nuria
2007-12-15
Several alkylnitriles are toxic to sensory systems, including the vestibular system, through yet undefined mechanisms. This study addressed the hypothesis that the vestibular toxicity of cis-crotononitrile depends on CYP2E1-mediated bioactivation. Wild-type (129S1) and CYP2E1-null female mice were exposed to cis-crotononitrile at 0, 2, 2.25 or 2.5 mmol/kg (p.o.) in either a baseline condition or following exposure to 1% acetone in drinking water to induce CYP2E1 expression. The exposed animals were assessed for vestibular toxicity using a behavioral test battery and through surface observation of the vestibular sensory epithelia by scanning electron microscopy. In parallel groups, concentrations of cis-crotononitrile and cyanidemore » were assessed in whole blood. Contrary to our hypothesis, CYP2E1-null mice were slightly more susceptible to the vestibular toxicity of cis-crotononitrile than were control 129S1 mice. Similarly, rather than enhance vestibular toxicity, acetone pretreatment actually reduced it slightly in 129S1 controls, although not in CYP2E1-null mice. In addition, significant differences in mortality were recorded, with the greatest mortality occurring in 129S1 mice after acetone pretreatment. The highest mortality recorded in the 129S1 + acetone mice was associated with the lowest blood concentrations of cis-crotononitrile and the highest concentrations of cyanide at 6 h after nitrile exposure, the time when deaths were initially recorded. We conclude that cis-crotononitrile is a CYP2E1 substrate as hypothesized, but that CYP2E1-mediated metabolism of this nitrile is not necessary for vestibular toxicity; rather, this metabolism constitutes a major pathway for cyanide release and subsequent lethality.« less
Spiegel, S; Chiu, A; James, A S; Jentsch, J D; Karlsgodt, K H
2015-11-01
Numerous studies have implicated DTNBP1, the gene encoding dystrobrevin-binding protein or dysbindin, as a candidate risk gene for schizophrenia, though this relationship remains somewhat controversial. Variation in dysbindin, and its location on chromosome 6p, has been associated with cognitive processes, including those relying on a complex system of glutamatergic and dopaminergic interactions. Dysbindin is one of the seven protein subunits that comprise the biogenesis of lysosome-related organelles complex 1 (BLOC-1). Dysbindin protein levels are lower in mice with null mutations in pallidin, another gene in the BLOC-1, and pallidin levels are lower in mice with null mutations in the dysbindin gene, suggesting that multiple subunit proteins must be present to form a functional oligomeric complex. Furthermore, pallidin and dysbindin have similar distribution patterns in a mouse and human brain. Here, we investigated whether the apparent correspondence of pallid and dysbindin at the level of gene expression is also found at the level of behavior. Hypothesizing a mutation leading to underexpression of either of these proteins should show similar phenotypic effects, we studied recognition memory in both strains using the novel object recognition task (NORT) and social novelty recognition task (SNRT). We found that mice with a null mutation in either gene are impaired on SNRT and NORT when compared with wild-type controls. These results support the conclusion that deficits consistent with recognition memory impairment, a cognitive function that is impaired in schizophrenia, result from either pallidin or dysbindin mutations, possibly through degradation of BLOC-1 expression and/or function. © 2015 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.
Rigor and academic achievement: Career academies versus traditional class structure
NASA Astrophysics Data System (ADS)
Kyees, Linda L.
The purpose of this study was to determine if students who attended high school Career Academy classes, as part of Career and Technical Education, showed greater academic achievement than students who attended traditional high school classes. While all participants attended schools in the same school district, and were seeking the same goal of graduation with a standard diploma, the Career Academy students had the benefit of all classes being directed by a team of teachers who helped them connect their learning to their desired career through collaborative learning projects and assignments. The traditional high school classes taught each subject independent of other subjects and did not have specific connections to desired career goals of the students. The study used a causal-comparative research design and the participants included 1,142 students from 11th and 12th grades who attended 9 high schools in a diversely populated area of central Florida with 571 enrolled in the Career Academies and 571 enrolled in traditional classes. The 10th-grade FCAT scores served as the dependent variable. All students attended similar classes with similar content, making the primary variable the difference in academic gains between students participating in the Career Academy design and the traditional design classes. Using the Man-Whitney U Test resulted in the Career Academy group achieving the higher scores overall. This resulted in rejection of the first null-hypothesis. Further examination determined that the 10th-grade FCAT scores were greater for the average students group, which comprised the largest portion of the participant group, also resulted in rejection of the second null-hypothesis. The gifted and at-risk student group scores resulted in failure to reject the third and fourth null-hypotheses.
Hsieh, Minnie; Boerboom, Derek; Shimada, Masayuki; Lo, Yuet; Parlow, Albert F; Luhmann, Ulrich F O; Berger, Wolfgang; Richards, JoAnne S
2005-12-01
Previous studies showed that transcripts encoding specific Wnt ligands and Frizzled receptors including Wnt4, Frizzled1 (Fzd1), and Frizzled4 (Fzd4) were expressed in a cell-specific manner in the adult mouse ovary. Overlapping expression of Wnt4 and Fzd4 mRNA in small follicles and corpora lutea led us to hypothesize that the infertility of mice null for Fzd4 (Fzd4-/-) might involve impaired follicular growth or corpus luteum formation. Analyses at defined stages of reproductive function indicate that immature Fzd4-/- mouse ovaries contain follicles at many stages of development and respond to exogenous hormone treatments in a manner similar to their wild-type littermates, indicating that the processes controlling follicular development and follicular cell responses to gonadotropins are intact. Adult Fzd4-/- mice also exhibit normal mating behavior and ovulate, indicating that endocrine events controlling these processes occur. However, Fzd4-/- mice fail to become pregnant and do not produce offspring. Histological and functional analyses of ovaries from timed mating pairs at Days 1.5-7.5 postcoitus (p.c.) indicate that the corpora lutea of the Fzd4-/- mice do not develop normally. Expression of luteal cell-specific mRNAs (Lhcgr, Prlr, Cyp11a1 and Sfrp4) is reduced, luteal cell morphology is altered, and markers of angiogenesis and vascular formation (Efnb1, Efnb2, Ephb4, Vegfa, Vegfc) are low in the Fzd4-/- mice. Although a recently identified, high-affinity FZD4 ligand Norrin (Norrie disease pseudoglioma homolog) is expressed in the ovary, adult Ndph-/- mice contain functional corpora lutea and do not phenocopy Fzd4-/- mice. Thus, Fzd4 appears to impact the formation of the corpus luteum by mechanisms that more closely phenocopy Prlr null mice.
Clement, Tracy M.; Inselman, Amy L.; Goulding, Eugenia H.; Willis, William D.; Eddy, Edward M.
2015-01-01
While cyclin dependent kinase 1 (CDK1) has a critical role in controlling resumption of meiosis in oocytes, its role has not been investigated directly in spermatocytes. Unique aspects of male meiosis led us to hypothesize that its role is different in male meiosis than in female meiosis. We generated a conditional knockout (cKO) of the Cdk1 gene in mouse spermatocytes to test this hypothesis. We found that CDK1-null spermatocytes undergo synapsis, chiasmata formation, and desynapsis as is seen in oocytes. Additionally, CDK1-null spermatocytes relocalize SYCP3 to centromeric foci, express H3pSer10, and initiate chromosome condensation. However, CDK1-null spermatocytes fail to form condensed bivalent chromosomes in prophase of meiosis I and instead are arrested at prometaphase. Thus, CDK1 has an essential role in male meiosis that is consistent with what is known about the role of CDK1 in female meiosis, where it is required for formation of condensed bivalent metaphase chromosomes and progression to the first meiotic division. We found that cKO spermatocytes formed fully condensed bivalent chromosomes in the presence of okadaic acid, suggesting that cKO chromosomes are competent to condense, although they do not do so in vivo. Additionally, arrested cKO spermatocytes exhibited irregular cell shape, irregular large nuclei, and large distinctive nucleoli. These cells persist in the seminiferous epithelium through the next seminiferous epithelial cycle with a lack of stage XII checkpoint-associated cell death. This indicates that CDK1 is required upstream of a checkpoint-associated cell death as well as meiotic metaphase progression in mouse spermatocytes. PMID:26490841
Ameloblast Modulation and Transport of Cl−, Na+, and K+ during Amelogenesis
Bronckers, A.L.J.J.; Lyaruu, D.; Jalali, R.; Medina, J.F.; Zandieh-Doulabi, B.; DenBesten, P.K.
2015-01-01
Ameloblasts express transmembrane proteins for transport of mineral ions and regulation of pH in the enamel space. Two major transporters recently identified in ameloblasts are the Na+K+-dependent calcium transporter NCKX4 and the Na+-dependent HPO42– (Pi) cotransporter NaPi-2b. To regulate pH, ameloblasts express anion exchanger 2 (Ae2a,b), chloride channel Cftr, and amelogenins that can bind protons. Exposure to fluoride or null mutation of Cftr, Ae2a,b, or Amelx each results in formation of hypomineralized enamel. We hypothesized that enamel hypomineralization associated with disturbed pH regulation results from reduced ion transport by NCKX4 and NaPi-2b. This was tested by correlation analyses among the levels of Ca, Pi, Cl, Na, and K in forming enamel of mice with null mutation of Cftr, Ae2a,b, and Amelx, according to quantitative x-ray electron probe microanalysis. Immunohistochemistry, polymerase chain reaction analysis, and Western blotting confirmed the presence of apical NaPi-2b and Nckx4 in maturation-stage ameloblasts. In wild-type mice, K levels in enamel were negatively correlated with Ca and Cl but less negatively or even positively in fluorotic enamel. Na did not correlate with P or Ca in enamel of wild-type mice but showed strong positive correlation in fluorotic and nonfluorotic Ae2a,b- and Cftr-null enamel. In hypomineralizing enamel of all models tested, 1) Cl− was strongly reduced; 2) K+ and Na+ accumulated (Na+ not in Amelx-null enamel); and 3) modulation was delayed or blocked. These results suggest that a Na+K+-dependent calcium transporter (likely NCKX4) and a Na+-dependent Pi transporter (potentially NaPi-2b) located in ruffle-ended ameloblasts operate in a coordinated way with the pH-regulating machinery to transport Ca2+, Pi, and bicarbonate into maturation-stage enamel. Acidification and/or associated physicochemical/electrochemical changes in ion levels in enamel fluid near the apical ameloblast membrane may reduce the transport activity of mineral transporters, which results in hypomineralization. PMID:26403673
Ameloblast Modulation and Transport of Cl⁻, Na⁺, and K⁺ during Amelogenesis.
Bronckers, A L J J; Lyaruu, D; Jalali, R; Medina, J F; Zandieh-Doulabi, B; DenBesten, P K
2015-12-01
Ameloblasts express transmembrane proteins for transport of mineral ions and regulation of pH in the enamel space. Two major transporters recently identified in ameloblasts are the Na(+)K(+)-dependent calcium transporter NCKX4 and the Na(+)-dependent HPO4 (2-) (Pi) cotransporter NaPi-2b. To regulate pH, ameloblasts express anion exchanger 2 (Ae2a,b), chloride channel Cftr, and amelogenins that can bind protons. Exposure to fluoride or null mutation of Cftr, Ae2a,b, or Amelx each results in formation of hypomineralized enamel. We hypothesized that enamel hypomineralization associated with disturbed pH regulation results from reduced ion transport by NCKX4 and NaPi-2b. This was tested by correlation analyses among the levels of Ca, Pi, Cl, Na, and K in forming enamel of mice with null mutation of Cftr, Ae2a,b, and Amelx, according to quantitative x-ray electron probe microanalysis. Immunohistochemistry, polymerase chain reaction analysis, and Western blotting confirmed the presence of apical NaPi-2b and Nckx4 in maturation-stage ameloblasts. In wild-type mice, K levels in enamel were negatively correlated with Ca and Cl but less negatively or even positively in fluorotic enamel. Na did not correlate with P or Ca in enamel of wild-type mice but showed strong positive correlation in fluorotic and nonfluorotic Ae2a,b- and Cftr-null enamel. In hypomineralizing enamel of all models tested, 1) Cl(-) was strongly reduced; 2) K(+) and Na(+) accumulated (Na(+) not in Amelx-null enamel); and 3) modulation was delayed or blocked. These results suggest that a Na(+)K(+)-dependent calcium transporter (likely NCKX4) and a Na(+)-dependent Pi transporter (potentially NaPi-2b) located in ruffle-ended ameloblasts operate in a coordinated way with the pH-regulating machinery to transport Ca(2+), Pi, and bicarbonate into maturation-stage enamel. Acidification and/or associated physicochemical/electrochemical changes in ion levels in enamel fluid near the apical ameloblast membrane may reduce the transport activity of mineral transporters, which results in hypomineralization. © International & American Associations for Dental Research 2015.
Clinical trial designs for testing biomarker-based personalized therapies
Lai, Tze Leung; Lavori, Philip W; Shih, Mei-Chiung I; Sikic, Branimir I
2014-01-01
Background Advances in molecular therapeutics in the past decade have opened up new possibilities for treating cancer patients with personalized therapies, using biomarkers to determine which treatments are most likely to benefit them, but there are difficulties and unresolved issues in the development and validation of biomarker-based personalized therapies. We develop a new clinical trial design to address some of these issues. The goal is to capture the strengths of the frequentist and Bayesian approaches to address this problem in the recent literature and to circumvent their limitations. Methods We use generalized likelihood ratio tests of the intersection null and enriched strategy null hypotheses to derive a novel clinical trial design for the problem of advancing promising biomarker-guided strategies toward eventual validation. We also investigate the usefulness of adaptive randomization (AR) and futility stopping proposed in the recent literature. Results Simulation studies demonstrate the advantages of testing both the narrowly focused enriched strategy null hypothesis related to validating a proposed strategy and the intersection null hypothesis that can accommodate to a potentially successful strategy. AR and early termination of ineffective treatments offer increased probability of receiving the preferred treatment and better response rates for patients in the trial, at the expense of more complicated inference under small-to-moderate total sample sizes and some reduction in power. Limitations The binary response used in the development phase may not be a reliable indicator of treatment benefit on long-term clinical outcomes. In the proposed design, the biomarker-guided strategy (BGS) is not compared to ‘standard of care’, such as physician’s choice that may be informed by patient characteristics. Therefore, a positive result does not imply superiority of the BGS to ‘standard of care’. The proposed design and tests are valid asymptotically. Simulations are used to examine small-to-moderate sample properties. Conclusion Innovative clinical trial designs are needed to address the difficulties and issues in the development and validation of biomarker-based personalized therapies. The article shows the advantages of using likelihood inference and interim analysis to meet the challenges in the sample size needed and in the constantly evolving biomarker landscape and genomic and proteomic technologies. PMID:22397801
Measuring Speed, Ability, or Motivation: A Comment on Goldhammer (2015)
ERIC Educational Resources Information Center
Kuhn, Jörg-Tobias; Ranger, Jochen
2015-01-01
In this commentary, Kuhn and Ranger hypothesize that most people are aware that talent does not guarantee success in case one is lazy. This is also true for the performance in achievement tests that depends on, among other factors, achievement potential (ability) and willingness to achieve (test-taking motivation) of the test taker. They add that…
Valid randomization-based p-values for partially post hoc subgroup analyses.
Lee, Joseph J; Rubin, Donald B
2015-10-30
By 'partially post-hoc' subgroup analyses, we mean analyses that compare existing data from a randomized experiment-from which a subgroup specification is derived-to new, subgroup-only experimental data. We describe a motivating example in which partially post hoc subgroup analyses instigated statistical debate about a medical device's efficacy. We clarify the source of such analyses' invalidity and then propose a randomization-based approach for generating valid posterior predictive p-values for such partially post hoc subgroups. Lastly, we investigate the approach's operating characteristics in a simple illustrative setting through a series of simulations, showing that it can have desirable properties under both null and alternative hypotheses. Copyright © 2015 John Wiley & Sons, Ltd.
Long memory and multifractality: A joint test
NASA Astrophysics Data System (ADS)
Goddard, John; Onali, Enrico
2016-06-01
The properties of statistical tests for hypotheses concerning the parameters of the multifractal model of asset returns (MMAR) are investigated, using Monte Carlo techniques. We show that, in the presence of multifractality, conventional tests of long memory tend to over-reject the null hypothesis of no long memory. Our test addresses this issue by jointly estimating long memory and multifractality. The estimation and test procedures are applied to exchange rate data for 12 currencies. Among the nested model specifications that are investigated, in 11 out of 12 cases, daily returns are most appropriately characterized by a variant of the MMAR that applies a multifractal time-deformation process to NIID returns. There is no evidence of long memory.
The Heuristic Value of p in Inductive Statistical Inference
Krueger, Joachim I.; Heck, Patrick R.
2017-01-01
Many statistical methods yield the probability of the observed data – or data more extreme – under the assumption that a particular hypothesis is true. This probability is commonly known as ‘the’ p-value. (Null Hypothesis) Significance Testing ([NH]ST) is the most prominent of these methods. The p-value has been subjected to much speculation, analysis, and criticism. We explore how well the p-value predicts what researchers presumably seek: the probability of the hypothesis being true given the evidence, and the probability of reproducing significant results. We also explore the effect of sample size on inferential accuracy, bias, and error. In a series of simulation experiments, we find that the p-value performs quite well as a heuristic cue in inductive inference, although there are identifiable limits to its usefulness. We conclude that despite its general usefulness, the p-value cannot bear the full burden of inductive inference; it is but one of several heuristic cues available to the data analyst. Depending on the inferential challenge at hand, investigators may supplement their reports with effect size estimates, Bayes factors, or other suitable statistics, to communicate what they think the data say. PMID:28649206
The Heuristic Value of p in Inductive Statistical Inference.
Krueger, Joachim I; Heck, Patrick R
2017-01-01
Many statistical methods yield the probability of the observed data - or data more extreme - under the assumption that a particular hypothesis is true. This probability is commonly known as 'the' p -value. (Null Hypothesis) Significance Testing ([NH]ST) is the most prominent of these methods. The p -value has been subjected to much speculation, analysis, and criticism. We explore how well the p -value predicts what researchers presumably seek: the probability of the hypothesis being true given the evidence, and the probability of reproducing significant results. We also explore the effect of sample size on inferential accuracy, bias, and error. In a series of simulation experiments, we find that the p -value performs quite well as a heuristic cue in inductive inference, although there are identifiable limits to its usefulness. We conclude that despite its general usefulness, the p -value cannot bear the full burden of inductive inference; it is but one of several heuristic cues available to the data analyst. Depending on the inferential challenge at hand, investigators may supplement their reports with effect size estimates, Bayes factors, or other suitable statistics, to communicate what they think the data say.
Döhnel, Katrin; Schuwerk, Tobias; Meinhardt, Jörg; Sodian, Beate; Hajak, Göran; Sommer, Monika
2012-04-15
Since false belief reasoning requires mental state representation independently of the state of reality, it is seen as a key ability in Theory of Mind (ToM). Although true beliefs do not have to be processed independently of the state of reality, growing behavioural evidence indicates that true belief reasoning is different from just reasoning about the state of reality. So far, neural studies on true and false belief reasoning revealed inconsistent findings in the medial prefrontal cortex (MPFC) and in the right temporo-parietal junction (R-TPJ), brain regions that are hypothesized to play an important role in ToM. To further explore true and false belief reasoning, the present functional magnetic resonance imaging (fMRI) study in eighteen adult subjects used methodological refinements such as ensuring that the true belief trials did not elicit false belief reasoning, as well as including paralleled control conditions requiring reasoning about the state of reality. When compared to its control condition, common R-TPJ activity was observed for true and false belief reasoning, supporting its role in belief reasoning in general, and indicating that, at least in adults, also true belief reasoning appears to be different from reasoning about the state of reality. Differential activity was observed in a broad network of brain regions such as the MPFC, the inferior frontal cortex, and the precuneus. False over true belief reasoning induced activation in the posterior MPFC (pMPFC), supporting its role in the decoupling mechanisms, which is defined as processing a mental state independently of the state of reality. Copyright © 2012 Elsevier Inc. All rights reserved.
Guibert, Nicolas; Mazieres, Julien; Delaunay, Myriam; Casanova, Anne; Farella, Magali; Keller, Laura; Favre, Gilles; Pradines, Anne
2017-01-01
Objectives Pseudo-progression is a rare but worrying situation for both clinicians and patients during immunotherapy. Dedicated ir-RECIST criteria have been established to improve this situation. However, this can be sometimes considered inadequate and patients experiencing true progression may then receive inefficient treatments. Additional reliable tools to discriminate pseudo from true progression are thus needed. So far, no biomarker has been identified to distinguish pseudo from true progression. We hypothesize that biomarkers associated with the molecular characteristics of the tumor may be of interest. To avoid a tumor re-biopsy, circulating markers appear to be a less invasive and reproducible procedure. As ctDNA kinetics correlate with the response to treatment in KRAS-mutated adenocarcinoma, we anticipated that this analysis could be of interest. Materials and methods We monitored the level of KRAS-mutated ctDNA by digital droplet PCR in serial plasma samples from two patients who had experienced pseudo-progression and compared the variations with those from of a patient that had true progression. Results ctDNA showed rapid and dramatic decreases in pseudo-progressive patients, whereas it was strongly increased in the progressive patient. Conclusions ddPCR of ctDNA may thus be an additional tool to discriminate pseudo-progression from true progression for tumors that harbor an oncogenic addiction. PMID:28445137
Detection of long nulls in PSR B1706-16, a pulsar with large timing irregularities
NASA Astrophysics Data System (ADS)
Naidu, Arun; Joshi, Bhal Chandra; Manoharan, P. K.; Krishnakumar, M. A.
2018-04-01
Single pulse observations, characterizing in detail, the nulling behaviour of PSR B1706-16 are being reported for the first time in this paper. Our regular long duration monitoring of this pulsar reveals long nulls of 2-5 h with an overall nulling fraction of 31 ± 2 per cent. The pulsar shows two distinct phases of emission. It is usually in an active phase, characterized by pulsations interspersed with shorter nulls, with a nulling fraction of about 15 per cent, but it also rarely switches to an inactive phase, consisting of long nulls. The nulls in this pulsar are concurrent between 326.5 and 610 MHz. Profile mode changes accompanied by changes in fluctuation properties are seen in this pulsar, which switches from mode A before a null to mode B after the null. The distribution of null durations in this pulsar is bimodal. With its occasional long nulls, PSR B1706-16 joins the small group of intermediate nullers, which lie between the classical nullers and the intermittent pulsars. Similar to other intermediate nullers, PSR B1706-16 shows high timing noise, which could be due to its rare long nulls if one assumes that the slowdown rate during such nulls is different from that during the bursts.
Community Detection for Correlation Matrices
NASA Astrophysics Data System (ADS)
MacMahon, Mel; Garlaschelli, Diego
2015-04-01
A challenging problem in the study of complex systems is that of resolving, without prior information, the emergent, mesoscopic organization determined by groups of units whose dynamical activity is more strongly correlated internally than with the rest of the system. The existing techniques to filter correlations are not explicitly oriented towards identifying such modules and can suffer from an unavoidable information loss. A promising alternative is that of employing community detection techniques developed in network theory. Unfortunately, this approach has focused predominantly on replacing network data with correlation matrices, a procedure that we show to be intrinsically biased because of its inconsistency with the null hypotheses underlying the existing algorithms. Here, we introduce, via a consistent redefinition of null models based on random matrix theory, the appropriate correlation-based counterparts of the most popular community detection techniques. Our methods can filter out both unit-specific noise and system-wide dependencies, and the resulting communities are internally correlated and mutually anticorrelated. We also implement multiresolution and multifrequency approaches revealing hierarchically nested subcommunities with "hard" cores and "soft" peripheries. We apply our techniques to several financial time series and identify mesoscopic groups of stocks which are irreducible to a standard, sectorial taxonomy; detect "soft stocks" that alternate between communities; and discuss implications for portfolio optimization and risk management.
Visual artificial grammar learning in dyslexia: A meta-analysis.
van Witteloostuijn, Merel; Boersma, Paul; Wijnen, Frank; Rispens, Judith
2017-11-01
Literacy impairments in dyslexia have been hypothesized to be (partly) due to an implicit learning deficit. However, studies of implicit visual artificial grammar learning (AGL) have often yielded null results. The aim of this study is to weigh the evidence collected thus far by performing a meta-analysis of studies on implicit visual AGL in dyslexia. Thirteen studies were selected through a systematic literature search, representing data from 255 participants with dyslexia and 292 control participants (mean age range: 8.5-36.8 years old). If the 13 selected studies constitute a random sample, individuals with dyslexia perform worse on average than non-dyslexic individuals (average weighted effect size=0.46, 95% CI [0.14 … 0.77], p=0.008), with a larger effect in children than in adults (p=0.041; average weighted effect sizes 0.71 [sig.] versus 0.16 [non-sig.]). However, the presence of a publication bias indicates the existence of missing studies that may well null the effect. While the studies under investigation demonstrate that implicit visual AGL is impaired in dyslexia (more so in children than in adults, if in adults at all), the detected publication bias suggests that the effect might in fact be zero. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Normalization, bias correction, and peak calling for ChIP-seq
Diaz, Aaron; Park, Kiyoub; Lim, Daniel A.; Song, Jun S.
2012-01-01
Next-generation sequencing is rapidly transforming our ability to profile the transcriptional, genetic, and epigenetic states of a cell. In particular, sequencing DNA from the immunoprecipitation of protein-DNA complexes (ChIP-seq) and methylated DNA (MeDIP-seq) can reveal the locations of protein binding sites and epigenetic modifications. These approaches contain numerous biases which may significantly influence the interpretation of the resulting data. Rigorous computational methods for detecting and removing such biases are still lacking. Also, multi-sample normalization still remains an important open problem. This theoretical paper systematically characterizes the biases and properties of ChIP-seq data by comparing 62 separate publicly available datasets, using rigorous statistical models and signal processing techniques. Statistical methods for separating ChIP-seq signal from background noise, as well as correcting enrichment test statistics for sequence-dependent and sonication biases, are presented. Our method effectively separates reads into signal and background components prior to normalization, improving the signal-to-noise ratio. Moreover, most peak callers currently use a generic null model which suffers from low specificity at the sensitivity level requisite for detecting subtle, but true, ChIP enrichment. The proposed method of determining a cell type-specific null model, which accounts for cell type-specific biases, is shown to be capable of achieving a lower false discovery rate at a given significance threshold than current methods. PMID:22499706
de Broglie-Proca and Bopp-Podolsky massive photon gases in cosmology
NASA Astrophysics Data System (ADS)
Cuzinatto, R. R.; de Morais, E. M.; Medeiros, L. G.; Naldoni de Souza, C.; Pimentel, B. M.
2017-04-01
We investigate the influence of massive photons on the evolution of the expanding universe. Two particular models for generalized electrodynamics are considered, namely de Broglie-Proca and Bopp-Podolsky electrodynamics. We obtain the equation of state (EOS) P=P(\\varepsilon) for each case using dispersion relations derived from both theories. The EOS are inputted into the Friedmann equations of a homogeneous and isotropic space-time to determine the cosmic scale factor a(t). It is shown that the photon non-null mass does not significantly alter the result a\\propto t1/2 valid for a massless photon gas; this is true either in de Broglie-Proca's case (where the photon mass m is extremely small) or in Bopp-Podolsky theory (for which m is extremely large).
Too good to be true: publication bias in two prominent studies from experimental psychology.
Francis, Gregory
2012-04-01
Empirical replication has long been considered the final arbiter of phenomena in science, but replication is undermined when there is evidence for publication bias. Evidence for publication bias in a set of experiments can be found when the observed number of rejections of the null hypothesis exceeds the expected number of rejections. Application of this test reveals evidence of publication bias in two prominent investigations from experimental psychology that have purported to reveal evidence of extrasensory perception and to indicate severe limitations of the scientific method. The presence of publication bias suggests that those investigations cannot be taken as proper scientific studies of such phenomena, because critical data are not available to the field. Publication bias could partly be avoided if experimental psychologists started using Bayesian data analysis techniques.
The Underrepresentation of African Americans in Army Combat Arms Branches
2014-12-04
a starting point for the Army to determine true causality. This monograph is simply reviewing data and identifying correlation, and based on...correlation, assigning causality based on historical information and scholarly literature. These potential causes are not fact, and provide a starting ...1988 is the starting point for the commissioning statistics. Subject matter experts hypothesized that the number African American officers
Isotopic evidence indicates saprotrophy in post-fire Morchella in Oregon and Alaska
Erik A. Hobbie; Samuel F. Rice; Nancy S. Weber; Jane E. Smith
2016-01-01
We assessed the nutritional strategy of true morels (genus Morchella) collected in 2003 and 2004 in Oregon and Alaska, 1 or 2 y after forest fires. We hypothesized that the patterns of stable isotopes (δ13C and δ15N) in the sporocarps would match those of saprotrophic fungi and that radiocarbon (Î
Avian predators are less abundant during periodical cicada emergences, but why?
Koenig, Walter D; Ries, Leslie; Olsen, V Beth K; Liebhold, Andrew M
2011-03-01
Despite a substantial resource pulse, numerous avian insectivores known to depredate periodical cicadas (Magicicada spp.) are detected less commonly during emergence years than in either the previous or following years. We used data on periodical cicada calls collected by volunteers conducting North American Breeding Bird Surveys within the range of cicada Brood X to test three hypotheses for this observation: lower detection rates could be caused by bird calls being obscured by cicada calls ("detectability" hypothesis), by birds avoiding areas with cicadas ("repel" hypothesis), or because bird abundances are generally lower during emergence years for some reason unrelated to the current emergence event ("true decline" hypothesis). We tested these hypotheses by comparing bird detections at stations coincident with calling cicadas vs. those without calling cicadas in the year prior to and during cicada emergences. At four distinct levels (stop, route, range, and season), parallel declines of birds in groups exposed and not exposed to cicada calls supported the true decline hypothesis. We discuss several potential mechanisms for this pattern, including the possibility that it is a consequence of the ecological and evolutionary interactions between predators of this extraordinary group of insects.
Hoogeslag, Roy A G; Brouwer, Reinoud W; Huis In 't Veld, Rianne; Stephen, Joanna M; Amis, Andrew A
2018-02-03
There is a lack of objective evidence investigating how previous non-augmented ACL suture repair techniques and contemporary augmentation techniques in ACL suture repair restrain anterior tibial translation (ATT) across the arc of flexion, and after cyclic loading of the knee. The purpose of this work was to test the null hypotheses that there would be no statistically significant difference in ATT after non-, static- and dynamic-augmented ACL suture repair, and they will not restore ATT to normal values across the arc of flexion of the knee after cyclic loading. Eleven human cadaveric knees were mounted in a test rig, and knee kinematics from 0° to 90° of flexion were recorded by use of an optical tracking system. Measurements were recorded without load and with 89-N tibial anterior force. The knees were tested in the following states: ACL-intact, ACL-deficient, non-augmented suture repair, static tape augmentation and dynamic augmentation after 10 and 300 loading cycles. Only static tape augmentation and dynamic augmentation restored ATT to values similar to the ACL-intact state directly postoperation, and maintained this after cyclic loading. However, contrary to dynamic augmentation, the ATT after static tape augmentation failed to remain statistically less than for the ACL-deficient state after cyclic loading. Moreover, after cyclic loading, ATT was significantly less with dynamic augmentation when compared to static tape augmentation. In contrast to non-augmented ACL suture repair and static tape augmentation, only dynamic augmentation resulted in restoration of ATT values similar to the ACL-intact knee and decreased ATT values when compared to the ACL-deficient knee immediately post-operation and also after cyclic loading, across the arc of flexion, thus allowing the null hypotheses to be rejected. This may assist healing of the ruptured ACL. Therefore, this study would support further clinical evaluation of dynamic augmentation of ACL repair.
Effects of lead and exercise on endurance and learning in young herring gulls.
Burger, Joanna; Gochfeld, Michael
2004-02-01
In this paper, we report the use of young herring gulls, Larus argentatus, to examine the effect of lead and exercise on endurance, performance, and learning on a treadmill. Eighty 1-day-old herring gull chicks were randomly assigned to either a control group or a lead treatment group that received a single dose of lead acetate solution (100mg/kg) at day 2. Controls were injected with an equal volume of isotonic saline at the same age. Half of the lead treatment group and half of the control group were randomly assigned to an exercise regime of walking on a treadmill twice each day. The other group remained in their cages. We test the null hypotheses that neither lead nor exercise affected performance of herring gull chicks when subsequently tested on the treadmill at 7, 11, and 17 days post-injection. Performance measures included latency to orient forward initially, to move continuously, forward on the treadmill, and to avoiding being bumped against the back of the test chamber. Also measured were the number of calls per 15 s, and the time to tire out. Latency to face forward and avoiding being bumped against the back of the test chamber were measures of learning, and time to tire out was a measure of endurance. We found significant differences as a function of lead, exercise, and their interaction, and rejected the null hypotheses. For all measures of behavior and endurance, lead had the greatest contribution to accounting for variability. In general, lead-treated birds showed better performance improvement from the daily exercise than did controlled non-lead birds, with respect to endurance and learning. We suggest that in nature, exercise can improve performance of lead-exposed birds by partially mitigating the effects of lead, thereby increasing survival of lead-impaired chicks.
System and Method for Null-Lens Wavefront Sensing
NASA Technical Reports Server (NTRS)
Hill, Peter C. (Inventor); Thompson, Patrick L. (Inventor); Aronstein, David L. (Inventor); Bolcar, Matthew R. (Inventor); Smith, Jeffrey S. (Inventor)
2015-01-01
A method of measuring aberrations in a null-lens including assembly and alignment aberrations. The null-lens may be used for measuring aberrations in an aspheric optic with the null-lens. Light propagates from the aspheric optic location through the null-lens, while sweeping a detector through the null-lens focal plane. Image data being is collected at locations about said focal plane. Light is simulated propagating to the collection locations for each collected image. Null-lens aberrations may extracted, e.g., applying image-based wavefront-sensing to collected images and simulation results. The null-lens aberrations improve accuracy in measuring aspheric optic aberrations.
Adaptive jammer nulling in EHF communications satellites
NASA Astrophysics Data System (ADS)
Bhagwan, Jai; Kavanagh, Stephen; Yen, J. L.
A preliminary investigation is reviewed concerning adaptive null steering multibeam uplink receiving system concepts for future extremely high frequency communications satellites. Primary alternatives in the design of the uplink antenna, the multibeam adaptive nulling receiver, and the processing algorithm and optimization criterion are discussed. The alternatives are phased array, lens or reflector antennas, nulling at radio frequency or an intermediate frequency, wideband versus narrowband nulling, and various adaptive nulling algorithms. A primary determinant of the hardware complexity is the receiving system architecture, which is described for the alternative antenna and nulling concepts. The final concept chosen will be influenced by the nulling performance requirements, cost, and technological readiness.
Broken chiral symmetry on a null plane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beane, Silas R., E-mail: silas@physics.unh.edu
2013-10-15
On a null-plane (light-front), all effects of spontaneous chiral symmetry breaking are contained in the three Hamiltonians (dynamical Poincaré generators), while the vacuum state is a chiral invariant. This property is used to give a general proof of Goldstone’s theorem on a null-plane. Focusing on null-plane QCD with N degenerate flavors of light quarks, the chiral-symmetry breaking Hamiltonians are obtained, and the role of vacuum condensates is clarified. In particular, the null-plane Gell-Mann–Oakes–Renner formula is derived, and a general prescription is given for mapping all chiral-symmetry breaking QCD condensates to chiral-symmetry conserving null-plane QCD condensates. The utility of the null-planemore » description lies in the operator algebra that mixes the null-plane Hamiltonians and the chiral symmetry charges. It is demonstrated that in a certain non-trivial limit, the null-plane operator algebra reduces to the symmetry group SU(2N) of the constituent quark model. -- Highlights: •A proof (the first) of Goldstone’s theorem on a null-plane is given. •The puzzle of chiral-symmetry breaking condensates on a null-plane is solved. •The emergence of spin-flavor symmetries in null-plane QCD is demonstrated.« less
Memory and other properties of multiple test procedures generated by entangled graphs.
Maurer, Willi; Bretz, Frank
2013-05-10
Methods for addressing multiplicity in clinical trials have attracted much attention during the past 20 years. They include the investigation of new classes of multiple test procedures, such as fixed sequence, fallback and gatekeeping procedures. More recently, sequentially rejective graphical test procedures have been introduced to construct and visualize complex multiple test strategies. These methods propagate the local significance level of a rejected null hypothesis to not-yet rejected hypotheses. In the graph defining the test procedure, hypotheses together with their local significance levels are represented by weighted vertices and the propagation rule by weighted directed edges. An algorithm provides the rules for updating the local significance levels and the transition weights after rejecting an individual hypothesis. These graphical procedures have no memory in the sense that the origin of the propagated significance level is ignored in subsequent iterations. However, in some clinical trial applications, memory is desirable to reflect the underlying dependence structure of the study objectives. In such cases, it would allow the further propagation of significance levels to be dependent on their origin and thus reflect the grouped parent-descendant structures of the hypotheses. We will give examples of such situations and show how to induce memory and other properties by convex combination of several individual graphs. The resulting entangled graphs provide an intuitive way to represent the underlying relative importance relationships between the hypotheses, are as easy to perform as the original individual graphs, remain sequentially rejective and control the familywise error rate in the strong sense. Copyright © 2012 John Wiley & Sons, Ltd.
Fienup, Daniel M; Critchfield, Thomas S
2010-01-01
Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904
A maximum likelihood analysis of the CoGeNT public dataset
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelso, Chris, E-mail: ckelso@unf.edu
The CoGeNT detector, located in the Soudan Underground Laboratory in Northern Minnesota, consists of a 475 grams (fiducial mass of 330 grams) target mass of p-type point contact germanium detector that measures the ionization charge created by nuclear recoils. This detector has searched for recoils created by dark matter since December of 2009. We analyze the public dataset from the CoGeNT experiment to search for evidence of dark matter interactions with the detector. We perform an unbinned maximum likelihood fit to the data and compare the significance of different WIMP hypotheses relative to each other and the null hypothesis ofmore » no WIMP interactions. This work presents the current status of the analysis.« less
A step-up test procedure to find the minimum effective dose.
Wang, Weizhen; Peng, Jianan
2015-01-01
It is of great interest to find the minimum effective dose (MED) in dose-response studies. A sequence of decreasing null hypotheses to find the MED is formulated under the assumption of nondecreasing dose response means. A step-up multiple test procedure that controls the familywise error rate (FWER) is constructed based on the maximum likelihood estimators for the monotone normal means. When the MED is equal to one, the proposed test is uniformly more powerful than Hsu and Berger's test (1999). Also, a simulation study shows a substantial power improvement for the proposed test over four competitors. Three R-codes are provided in Supplemental Materials for this article. Go to the publishers online edition of Journal of Biopharmaceutical Statistics to view the files.
What makes Darwinian hydrology "Darwinian"? Asking a different kind of question about landscapes
NASA Astrophysics Data System (ADS)
Harman, C.; Troch, P. A.
2014-02-01
There have been repeated calls for a Darwinian approach to hydrologic science, or for a synthesis of Darwinian and Newtonian approaches, to deepen understanding of the hydrologic system in the larger landscape context, and so develop a better basis for predictions now and in an uncertain future. But what exactly makes a Darwinian approach to hydrology "Darwinian"? While there have now been a number of discussions of Darwinian approaches, many referencing Harte (2002), the term is potentially a source of confusion because its connections to Darwin remain allusive rather than explicit. Here we suggest that the Darwinian approach to hydrology follows the example of Charles Darwin by focusing attention on the patterns of variation in populations and seeking hypotheses that explain these patterns in terms of the mechanisms and conditions that determine their historical development. These hypotheses do not simply catalog patterns or predict them statistically - they connect the present structure with processes operating in the past. Nor are they explanations presented without independent evidence or critical analysis - Darwin's hypotheses about the mechanisms underlying present-day variation could be independently tested and validated. With a Darwinian framework in mind, it is easy to see that a great deal of hydrologic research has already been done that contributes to a Darwinian hydrology - whether deliberately or not. We discuss some practical and philosophical issues with this approach to hydrologic science: how are explanatory hypotheses generated? What constitutes a good hypothesis? How are hypotheses tested? "Historical" sciences - including paleohydrology - have long grappled with these questions, as must a Darwinian hydrologic science. We can draw on Darwin's own example for some answers, though there are ongoing debates about the philosophical nature of his methods and reasoning. Darwin used a range of methods of historical reasoning to develop explanatory hypotheses: extrapolating mechanisms, space for time substitution, and looking for signatures of history. Some of these are already in use, while others are not and could be used to develop new insights. He sought explanatory hypotheses that intelligibly unified disparate facts, were testable against evidence, and had fertile implications for further research. He provided evidence to support his hypotheses by deducing corollary conditions ("if explanation A is true, then B will also be true") and comparing these to observations. While a synthesis of the Darwinian and Newtonian approaches remains a goal, the Darwinian approach to hydrologic science has significant value of its own. The Darwinian hydrology that has been conducted already has not been coordinated or linked into a general body of theory and knowledge, but the time is coming when this will be possible.
[Dilemma of null hypothesis in ecological hypothesis's experiment test.
Li, Ji
2016-06-01
Experimental test is one of the major test methods of ecological hypothesis, though there are many arguments due to null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis deduction model from Platt (1964) and thus stated that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P)'s non-decisivity inhibit statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H 0 (α=1, β=0) and alternative hypothesis H 1 '(α'=1, β'=0) in ecological progresses are diffe-rent from classic physics, the ecological null hypothesis can neither be strictly tested experimentally. These dilemmas of null hypothesis could be relieved via the reduction of P value, careful selection of null hypothesis, non-centralization of non-null hypothesis, and two-tailed test. However, the statistical null hypothesis significance testing (NHST) should not to be equivalent to the causality logistical test in ecological hypothesis. Hence, the findings and conclusions about methodological studies and experimental tests based on NHST are not always logically reliable.
What story does geographic separation of insular bats tell? A case study on Sardinian rhinolophids.
Russo, Danilo; Di Febbraro, Mirko; Rebelo, Hugo; Mucedda, Mauro; Cistrone, Luca; Agnelli, Paolo; De Pasquale, Pier Paolo; Martinoli, Adriano; Scaravelli, Dino; Spilinga, Cristiano; Bosso, Luciano
2014-01-01
Competition may lead to changes in a species' environmental niche in areas of sympatry and shifts in the niche of weaker competitors to occupy areas where stronger ones are rarer. Although mainland Mediterranean (Rhinolophus euryale) and Mehely's (R. mehelyi) horseshoe bats mitigate competition by habitat partitioning, this may not be true on resource-limited systems such as islands. We hypothesize that Sardinian R. euryale (SAR) have a distinct ecological niche suited to persist in the south of Sardinia where R. mehelyi is rarer. Assuming that SAR originated from other Italian populations (PES)--mostly allopatric with R. mehelyi--once on Sardinia the former may have undergone niche displacement driven by R. mehelyi. Alternatively, its niche could have been inherited from a Maghrebian source population. We: a) generated Maxent Species Distribution Models (SDM) for Sardinian populations; b) calibrated a model with PES occurrences and projected it to Sardinia to see whether PES niche would increase R. euryale's sympatry with R. mehelyi; and c) tested for niche similarity between R. mehelyi and PES, PES and SAR, and R. mehelyi and SAR. Finally we predicted R. euryale's range in Northern Africa both in the present and during the Last Glacial Maximum (LGM) by calibrating SDMs respectively with SAR and PES occurrences and projecting them to the Maghreb. R. mehelyi and PES showed niche similarity potentially leading to competition. According to PES' niche, R. euryale would show a larger sympatry with R. mehelyi on Sardinia than according to SAR niche. Such niches have null similarity. The current and LGM Maghrebian ranges of R. euryale were predicted to be wide according to SAR's niche, negligible according to PES' niche. SAR's niche allows R. euryale to persist where R. mehelyi is rarer and competition probably mild. Possible explanations may be competition-driven niche displacement or Maghrebian origin.
What Story Does Geographic Separation of Insular Bats Tell? A Case Study on Sardinian Rhinolophids
Russo, Danilo; Di Febbraro, Mirko; Rebelo, Hugo; Mucedda, Mauro; Cistrone, Luca; Agnelli, Paolo; De Pasquale, Pier Paolo; Martinoli, Adriano; Scaravelli, Dino; Spilinga, Cristiano; Bosso, Luciano
2014-01-01
Competition may lead to changes in a species’ environmental niche in areas of sympatry and shifts in the niche of weaker competitors to occupy areas where stronger ones are rarer. Although mainland Mediterranean (Rhinolophus euryale) and Mehely’s (R. mehelyi) horseshoe bats mitigate competition by habitat partitioning, this may not be true on resource-limited systems such as islands. We hypothesize that Sardinian R. euryale (SAR) have a distinct ecological niche suited to persist in the south of Sardinia where R. mehelyi is rarer. Assuming that SAR originated from other Italian populations (PES) – mostly allopatric with R. mehelyi – once on Sardinia the former may have undergone niche displacement driven by R. mehelyi. Alternatively, its niche could have been inherited from a Maghrebian source population. We: a) generated Maxent Species Distribution Models (SDM) for Sardinian populations; b) calibrated a model with PES occurrences and projected it to Sardinia to see whether PES niche would increase R. euryale’s sympatry with R. mehelyi; and c) tested for niche similarity between R. mehelyi and PES, PES and SAR, and R. mehelyi and SAR. Finally we predicted R. euryale’s range in Northern Africa both in the present and during the Last Glacial Maximum (LGM) by calibrating SDMs respectively with SAR and PES occurrences and projecting them to the Maghreb. R. mehelyi and PES showed niche similarity potentially leading to competition. According to PES’ niche, R. euryale would show a larger sympatry with R. mehelyi on Sardinia than according to SAR niche. Such niches have null similarity. The current and LGM Maghrebian ranges of R. euryale were predicted to be wide according to SAR’s niche, negligible according to PES’ niche. SAR’s niche allows R. euryale to persist where R. mehelyi is rarer and competition probably mild. Possible explanations may be competition-driven niche displacement or Maghrebian origin. PMID:25340737
Meterwavelength Single-pulse Polarimetric Emission Survey. III. The Phenomenon of Nulling in Pulsars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basu, Rahul; Mitra, Dipanjan; Melikidze, George I., E-mail: rahulbasu.astro@gmail.com
A detailed analysis of nulling was conducted for the pulsars studied in the Meterwavelength Single-pulse Polarimetric Emission Survey. We characterized nulling in 36 pulsars including 17 pulsars where the phenomenon was reported for the first time. The most dominant nulls lasted for a short duration, less than five periods. Longer duration nulls extending to hundreds of periods were also seen in some cases. A careful analysis showed the presence of periodicities in the transition from the null to the burst states in 11 pulsars. In our earlier work, fluctuation spectrum analysis showed multiple periodicities in 6 of these 11 pulsars.more » We demonstrate that the longer periodicity in each case was associated with nulling. The shorter periodicities usually originate from subpulse drifting. The nulling periodicities were more aligned with the periodic amplitude modulation, indicating a possible common origin for both. The most prevalent nulls last for a single period and can be potentially explained using random variations affecting the plasma processes in the pulsar magnetosphere. On the other hand, longer-duration nulls require changes in the pair-production processes, which need an external triggering mechanism for the changes. The presence of periodic nulling puts an added constraint on the triggering mechanism, which also needs to be periodic.« less
MAGNETIC NULL POINTS IN KINETIC SIMULATIONS OF SPACE PLASMAS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olshevsky, Vyacheslav; Innocenti, Maria Elena; Cazzola, Emanuele
2016-03-01
We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic particle-in-cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind, and a relaxing turbulent configuration with multiple null points. Spiral nulls are more likely created in space plasmas: in all our simulations except lunar magnetic anomaly (LMA) and quadrupolar mini-magnetosphere the number of spiral nulls prevails over the number of radial nulls by a factor of 3–9.more » We show that often magnetic nulls do not indicate the regions of intensive energy dissipation. Energy dissipation events caused by topological bifurcations at radial nulls are rather rare and short-lived. The so-called X-lines formed by the radial nulls in the Harris current sheet and LMA simulations are rather stable and do not exhibit any energy dissipation. Energy dissipation is more powerful in the vicinity of spiral nulls enclosed by magnetic flux ropes with strong currents at their axes (their cross sections resemble 2D magnetic islands). These null lines reminiscent of Z-pinches efficiently dissipate magnetic energy due to secondary instabilities such as the two-stream or kinking instability, accompanied by changes in magnetic topology. Current enhancements accompanied by spiral nulls may signal magnetic energy conversion sites in the observational data.« less
Rothman, N; Shields, P G; Poirier, M C; Harrington, A M; Ford, D P; Strickland, P T
1995-09-01
Carcinogenic polycyclic aromatic hydrocarbons (PAHs) form DNA adducts via a complex metabolic activation pathway that includes cytochrome P450 (CYP) 1A1, whereas intermediate metabolites can be detoxified by conjugation through pathways including glutathione s-transferase M1 (GSTM1). PAH-DNA adducts can be measured in peripheral white blood cells (WBCs) and should reflect the net effect of competing activation and detoxification pathways and DNA repair as well as exposure. We have previously shown that WBC PAH-DNA adducts measured by an enzyme-linked immunosorbent assay (ELISA) were associated with recent, frequent consumption of charbroiled food among 47 nonsmoking wildland fire-fighters who provided two blood samples 8 wk apart. In the investigation reported here, which was performed in the same population, we measured the association between the GSTM1 null genotype, which results in loss of enzyme activity, and PAH-DNA adduct levels, hypothesizing that subjects with this genotype would have higher levels of DNA adducts because of their decreased ability to detoxify PAH metabolites. However, PAH-DNA adduct levels were nonsignificantly lower in subjects with the GSTM1 null genotype (n = 28) compared with other subjects (n = 19) (median 0.04 fmol/microgram DNA vs 0.07 fmol/microgram DNA, respectively, P = 0.45, Wilcoxon rank-sum test). Adduct levels were also lower in the nine subjects heterozygous or homozygous for the CYP1A1 exon 7 polymorphism (which codes for a valine rather than isoleucine and is thought to be associated with greater CYP1A1 activity) compared with the 38 wild-type subjects (P = 0.12). In the entire group, there was a positive association between consuming charbroiled food and PAH-DNA adduct formation (r = 0.24, P = 0.02, Spearman rank-order correlation). This association was weaker in the subgroup of subjects with the GSTM1 null genotype (r = 0.03, P = 0.84) and stronger among the remaining subjects (r = 0.57, P = 0.0002). These results suggest that the GSTM1 null genotype and CYP1A1 exon 7 polymorphism are not associated with increased susceptibility for PAH-DNA adduct formation in peripheral WBCs measured by ELISA in nonsmoking populations.
Tolerance analysis of null lenses using an end-use system performance criterion
NASA Astrophysics Data System (ADS)
Rodgers, J. Michael
2000-07-01
An effective method of assigning tolerances to a null lens is to determine the effects of null-lens fabrication and alignment errors on the end-use system itself, not simply the null lens. This paper describes a method to assign null- lens tolerances based on their effect on any performance parameter of the end-use system.
Effects of Layoffs and Plant Closings on Depression Among Older Workers*
Brand, Jennie E.; Levy, Becca R.; Gallo, William T.
2009-01-01
Job displacement is widely considered a negative life event associated with subsequent economic decline and depression as established by numerous prior studies. However, little is known about whether the form of job displacement (i.e. layoffs versus plant closings) differentially affects depression. We assess the effects of different ways in which a worker is displaced on subsequent depression among U.S. men and women nearing retirement. We hypothesize that layoffs should be associated with larger effects on depression than plant closings, particularly among men. Our findings generally support our hypotheses. We find that men have significant increases in depression as a result of layoffs, but not as a result of plant closings, while the reverse is true among women. PMID:20011238
Qu, Wei; Diwan, Bhalchandra A.; Liu, Jie; Goyer, Robert A.; Dawson, Tammy; Horton, John L.; Cherian, M. George; Waalkes, Michael P.
2002-01-01
Susceptibility to lead toxicity in MT-null mice and cells, lacking the major forms of the metallothionein (MT) gene, was compared to wild-type (WT) mice or cells. Male MT-null and WT mice received lead in the drinking water (0 to 4000 ppm) for 10 to 20 weeks. Lead did not alter body weight in any group. Unlike WT mice, lead-treated MT-null mice showed dose-related nephromegaly. In addition, after lead exposure renal function was significantly diminished in MT-null mice in comparison to WT mice. MT-null mice accumulated less renal lead than WT mice and did not form lead inclusion bodies, which were present in the kidneys of WT mice. In gene array analysis, renal glutathione S-transferases were up-regulated after lead in MT-null mice only. In vitro studies on fibroblast cell lines derived from MT-null and WT mice showed that MT-null cells were much more sensitive to lead cytotoxicity. MT-null cells accumulated less lead and formed no inclusion bodies. The MT-null phenotype seems to preclude lead-induced inclusion body formation and increases lead toxicity at the organ and cellular level despite reducing lead accumulation. This study reveals important roles for MT in chronic lead toxicity, lead accumulation, and inclusion body formation. PMID:11891201
Wetland Microtopographic Structure is Revealed with Terrestrial Laser Scanning
NASA Astrophysics Data System (ADS)
Diamond, J.; Stovall, A. E.; Mclaughlin, D. L.; Slesak, R.
2017-12-01
Wetland microtopographic structure and its function has been the subject of research for decades, and several investigations suggest that microtopography is generated by autogenic ecohydrologic processes. But due to the difficulty of capturing the true spatial variability of wetland microtopography, many of the hypotheses for self-organization have remained elusive to test. We employ a novel method of Terrestrial Laser Scanning (TLS) that reveals an unprecedented high-resolution (<0.5 cm) glimpse at the true spatial structure of wetland microtopography in 10 black ash (Fraxinus nigra) stands of northern Minnesota, USA. Here we present the first efforts to synthesize this information and show that TLS provides a good representation of real microtopographic structure, where TLS accurately measured hummock height, but occlusion of low points led to a slight negative bias. We further show that TLS can accurately locate microtopographic high points (hummocks), as well as estimate their height and area. Using these new data, we estimate distributions in both microtopographic elevation and hummock area in each wetland and relate these to monitored hydrologic regime; in doing so, we test hypotheses linking emergent microtopographic patterns to putative hydrologic controls. Finally, we discuss future efforts to enumerate consequent influences of microtopography on wetland systems (soil properties and vegetation composition).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, D.W.
2001-01-17
As a prelude to a basic program on soil leaching, some chemical characteristics of two forested Ultisols in eastern Tennessee and two forested Inceptisols in western Washington are discussed in relation to the production and mobility of anions. These soils were chosen in an attempt to provide a range of free iron (Fe) and aluminum (Al) contents (which are hypothesized to be related to anion adsorption) and carbon:nitrogen (C:N) ratios (which are hypothesized to be related to nitrate and bicarbonate production) for field experiments involving C, N, and anion salt additions. The Washington Inceptisols had high free Fe and Almore » in surface horizons and decreasing free Fe and Al levels with depth, whereas the reverse was true of the Tennessee Ultisols. The alderwood-red alder and Tarklin (sinkhole) soils had higher N concentrations and lower C:N ratios in their surface horizons than the Alderwood-Douglas-fir and Fullerton soils, respectively, but the reverse was true of subsurface horizons. Patterns of and relationships among the above properties and pH, Bray phosphorus (No. 2); adsorbed and soluble SO{sub 4}{sup 2-}, Cl{sup -}, and NO{sub 3}{sup -}; cation exchange capacity; and exchangeable cations are discussed.« less
Adaptive Nulling for the Terrestrial Planet Finder Interferometer
NASA Technical Reports Server (NTRS)
Peters, Robert D.; Lay, Oliver P.; Jeganathan, Muthu; Hirai, Akiko
2006-01-01
A description of adaptive nulling for Terrestrial Planet Finder Interferometer (TPFI) is presented. The topics include: 1) Nulling in TPF-I; 2) Why Do Adaptive Nulling; 3) Parallel High-Order Compensator Design; 4) Phase and Amplitude Control; 5) Development Activates; 6) Requirements; 7) Simplified Experimental Setup; 8) Intensity Correction; and 9) Intensity Dispersion Stability. A short summary is also given on adaptive nulling for the TPFI.
Zhu, Luchang; Lin, Jingjun; Kuang, Zhizhou; Vidal, Jorge E; Lau, Gee W
2015-07-01
The competence regulon of Streptococcus pneumoniae (pneumococcus) is crucial for genetic transformation. During competence development, the alternative sigma factor ComX is activated, which in turn, initiates transcription of 80 'late' competence genes. Interestingly, only 16 late genes are essential for genetic transformation. We hypothesized that these late genes that are dispensable for competence are beneficial to pneumococcal fitness during infection. These late genes were systematically deleted, and the resulting mutants were examined for their fitness during mouse models of bacteremia and acute pneumonia. Among these, 14 late genes were important for fitness in mice. Significantly, deletion of some late genes attenuated pneumococcal fitness to the same level in both wild-type and ComX-null genetic backgrounds, suggesting that the constitutive baseline expression of these genes was important for bacterial fitness. In contrast, some mutants were attenuated only in the wild-type genetic background but not in the ComX-null background, suggesting that specific expression of these genes during competence state contributed to pneumococcal fitness. Increased virulence during competence state was partially caused by the induction of allolytic enzymes that enhanced pneumolysin release. These results distinguish the role of basal expression versus competence induction in virulence functions encoded by ComX-regulated late competence genes. © 2015 John Wiley & Sons Ltd.
Hormone treatment enhances WT1 activation of Renilla luciferase constructs in LNCaP cells.
Hanson, Julie; Reese, Jennifer; Gorman, Jacquelyn; Cash, Jennifer; Fraizer, Gail
2007-01-01
The zinc finger transcription factor, WT1, regulates many growth control genes, repressing or activating transcription depending on the gene and cell type. Based on earlier analyses of the effect of WT1 on androgen responsive genes, we hypothesized that there may be an interaction between the androgen signaling pathway and WT1, such that the commonly used Renilla luciferase control vectors were activated in LNCaP prostate cancer cells. Using cotransfection assays we tested the effects of WT1 and/or the androgen analog, R1881, on two Renilla luciferase vectors, pRL-SV40 and the promoter-less pRL-null. To determine whether the zinc finger DNA binding domain was required, the zinc finger mutant DDS-WT1 (R394W) was tested; but it had no significant effect on the Renilla luciferase vectors. To determine whether the androgen signaling pathway was required, WT1 was co-transfected with Renilla vectors in cells with varied hormone responsiveness. The WT1 effect on pRL-null varied from no significant effect in 293 and PC3 cells to very strong enhancement in LNCaP cells treated with 5 nM R1881. Overall, these results suggest that hormone enhanced WT1 mediated activation of Renilla luciferase and that these interactions require an intact WT1 zinc finger DNA binding domain.
Permutation entropy of finite-length white-noise time series.
Little, Douglas J; Kane, Deb M
2016-08-01
Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ^{2} distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
Theoretical rationale for music selection in oncology intervention research: an integrative review.
Burns, Debra S
2012-01-01
Music-based interventions have helped patients with cancer improve their quality of life, decrease treatment related distress, and manage pain. However, quantitative findings from music intervention studies are inconsistent. The purpose of this review was to explore the theoretical underpinnings for the selection of the music stimuli used to influence targeted outcomes. It was hypothesized that disparate findings were due in part to the atheoretical nature of music selection and the resulting diversity in music stimuli between and within studies. A systematic research synthesis including a comprehensive database and reference list search resulted in 22 studies. Included studies were compiled into two tables cataloging intervention theory, intervention content, and outcomes. A majority of studies did not provide a rationale or intervention theory for the delivery of music or choice of outcomes. Recorded music was the most common delivery method, but the specific music was rarely included within the report. Only two studies that included a theoretical framework reported null results on at least some of the outcomes. Null results are partially explained by an incomplete or mismatch in intervention theory and music selection and delivery. While the inclusion of an intervention theory does not guarantee positive results, including a theoretical rationale for the use of music, particular therapeutic processes or mechanisms, and the specifics of how music is selected and delivered increases scientific rigor and the probability of clinical translation.
Soldánová, Miroslava; Kuris, Armand M.; Scholz, Tomáš; Lafferty, Kevin D.
2012-01-01
We assessed how spatial and temporal heterogeneity and competition structure larval trematode communities in the pulmonate snail Lymnaea stagnalis. To postulate a dominance hierarchy, mark-release-recapture was used to monitor replacements of trematode species within snails over time. In addition, we sampled the trematode community in snails in different ponds in 3 consecutive years. A total of 7,623 snails (10,382 capture events) was sampled in 7 fishponds in the Jindřichův Hradec and Třeboň areas in South Bohemia (Czech Republic) from August 2006 to October 2008. Overall, 39% of snails were infected by a community of 14 trematode species; 7% of snails were infected with more than 1 trematode species (constituting 16 double- and 4 triple-species combinations). Results of the null-model analyses suggested that spatial heterogeneity in recruitment among ponds isolated trematode species from each other, whereas seasonal pulses in recruitment increased species interactions in some ponds. Competitive exclusion among trematodes led to a rarity of multiple infections compared to null-model expectations. Competitive relationships among trematode species were hypothesized as a dominance hierarchy based on direct evidence of replacement and invasion and on indirect evidence. Seven top dominant species with putatively similar competitive abilities (6 rediae and 1 sporocyst species) reduced the prevalence of the other trematode species developing in sporocysts only.
Implosive Collapse about Magnetic Null Points: A Quantitative Comparison between 2D and 3D Nulls
NASA Astrophysics Data System (ADS)
Thurgood, Jonathan O.; Pontin, David I.; McLaughlin, James A.
2018-03-01
Null collapse is an implosive process whereby MHD waves focus their energy in the vicinity of a null point, forming a current sheet and initiating magnetic reconnection. We consider, for the first time, the case of collapsing 3D magnetic null points in nonlinear, resistive MHD using numerical simulation, exploring key physical aspects of the system as well as performing a detailed parameter study. We find that within a particular plane containing the 3D null, the plasma and current density enhancements resulting from the collapse are quantitatively and qualitatively as per the 2D case in both the linear and nonlinear collapse regimes. However, the scaling with resistivity of the 3D reconnection rate—which is a global quantity—is found to be less favorable when the magnetic null point is more rotationally symmetric, due to the action of increased magnetic back-pressure. Furthermore, we find that, with increasing ambient plasma pressure, the collapse can be throttled, as is the case for 2D nulls. We discuss this pressure-limiting in the context of fast reconnection in the solar atmosphere and suggest mechanisms by which it may be overcome. We also discuss the implications of the results in the context of null collapse as a trigger mechanism of Oscillatory Reconnection, a time-dependent reconnection mechanism, and also within the wider subject of wave–null point interactions. We conclude that, in general, increasingly rotationally asymmetric nulls will be more favorable in terms of magnetic energy release via null collapse than their more symmetric counterparts.
Jain, Shikha; Shetty, K Sadashiva; Jain, Shweta; Jain, Sachin; Prakash, A T; Agrawal, Mamta
2015-07-01
To assess the null hypothesis that there is no difference in the rate of dental development and the occurrence of selected developmental anomalies related to shape, number, structure, and position of teeth between subjects with impacted mandibular canines and those with normally erupted canines. Pretreatment records of 42 subjects diagnosed with mandibular canines impaction (impaction group: IG) were compared with those of 84 subjects serving as a control reference sample (control group: CG). Independent t-tests were used to compare mean dental ages between the groups. Intergroup differences in distribution of subjects based on the rate of dental development and occurrence of selected dental anomalies were assessed using χ(2) tests. Odds of late, normal, and early developers and various categories of developmental anomalies between the IG and the CG were evaluated in terms of odds ratios. Mean dental age for the IG was lower than that for the CG in general. Specifically, this was true for girls (P < .05). Differences in the distribution of the subjects based on the rate of dental development and occurrence of positional anomalies also reached statistical significance (P < .05). The IG showed a higher frequency of late developers and positional anomalies compared with controls (odds ratios 3.00 and 2.82, respectively; P < .05). The null hypothesis was rejected. We identified close association of female subjects in the IG with retarded dental development compared with the female orthodontic patients. Increased frequency of positional developmental anomalies was also remarkable in the IG.
Magliocca, Nicholas R; Brown, Daniel G; Ellis, Erle C
2014-01-01
Local changes in land use result from the decisions and actions of land-users within land systems, which are structured by local and global environmental, economic, political, and cultural contexts. Such cross-scale causation presents a major challenge for developing a general understanding of how local decision-making shapes land-use changes at the global scale. This paper implements a generalized agent-based model (ABM) as a virtual laboratory to explore how global and local processes influence the land-use and livelihood decisions of local land-users, operationalized as settlement-level agents, across the landscapes of six real-world test sites. Test sites were chosen in USA, Laos, and China to capture globally-significant variation in population density, market influence, and environmental conditions, with land systems ranging from swidden to commercial agriculture. Publicly available global data were integrated into the ABM to model cross-scale effects of economic globalization on local land-use decisions. A suite of statistics was developed to assess the accuracy of model-predicted land-use outcomes relative to observed and random (i.e. null model) landscapes. At four of six sites, where environmental and demographic forces were important constraints on land-use choices, modeled land-use outcomes were more similar to those observed across sites than the null model. At the two sites in which market forces significantly influenced land-use and livelihood decisions, the model was a poorer predictor of land-use outcomes than the null model. Model successes and failures in simulating real-world land-use patterns enabled the testing of hypotheses on land-use decision-making and yielded insights on the importance of missing mechanisms. The virtual laboratory approach provides a practical framework for systematic improvement of both theory and predictive skill in land change science based on a continual process of experimentation and model enhancement.
Chertemps, Thomas; Younus, Faisal; Steiner, Claudia; Durand, Nicolas; Coppin, Chris W; Pandey, Gunjan; Oakeshott, John G; Maïbèche, Martine
2015-01-01
Reception of odorant molecules within insect olfactory organs involves several sequential steps, including their transport through the sensillar lymph, interaction with the respective sensory receptors, and subsequent inactivation. Odorant-degrading enzymes (ODEs) putatively play a role in signal dynamics by rapid degradation of odorants in the vicinity of the receptors, but this hypothesis is mainly supported by in vitro results. We have recently shown that an extracellular carboxylesterase, esterase-6 (EST-6), is involved in the physiological and behavioral dynamics of the response of Drosophila melanogaster to its volatile pheromone ester, cis-vaccenyl acetate. However, as the expression pattern of the Est-6 gene in the antennae is not restricted to the pheromone responding sensilla, we tested here if EST-6 could play a broader function in the antennae. We found that recombinant EST-6 is able to efficiently hydrolyse several volatile esters that would be emitted by its natural food in vitro. Electrophysiological comparisons of mutant Est-6 null flies and a control strain (on the same genetic background) showed that the dynamics of the antennal response to these compounds is influenced by EST-6, with the antennae of the null mutants showing prolonged activity in response to them. Antennal responses to the strongest odorant, pentyl acetate, were then studied in more detail, showing that the repolarization dynamics were modified even at low doses but without modification of the detection threshold. Behavioral choice experiments with pentyl acetate also showed differences between genotypes; attraction to this compound was observed at a lower dose among the null than control flies. As EST-6 is able to degrade various bioactive odorants emitted by food and plays a role in the response to these compounds, we hypothesize a role as an ODE for this enzyme toward food volatiles.
Prewitt, Allison R.; Ghose, Sampa; Frump, Andrea L.; Datta, Arumima; Austin, Eric D.; Kenworthy, Anne K.; de Caestecker, Mark P.
2015-01-01
Hereditary pulmonary arterial hypertension (HPAH) is a rare, fatal disease of the pulmonary vasculature. The majority of HPAH patients inherit mutations in the bone morphogenetic protein type 2 receptor gene (BMPR2), but how these promote pulmonary vascular disease is unclear. HPAH patients have features of pulmonary endothelial cell (PEC) dysfunction including increased vascular permeability and perivascular inflammation associated with decreased PEC barrier function. Recently, frameshift mutations in the caveolar structural protein gene Caveolin-1 (CAV-1) were identified in two patients with non-BMPR2-associated HPAH. Because caveolae regulate endothelial function and vascular permeability, we hypothesized that defects in caveolar function might be a common mechanism by which BMPR2 mutations promote pulmonary vascular disease. To explore this, we isolated PECs from mice carrying heterozygous null Bmpr2 mutations (Bmpr2+/−) similar to those found in the majority of HPAH patients. We show that Bmpr2+/− PECs have increased numbers and intracellular localization of caveolae and caveolar structural proteins CAV-1 and Cavin-1 and that these defects are reversed after blocking endocytosis with dynasore. SRC kinase is also constitutively activated in Bmpr2+/− PECs, and localization of CAV-1 to the plasma membrane is restored after treating Bmpr2+/− PECs with the SRC kinase inhibitor 3-(4-chlorophenyl)-1-(1,1-dimethylethyl)-1H-pyrazolo[3,4-d]pyrimidin-4-amine (PP2). Late outgrowth endothelial progenitor cells isolated from HPAH patients show similar increased activation of SRC kinase. Moreover, Bmpr2+/− PECs have impaired endothelial barrier function, and barrier function is restored after treatment with PP2. These data suggest that heterozygous null BMPR2 mutations promote SRC-dependent caveolar trafficking defects in PECs and that this may contribute to pulmonary endothelial barrier dysfunction in HPAH patients. PMID:25411245
Magliocca, Nicholas R.; Brown, Daniel G.; Ellis, Erle C.
2014-01-01
Local changes in land use result from the decisions and actions of land-users within land systems, which are structured by local and global environmental, economic, political, and cultural contexts. Such cross-scale causation presents a major challenge for developing a general understanding of how local decision-making shapes land-use changes at the global scale. This paper implements a generalized agent-based model (ABM) as a virtual laboratory to explore how global and local processes influence the land-use and livelihood decisions of local land-users, operationalized as settlement-level agents, across the landscapes of six real-world test sites. Test sites were chosen in USA, Laos, and China to capture globally-significant variation in population density, market influence, and environmental conditions, with land systems ranging from swidden to commercial agriculture. Publicly available global data were integrated into the ABM to model cross-scale effects of economic globalization on local land-use decisions. A suite of statistics was developed to assess the accuracy of model-predicted land-use outcomes relative to observed and random (i.e. null model) landscapes. At four of six sites, where environmental and demographic forces were important constraints on land-use choices, modeled land-use outcomes were more similar to those observed across sites than the null model. At the two sites in which market forces significantly influenced land-use and livelihood decisions, the model was a poorer predictor of land-use outcomes than the null model. Model successes and failures in simulating real-world land-use patterns enabled the testing of hypotheses on land-use decision-making and yielded insights on the importance of missing mechanisms. The virtual laboratory approach provides a practical framework for systematic improvement of both theory and predictive skill in land change science based on a continual process of experimentation and model enhancement. PMID:24489696
A Bayesian Method for Evaluating and Discovering Disease Loci Associations
Jiang, Xia; Barmada, M. Michael; Cooper, Gregory F.; Becich, Michael J.
2011-01-01
Background A genome-wide association study (GWAS) typically involves examining representative SNPs in individuals from some population. A GWAS data set can concern a million SNPs and may soon concern billions. Researchers investigate the association of each SNP individually with a disease, and it is becoming increasingly commonplace to also analyze multi-SNP associations. Techniques for handling so many hypotheses include the Bonferroni correction and recently developed Bayesian methods. These methods can encounter problems. Most importantly, they are not applicable to a complex multi-locus hypothesis which has several competing hypotheses rather than only a null hypothesis. A method that computes the posterior probability of complex hypotheses is a pressing need. Methodology/Findings We introduce the Bayesian network posterior probability (BNPP) method which addresses the difficulties. The method represents the relationship between a disease and SNPs using a directed acyclic graph (DAG) model, and computes the likelihood of such models using a Bayesian network scoring criterion. The posterior probability of a hypothesis is computed based on the likelihoods of all competing hypotheses. The BNPP can not only be used to evaluate a hypothesis that has previously been discovered or suspected, but also to discover new disease loci associations. The results of experiments using simulated and real data sets are presented. Our results concerning simulated data sets indicate that the BNPP exhibits both better evaluation and discovery performance than does a p-value based method. For the real data sets, previous findings in the literature are confirmed and additional findings are found. Conclusions/Significance We conclude that the BNPP resolves a pressing problem by providing a way to compute the posterior probability of complex multi-locus hypotheses. A researcher can use the BNPP to determine the expected utility of investigating a hypothesis further. Furthermore, we conclude that the BNPP is a promising method for discovering disease loci associations. PMID:21853025
Fermilab Education Office - Special Events for Students and Families
students and families. These include: null Fermilab Outdoor Family Fair (K-12) null Wonders of Science (2-7 ) null Family Open House (3-12) null STEM Career Expo (9-12) Search Programs - Search Science Adventures
Pelvic form and locomotor adaptation in strepsirrhine primates.
Lewton, Kristi L
2015-01-01
The pelvic girdle is a complex structure with a critical role in locomotion, but efforts to model the mechanical effects of locomotion on its shape remain difficult. Traditional approaches to understanding form and function include univariate adaptive hypothesis-testing derived from mechanical models. Geometric morphometric (GM) methods can yield novel insight into overall three-dimensional shape similarities and differences across groups, although the utility of GM in assessing functional differences has been questioned. This study evaluates the contributions of both univariate and GM approaches to unraveling the trait-function associations between pelvic form and locomotion. Three-dimensional landmarks were collected on a phylogenetically-broad sample of 180 pelves from nine primate taxa. Euclidean interlandmark distances were calculated to facilitate testing of biomechanical hypotheses, and a principal components (PC) analysis was performed on Procrustes coordinates to examine overall shape differences. Both linear dimensions and PC scores were subjected to phylogenetic ANOVA. Many of the null hypotheses relating linear dimensions to locomotor loading were not rejected. Although both analytical approaches suggest that ilium width and robusticity differ among locomotor groups, the GM analysis also suggests that ischiopubic shape differentiates groups. Although GM provides additional quantitative results beyond the univariate analyses, this study highlights the need for new GM methods to more specifically address functional shape differences among species. Until these methods are developed, it would be prudent to accompany tests of directional biomechanical hypotheses with current GM methods for a more nuanced understanding of shape and function. © 2014 Wiley Periodicals, Inc.
Obi, James; Ibidunni, Ayodotun Stephen; Tolulope, Atolagbe; Olokundun, Maxwell Ayodele; Amaihian, Augusta Bosede; Borishade, Taiye Tairat; Fred, Peter
2018-06-01
The focus of this research was to present a data article on the contribution of SMEs to economic development in a transiting economy. Descriptive research design was adopted in this study. Data were obtained from 600 respondents in 60 small-scale enterprises located in different parts of the country (20 small-scale enterprises located in Lagos State, 20 in Anambra State and 20 in Kano State of Nigeria respectively). Data analysis was carried out using tables and percentages and the null hypotheses of the study was tested using chi-square ( X 2 ) inferential statistical model at 5% level of significance. The findings revealed that there is a significant relationship between the operation of small and medium-scale enterprises and economic growth in developing nations.
On the Interpretation and Use of Mediation: Multiple Perspectives on Mediation Analysis.
Agler, Robert; De Boeck, Paul
2017-01-01
Mediation analysis has become a very popular approach in psychology, and it is one that is associated with multiple perspectives that are often at odds, often implicitly. Explicitly discussing these perspectives and their motivations, advantages, and disadvantages can help to provide clarity to conversations and research regarding the use and refinement of mediation models. We discuss five such pairs of perspectives on mediation analysis, their associated advantages and disadvantages, and their implications: with vs. without a mediation hypothesis, specific effects vs. a global model, directness vs. indirectness of causation, effect size vs. null hypothesis testing, and hypothesized vs. alternative explanations. Discussion of the perspectives is facilitated by a small simulation study. Some philosophical and linguistic considerations are briefly discussed, as well as some other perspectives we do not develop here.
Brain-derived neurotrophic factor mediates cognitive improvements following acute exercise.
Borror, Andrew
2017-09-01
The mechanisms causing improved cognition following acute exercise are poorly understood. This article proposes that brain-derived neurotrophic factor (BDNF) is the main factor contributing to improved cognition following exercise. Additionally, it argues that cerebral blood flow (CBF) and oxidative stress explain the release of BDNF from cerebral endothelial cells. One way to test these hypotheses is to block endothelial function and measure the effect on BDNF levels and cognitive performance. The CBF and oxidative stress can also be examined in relationship to BDNF using a multiple linear regression. If these hypotheses are true, there would be a linear relationship between CBF+oxidative stress and BDNF levels as well as between BDNF levels and cognitive performance. The novelty of these hypotheses comes from the emphasis on the cerebral endothelium and the interplay between BDNF, CBF, and oxidative stress. If found to be valid, these hypotheses would draw attention to the cerebral endothelium and provide direction for future research regarding methods to optimize BDNF release and enhance cognition. Elucidating these mechanisms would provide direction for expediting recovery in clinical populations, such as stroke, and maintaining quality of life in the elderly. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Gaussian Mixture Model for Nulling Pulsars
NASA Astrophysics Data System (ADS)
Kaplan, D. L.; Swiggum, J. K.; Fichtenbauer, T. D. J.; Vallisneri, M.
2018-03-01
The phenomenon of pulsar nulling—where pulsars occasionally turn off for one or more pulses—provides insight into pulsar-emission mechanisms and the processes by which pulsars turn off when they cross the “death line.” However, while ever more pulsars are found that exhibit nulling behavior, the statistical techniques used to measure nulling are biased, with limited utility and precision. In this paper, we introduce an improved algorithm, based on Gaussian mixture models, for measuring pulsar nulling behavior. We demonstrate this algorithm on a number of pulsars observed as part of a larger sample of nulling pulsars, and show that it performs considerably better than existing techniques, yielding better precision and no bias. We further validate our algorithm on simulated data. Our algorithm is widely applicable to a large number of pulsars even if they do not show obvious nulls. Moreover, it can be used to derive nulling probabilities of nulling for individual pulses, which can be used for in-depth studies.
Modular Hamiltonians on the null plane and the Markov property of the vacuum state
NASA Astrophysics Data System (ADS)
Casini, Horacio; Testé, Eduardo; Torroba, Gonzalo
2017-09-01
We compute the modular Hamiltonians of regions having the future horizon lying on a null plane. For a CFT this is equivalent to regions with a boundary of arbitrary shape lying on the null cone. These Hamiltonians have a local expression on the horizon formed by integrals of the stress tensor. We prove this result in two different ways, and show that the modular Hamiltonians of these regions form an infinite dimensional Lie algebra. The corresponding group of unitary transformations moves the fields on the null surface locally along the null generators with arbitrary null line dependent velocities, but act non-locally outside the null plane. We regain this result in greater generality using more abstract tools on the algebraic quantum field theory. Finally, we show that modular Hamiltonians on the null surface satisfy a Markov property that leads to the saturation of the strong sub-additive inequality for the entropies and to the strong super-additivity of the relative entropy.
Error analysis and system optimization of non-null aspheric testing system
NASA Astrophysics Data System (ADS)
Luo, Yongjie; Yang, Yongying; Liu, Dong; Tian, Chao; Zhuo, Yongmo
2010-10-01
A non-null aspheric testing system, which employs partial null lens (PNL for short) and reverse iterative optimization reconstruction (ROR for short) technique, is proposed in this paper. Based on system modeling in ray tracing software, the parameter of each optical element is optimized and this makes system modeling more precise. Systematic error of non-null aspheric testing system is analyzed and can be categorized into two types, the error due to surface parameters of PNL in the system modeling and the rest from non-null interferometer by the approach of error storage subtraction. Experimental results show that, after systematic error is removed from testing result of non-null aspheric testing system, the aspheric surface is precisely reconstructed by ROR technique and the consideration of systematic error greatly increase the test accuracy of non-null aspheric testing system.
Li, Hongkai; Yuan, Zhongshang; Ji, Jiadong; Xu, Jing; Zhang, Tao; Zhang, Xiaoshuai; Xue, Fuzhong
2016-03-09
We propose a novel Markov Blanket-based repeated-fishing strategy (MBRFS) in attempt to increase the power of existing Markov Blanket method (DASSO-MB) and maintain its advantages in omic data analysis. Both simulation and real data analysis were conducted to assess its performances by comparing with other methods including χ(2) test with Bonferroni and B-H adjustment, least absolute shrinkage and selection operator (LASSO) and DASSO-MB. A serious of simulation studies showed that the true discovery rate (TDR) of proposed MBRFS was always close to zero under null hypothesis (odds ratio = 1 for each SNPs) with excellent stability in all three scenarios of independent phenotype-related SNPs without linkage disequilibrium (LD) around them, correlated phenotype-related SNPs without LD around them, and phenotype-related SNPs with strong LD around them. As expected, under different odds ratio and minor allel frequency (MAFs), MBRFS always had the best performances in capturing the true phenotype-related biomarkers with higher matthews correlation coefficience (MCC) for all three scenarios above. More importantly, since proposed MBRFS using the repeated fishing strategy, it still captures more phenotype-related SNPs with minor effects when non-significant phenotype-related SNPs emerged under χ(2) test after Bonferroni multiple correction. The various real omics data analysis, including GWAS data, DNA methylation data, gene expression data and metabolites data, indicated that the proposed MBRFS always detected relatively reasonable biomarkers. Our proposed MBRFS can exactly capture the true phenotype-related biomarkers with the reduction of false negative rate when the phenotype-related biomarkers are independent or correlated, as well as the circumstance that phenotype-related biomarkers are associated with non-phenotype-related ones.
Role of Plasmodium vivax Duffy-binding protein 1 in invasion of Duffy-null Africans
Gunalan, Karthigayan; Lo, Eugenia; Hostetler, Jessica B.; Yewhalaw, Delenasaw; Mu, Jianbing; Neafsey, Daniel E.; Yan, Guiyun; Miller, Louis H.
2016-01-01
The ability of the malaria parasite Plasmodium vivax to invade erythrocytes is dependent on the expression of the Duffy blood group antigen on erythrocytes. Consequently, Africans who are null for the Duffy antigen are not susceptible to P. vivax infections. Recently, P. vivax infections in Duffy-null Africans have been documented, raising the possibility that P. vivax, a virulent pathogen in other parts of the world, may expand malarial disease in Africa. P. vivax binds the Duffy blood group antigen through its Duffy-binding protein 1 (DBP1). To determine if mutations in DBP1 resulted in the ability of P. vivax to bind Duffy-null erythrocytes, we analyzed P. vivax parasites obtained from two Duffy-null individuals living in Ethiopia where Duffy-null and -positive Africans live side-by-side. We determined that, although the DBP1s from these parasites contained unique sequences, they failed to bind Duffy-null erythrocytes, indicating that mutations in DBP1 did not account for the ability of P. vivax to infect Duffy-null Africans. However, an unusual DNA expansion of DBP1 (three and eight copies) in the two Duffy-null P. vivax infections suggests that an expansion of DBP1 may have been selected to allow low-affinity binding to another receptor on Duffy-null erythrocytes. Indeed, we show that Salvador (Sal) I P. vivax infects Squirrel monkeys independently of DBP1 binding to Squirrel monkey erythrocytes. We conclude that P. vivax Sal I and perhaps P. vivax in Duffy-null patients may have adapted to use new ligand–receptor pairs for invasion. PMID:27190089
Reciprocity of weighted networks
Squartini, Tiziano; Picciolo, Francesco; Ruzzenenti, Franco; Garlaschelli, Diego
2013-01-01
In directed networks, reciprocal links have dramatic effects on dynamical processes, network growth, and higher-order structures such as motifs and communities. While the reciprocity of binary networks has been extensively studied, that of weighted networks is still poorly understood, implying an ever-increasing gap between the availability of weighted network data and our understanding of their dyadic properties. Here we introduce a general approach to the reciprocity of weighted networks, and define quantities and null models that consistently capture empirical reciprocity patterns at different structural levels. We show that, counter-intuitively, previous reciprocity measures based on the similarity of mutual weights are uninformative. By contrast, our measures allow to consistently classify different weighted networks according to their reciprocity, track the evolution of a network's reciprocity over time, identify patterns at the level of dyads and vertices, and distinguish the effects of flux (im)balances or other (a)symmetries from a true tendency towards (anti-)reciprocation. PMID:24056721
Reciprocity of weighted networks.
Squartini, Tiziano; Picciolo, Francesco; Ruzzenenti, Franco; Garlaschelli, Diego
2013-01-01
In directed networks, reciprocal links have dramatic effects on dynamical processes, network growth, and higher-order structures such as motifs and communities. While the reciprocity of binary networks has been extensively studied, that of weighted networks is still poorly understood, implying an ever-increasing gap between the availability of weighted network data and our understanding of their dyadic properties. Here we introduce a general approach to the reciprocity of weighted networks, and define quantities and null models that consistently capture empirical reciprocity patterns at different structural levels. We show that, counter-intuitively, previous reciprocity measures based on the similarity of mutual weights are uninformative. By contrast, our measures allow to consistently classify different weighted networks according to their reciprocity, track the evolution of a network's reciprocity over time, identify patterns at the level of dyads and vertices, and distinguish the effects of flux (im)balances or other (a)symmetries from a true tendency towards (anti-)reciprocation.
Stegen, James C.
2018-04-10
To improve predictions of ecosystem function in future environments, we need to integrate the ecological and environmental histories experienced by microbial communities with hydrobiogeochemistry across scales. A key issue is whether we can derive generalizable scaling relationships that describe this multiscale integration. There is a strong foundation for addressing these challenges. We have the ability to infer ecological history with null models and reveal impacts of environmental history through laboratory and field experimentation. Recent developments also provide opportunities to inform ecosystem models with targeted omics data. A major next step is coupling knowledge derived from such studies with multiscale modelingmore » frameworks that are predictive under non-steady-state conditions. This is particularly true for systems spanning dynamic interfaces, which are often hot spots of hydrobiogeochemical function. Here, we can advance predictive capabilities through a holistic perspective focused on the nexus of history, ecology, and hydrobiogeochemistry.« less
Job Strain and Ambulatory Blood Pressure: A Meta-Analysis and Systematic Review
Dobson, Marnie; Koutsouras, George; Schnall, Peter
2013-01-01
We reviewed evidence of the relationship between job strain and ambulatory blood pressure (ABP) in 29 studies (1985–2012). We conducted a quantitative meta-analysis on 22 cross-sectional studies of a single exposure to job strain. We systematically reviewed 1 case–control study, 3 studies of cumulative exposure to job strain, and 3 longitudinal studies. Single exposure to job strain in cross-sectional studies was associated with higher work systolic and diastolic ABP. Associations were stronger in men than women and in studies of broad-based populations than those with limited occupational variance. Biases toward the null were common, suggesting that our summary results underestimated the true association. Job strain is a risk factor for blood pressure elevation. Workplace surveillance programs are needed to assess the prevalence of job strain and high ABP and to facilitate workplace cardiovascular risk reduction interventions. PMID:23327240
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stegen, James C.
To improve predictions of ecosystem function in future environments, we need to integrate the ecological and environmental histories experienced by microbial communities with hydrobiogeochemistry across scales. A key issue is whether we can derive generalizable scaling relationships that describe this multiscale integration. There is a strong foundation for addressing these challenges. We have the ability to infer ecological history with null models and reveal impacts of environmental history through laboratory and field experimentation. Recent developments also provide opportunities to inform ecosystem models with targeted omics data. A major next step is coupling knowledge derived from such studies with multiscale modelingmore » frameworks that are predictive under non-steady-state conditions. This is particularly true for systems spanning dynamic interfaces, which are often hot spots of hydrobiogeochemical function. Here, we can advance predictive capabilities through a holistic perspective focused on the nexus of history, ecology, and hydrobiogeochemistry.« less
Stegen, James C
2018-01-01
To improve predictions of ecosystem function in future environments, we need to integrate the ecological and environmental histories experienced by microbial communities with hydrobiogeochemistry across scales. A key issue is whether we can derive generalizable scaling relationships that describe this multiscale integration. There is a strong foundation for addressing these challenges. We have the ability to infer ecological history with null models and reveal impacts of environmental history through laboratory and field experimentation. Recent developments also provide opportunities to inform ecosystem models with targeted omics data. A major next step is coupling knowledge derived from such studies with multiscale modeling frameworks that are predictive under non-steady-state conditions. This is particularly true for systems spanning dynamic interfaces, which are often hot spots of hydrobiogeochemical function. We can advance predictive capabilities through a holistic perspective focused on the nexus of history, ecology, and hydrobiogeochemistry.
Toroidally symmetric plasma vortex at tokamak divertor null point
Umansky, M. V.; Ryutov, D. D.
2016-03-09
Reduced MHD equations are used for studying toroidally symmetric plasma dynamics near the divertor null point. Numerical solution of these equations exhibits a plasma vortex localized at the null point with the time-evolution defined by interplay of the curvature drive, magnetic restoring force, and dissipation. Convective motion is easier to achieve for a second-order null (snowflake) divertor than for a regular x-point configuration, and the size of the convection zone in a snowflake configuration grows with plasma pressure at the null point. In conclusion, the trends in simulations are consistent with tokamak experiments which indicate the presence of enhanced transportmore » at the null point.« less
Off-Axis Nulling Transfer Function Measurement: A First Assessment
NASA Technical Reports Server (NTRS)
Vedova, G. Dalla; Menut, J.-L.; Millour, F.; Petrov, R.; Cassaing, F.; Danchi, W. C.; Jacquinod, S.; Lhome, E.; Lopez, B.; Lozi, J.;
2013-01-01
We want to study a polychromatic inverse problem method with nulling interferometers to obtain information on the structures of the exozodiacal light. For this reason, during the first semester of 2013, thanks to the support of the consortium PERSEE, we launched a campaign of laboratory measurements with the nulling interferometric test bench PERSEE, operating with 9 spectral channels between J and K bands. Our objective is to characterise the transfer function, i.e. the map of the null as a function of wavelength for an off-axis source, the null being optimised on the central source or on the source photocenter. We were able to reach on-axis null depths better than 10(exp -4). This work is part of a broader project aiming at creating a simulator of a nulling interferometer in which typical noises of a real instrument are introduced. We present here our first results.
IN VITRO INTERACTIONS BETWEEN LACTIC ACID SOLUTION AND ART GLASS-IONOMER CEMENTS
Wang, Linda; Cefaly, Daniela Francisca Gigo; dos Santos, Janaína Lima; dos Santos, Jean Rodrigo; Lauris, José Roberto Pereira; Mondelli, Rafael Francisco Lia; Atta, Maria Teresa
2009-01-01
Objectives: Production of acids such as lactic acid contributes to establish a cariogenic environment that leads to dental substrate demineralization. Fluoride plays an important role in this case and, as fluoride-releasing materials, glass-ionomer cements are expected to contribute to minimize deleterious reactions. This study evaluated interactions of glass-ionomer cements used in atraumatic restorative treatment (ART-GICs) with an aqueous lactic acid solution, testing the null hypotheses that no changes occur in the pH of the solution or on the surface roughness and mass of the ART-GICs when exposed to lactic acid solution over a 6-week period. Material and Methods: Ketac Molar, Fuji IX, Vitro Molar and Magic Glass were tested, and compared to Filtek Z250 and Ketac Fil Plus as control groups. Six specimens of each material were made according to manufacturers' instructions. The pH of the solution and roughness and mass changes of each specimen were determined over 6 weeks. Each specimen was individually stored in 2 mL of 0.02 M lactic acid solution for 1 week, renewing the solution every week. pH of solution and mass of the specimens were monitored weekly, and surface roughness of the specimens was assessed before and at the end of the 6-week acid challenge. pH and mass data were analyzed statistically by repeated measures using one-way ANOVA and Tukey's post-hoc tests for each material. Paired t-tests were used for roughness analysis. Tukey's post-hoc tests were applied to verify differences of final roughness among the materials. Significance level was set at 5%. Results: The null hypotheses were partially rejected. All materials were able to increase the pH of the lactic acid solution and presented rougher surfaces after immersion, while mass change was minimal and generally not statistically significant. Conclusions: These findings can be helpful to predict the performance of these materials under clinical conditions. A protective action against the carious process with significant surface damage due to erosion may be expected. PMID:19668984
Predictors of scientific understanding of middle school students
NASA Astrophysics Data System (ADS)
Strate, Joshua Matthew
The purpose of this study was to determine if middle school student scientific understanding could be predicted by the variables: standardized 5th grade score in science, standardized 5th grade score in mathematics, standardized 5th grade score in reading, student attitude towards science, socioeconomic status, gender, and ethnicity. The areas of the comprehensive literature review were trends in science learning and teaching, research in the K-12 science education arena, what factors have influenced K-12 science education, scientific understanding, what research has been done on K-12 scientific understanding, and what factors have influenced science understanding in the K-12 arenas. Based on the results of the literature review, the researcher of this study examined a sample of middle school 8th grade students. An Attitude Towards Science Survey (SATS) Simpson & Oliver (1990) and a Survey of Scientific Understandings (Klapper, DeLucia, & Trent, 1993) were administered to these 116 middle school 8th grade students drawn from a total population of 1109 who attend this middle school in a typical county in Florida during the 2010- 2011 school year. Multiple linear regression analysis was used to test each sub-hypothesis and to provide a model that attempted to predict student scientific understanding. Seven null sub-hypotheses were formed to determine if there were significant relationships between student scientific understanding and the abovementioned variables. The results of the tests of the seven null sub-hypotheses showed that the sub-hypothesis that involved socioeconomic status was rejected, which indicated that the socioeconomic status of a family does influence the level of scientific understanding of a student. Low SES students performed lower on the scientific understanding survey, on average, than high SES students. This study can be a source of information for teachers in low-income schools by recognizing potential areas of concern for low-income students in their science classrooms. The study is also a guide for administrators in developing science curriculum that is designed to remediate critical science content. Recommendations, further research, and implications for stakeholders in the science education process are then identified in order to focus on the concerns that these stakeholders need to address through a needs assessment.
Loss of Vitamin D Receptor Produces Polyuria by Increasing Thirst
Kong, Juan; Zhang, Zhongyi; Li, Dongdong; Wong, Kari E.; Zhang, Yan; Szeto, Frances L.; Musch, Mark W.; Li, Yan Chun
2008-01-01
Vitamin D receptor (VDR)-null mice develop polyuria, but the underlying mechanism remains unknown. In this study, we investigated the relationship between vitamin D and homeostasis of water and electrolytes. VDR-null mice had polyuria, but the urine osmolarity was normal as a result of high salt excretion. The urinary responses to water restriction and to vasopressin were similar between wild-type and VDR-null mice, suggesting intact fluid-handling capacity in VDR-null mice. Compared with wild-type mice, however, renin and angiotensin II were dramatically upregulated in the kidney and brain of VDR-null mice, leading to a marked increase in water intake and salt appetite. Angiotensin II–mediated upregulation of intestinal NHE3 expression partially explained the increased salt absorption and excretion in VDR-null mice. In the brain of VDR-null mice, expression of c-Fos, which is known to associate with increased water intake, was increased in the hypothalamic paraventricular nucleus and the subfornical organ. Treatment with an angiotensin II type 1 receptor antagonist normalized water intake, urinary volume, and c-Fos expression in VDR-null mice. Furthermore, despite a salt-deficient diet to reduce intestinal salt absorption, VDR-null mice still maintained the increased water intake and urinary output. Together, these data indicate that the polyuria observed in VDR-null mice is not caused by impaired renal fluid handling or increased intestinal salt absorption but rather is the result of increased water intake induced by the increase in systemic and brain angiotensin II. PMID:18832438
Loss of vitamin D receptor produces polyuria by increasing thirst.
Kong, Juan; Zhang, Zhongyi; Li, Dongdong; Wong, Kari E; Zhang, Yan; Szeto, Frances L; Musch, Mark W; Li, Yan Chun
2008-12-01
Vitamin D receptor (VDR)-null mice develop polyuria, but the underlying mechanism remains unknown. In this study, we investigated the relationship between vitamin D and homeostasis of water and electrolytes. VDR-null mice had polyuria, but the urine osmolarity was normal as a result of high salt excretion. The urinary responses to water restriction and to vasopressin were similar between wild-type and VDR-null mice, suggesting intact fluid-handling capacity in VDR-null mice. Compared with wild-type mice, however, renin and angiotensin II were dramatically upregulated in the kidney and brain of VDR-null mice, leading to a marked increase in water intake and salt appetite. Angiotensin II-mediated upregulation of intestinal NHE3 expression partially explained the increased salt absorption and excretion in VDR-null mice. In the brain of VDR-null mice, expression of c-Fos, which is known to associate with increased water intake, was increased in the hypothalamic paraventricular nucleus and the subfornical organ. Treatment with an angiotensin II type 1 receptor antagonist normalized water intake, urinary volume, and c-Fos expression in VDR-null mice. Furthermore, despite a salt-deficient diet to reduce intestinal salt absorption, VDR-null mice still maintained the increased water intake and urinary output. Together, these data indicate that the polyuria observed in VDR-null mice is not caused by impaired renal fluid handling or increased intestinal salt absorption but rather is the result of increased water intake induced by the increase in systemic and brain angiotensin II.
Abnormal Mammary Development in 129:STAT1-Null Mice is Stroma-Dependent
Cardiff, Robert D.; Trott, Josephine F.; Hovey, Russell C.; Hubbard, Neil E.; Engelberg, Jesse A.; Tepper, Clifford G.; Willis, Brandon J.; Khan, Imran H.; Ravindran, Resmi K.; Chan, Szeman R.; Schreiber, Robert D.; Borowsky, Alexander D.
2015-01-01
Female 129:Stat1-null mice (129S6/SvEvTac-Stat1tm1Rds homozygous) uniquely develop estrogen-receptor (ER)-positive mammary tumors. Herein we report that the mammary glands (MG) of these mice have altered growth and development with abnormal terminal end buds alongside defective branching morphogenesis and ductal elongation. We also find that the 129:Stat1-null mammary fat pad (MFP) fails to sustain the growth of 129S6/SvEv wild-type and Stat1-null epithelium. These abnormalities are partially reversed by elevated serum progesterone and prolactin whereas transplantation of wild-type bone marrow into 129:Stat1-null mice does not reverse the MG developmental defects. Medium conditioned by 129:Stat1-null epithelium-cleared MFP does not stimulate epithelial proliferation, whereas it is stimulated by medium conditioned by epithelium-cleared MFP from either wild-type or 129:Stat1-null females having elevated progesterone and prolactin. Microarrays and multiplexed cytokine assays reveal that the MG of 129:Stat1-null mice has lower levels of growth factors that have been implicated in normal MG growth and development. Transplanted 129:Stat1-null tumors and their isolated cells also grow slower in 129:Stat1-null MG compared to wild-type recipient MG. These studies demonstrate that growth of normal and neoplastic 129:Stat1-null epithelium is dependent on the hormonal milieu and on factors from the mammary stroma such as cytokines. While the individual or combined effects of these factors remains to be resolved, our data supports the role of STAT1 in maintaining a tumor-suppressive MG microenvironment. PMID:26075897
Tennese, Alysa A; Wevrick, Rachel
2011-03-01
Hypothalamic dysfunction may underlie endocrine abnormalities in Prader-Willi syndrome (PWS), a genetic disorder that features GH deficiency, obesity, and infertility. One of the genes typically inactivated in PWS, MAGEL2, is highly expressed in the hypothalamus. Mice deficient for Magel2 are obese with increased fat mass and decreased lean mass and have blunted circadian rhythm. Here, we demonstrate that Magel2-null mice have abnormalities of hypothalamic endocrine axes that recapitulate phenotypes in PWS. Magel2-null mice had elevated basal corticosterone levels, and although male Magel2-null mice had an intact corticosterone response to restraint and to insulin-induced hypoglycemia, female Magel2-null mice failed to respond to hypoglycemia with increased corticosterone. After insulin-induced hypoglycemia, Magel2-null mice of both sexes became more profoundly hypoglycemic, and female mice were slower to recover euglycemia, suggesting an impaired hypothalamic counterregulatory response. GH insufficiency can produce abnormal body composition, such as that seen in PWS and in Magel2-null mice. Male Magel2-null mice had Igf-I levels similar to control littermates. Female Magel2-null mice had low Igf-I levels and reduced GH release in response to stimulation with ghrelin. Female Magel2-null mice did respond to GHRH, suggesting that their GH deficiency has a hypothalamic rather than pituitary origin. Female Magel2-null mice also had higher serum adiponectin than expected, considering their increased fat mass, and thyroid (T(4)) levels were low. Together, these findings strongly suggest that loss of MAGEL2 contributes to endocrine dysfunction of hypothalamic origin in individuals with PWS.
Tsuchiya, Hiroyuki; da Costa, Kerry-Ann; Lee, Sangmin; Renga, Barbara; Jaeschke, Hartmut; Yang, Zhihong; Orena, Stephen J; Goedken, Michael J; Zhang, Yuxia; Kong, Bo; Lebofsky, Margitta; Rudraiah, Swetha; Smalling, Rana; Guo, Grace; Fiorucci, Stefano; Zeisel, Steven H; Wang, Li
2015-05-01
Hyperhomocysteinemia is often associated with liver and metabolic diseases. We studied nuclear receptors that mediate oscillatory control of homocysteine homeostasis in mice. We studied mice with disruptions in Nr0b2 (called small heterodimer partner [SHP]-null mice), betaine-homocysteine S-methyltransferase (Bhmt), or both genes (BHMT-null/SHP-null mice), along with mice with wild-type copies of these genes (controls). Hyperhomocysteinemia was induced by feeding mice alcohol (National Institute on Alcohol Abuse and Alcoholism binge model) or chow diets along with water containing 0.18% DL-homocysteine. Some mice were placed on diets containing cholic acid (1%) or cholestyramine (2%) or high-fat diets (60%). Serum and livers were collected during a 24-hour light-dark cycle and analyzed by RNA-seq, metabolomic, and quantitative polymerase chain reaction, immunoblot, and chromatin immunoprecipitation assays. SHP-null mice had altered timing in expression of genes that regulate homocysteine metabolism compared with control mice. Oscillatory production of S-adenosylmethionine, betaine, choline, phosphocholine, glyceophosphocholine, cystathionine, cysteine, hydrogen sulfide, glutathione disulfide, and glutathione, differed between SHP-null mice and control mice. SHP inhibited transcriptional activation of Bhmt and cystathionine γ-lyase by FOXA1. Expression of Bhmt and cystathionine γ-lyase was decreased when mice were fed cholic acid but increased when they were placed on diets containing cholestyramine or high-fat content. Diets containing ethanol or homocysteine induced hyperhomocysteinemia and glucose intolerance in control, but not SHP-null, mice. In BHMT-null and BHMT-null/SHP-null mice fed a control liquid, lipid vacuoles were observed in livers. Ethanol feeding induced accumulation of macrovesicular lipid vacuoles to the greatest extent in BHMT-null and BHMT-null/SHP-null mice. Disruption of Shp in mice alters timing of expression of genes that regulate homocysteine metabolism and the liver responses to ethanol and homocysteine. SHP inhibits the transcriptional activation of Bhmt and cystathionine γ-lyase by FOXA1. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
Kumar, Ramiya; Mota, Linda C.; Litoff, Elizabeth J.; Rooney, John P.; Boswell, W. Tyler; Courter, Elliott; Henderson, Charles M.; Hernandez, Juan P.; Corton, J. Christopher; Moore, David D.
2017-01-01
Targeted mutant models are common in mechanistic toxicology experiments investigating the absorption, metabolism, distribution, or elimination (ADME) of chemicals from individuals. Key models include those for xenosensing transcription factors and cytochrome P450s (CYP). Here we investigated changes in transcript levels, protein expression, and steroid hydroxylation of several xenobiotic detoxifying CYPs in constitutive androstane receptor (CAR)-null and two CYP-null mouse models that have subfamily members regulated by CAR; the Cyp3a-null and a newly described Cyp2b9/10/13-null mouse model. Compensatory changes in CYP expression that occur in these models may also occur in polymorphic humans, or may complicate interpretation of ADME studies performed using these models. The loss of CAR causes significant changes in several CYPs probably due to loss of CAR-mediated constitutive regulation of these CYPs. Expression and activity changes include significant repression of Cyp2a and Cyp2b members with corresponding drops in 6α- and 16β-testosterone hydroxylase activity. Further, the ratio of 6α-/15α-hydroxylase activity, a biomarker of sexual dimorphism in the liver, indicates masculinization of female CAR-null mice, suggesting a role for CAR in the regulation of sexually dimorphic liver CYP profiles. The loss of Cyp3a causes fewer changes than CAR. Nevertheless, there are compensatory changes including gender-specific increases in Cyp2a and Cyp2b. Cyp2a and Cyp2b were down-regulated in CAR-null mice, suggesting activation of CAR and potentially PXR following loss of the Cyp3a members. However, the loss of Cyp2b causes few changes in hepatic CYP transcript levels and almost no significant compensatory changes in protein expression or activity with the possible exception of 6α-hydroxylase activity. This lack of a compensatory response in the Cyp2b9/10/13-null mice is probably due to low CYP2B hepatic expression, especially in male mice. Overall, compensatory and regulatory CYP changes followed the order CAR-null > Cyp3a-null > Cyp2b-null mice. PMID:28350814
Survival of glucose phosphate isomerase null somatic cells and germ cells in adult mouse chimaeras
Keighren, Margaret A.; Flockhart, Jean H.
2016-01-01
ABSTRACT The mouse Gpi1 gene encodes the glycolytic enzyme glucose phosphate isomerase. Homozygous Gpi1−/− null mouse embryos die but a previous study showed that some homozygous Gpi1−/− null cells survived when combined with wild-type cells in fetal chimaeras. One adult female Gpi1−/−↔Gpi1c/c chimaera with functional Gpi1−/− null oocytes was also identified in a preliminary study. The aims were to characterise the survival of Gpi1−/− null cells in adult Gpi1−/−↔Gpi1c/c chimaeras and determine if Gpi1−/− null germ cells are functional. Analysis of adult Gpi1−/−↔Gpi1c/c chimaeras with pigment and a reiterated transgenic lineage marker showed that low numbers of homozygous Gpi1−/− null cells could survive in many tissues of adult chimaeras, including oocytes. Breeding experiments confirmed that Gpi1−/− null oocytes in one female Gpi1−/−↔Gpi1c/c chimaera were functional and provided preliminary evidence that one male putative Gpi1−/−↔Gpi1c/c chimaera produced functional spermatozoa from homozygous Gpi1−/− null germ cells. Although the male chimaera was almost certainly Gpi1−/−↔Gpi1c/c, this part of the study is considered preliminary because only blood was typed for GPI. Gpi1−/− null germ cells should survive in a chimaeric testis if they are supported by wild-type Sertoli cells. It is also feasible that spermatozoa could bypass a block at GPI, but not blocks at some later steps in glycolysis, by using fructose, rather than glucose, as the substrate for glycolysis. Although chimaera analysis proved inefficient for studying the fate of Gpi1−/− null germ cells, it successfully identified functional Gpi1−/− null oocytes and revealed that some Gpi1−/− null cells could survive in many adult tissues. PMID:27103217
Kumar, Ramiya; Mota, Linda C; Litoff, Elizabeth J; Rooney, John P; Boswell, W Tyler; Courter, Elliott; Henderson, Charles M; Hernandez, Juan P; Corton, J Christopher; Moore, David D; Baldwin, William S
2017-01-01
Targeted mutant models are common in mechanistic toxicology experiments investigating the absorption, metabolism, distribution, or elimination (ADME) of chemicals from individuals. Key models include those for xenosensing transcription factors and cytochrome P450s (CYP). Here we investigated changes in transcript levels, protein expression, and steroid hydroxylation of several xenobiotic detoxifying CYPs in constitutive androstane receptor (CAR)-null and two CYP-null mouse models that have subfamily members regulated by CAR; the Cyp3a-null and a newly described Cyp2b9/10/13-null mouse model. Compensatory changes in CYP expression that occur in these models may also occur in polymorphic humans, or may complicate interpretation of ADME studies performed using these models. The loss of CAR causes significant changes in several CYPs probably due to loss of CAR-mediated constitutive regulation of these CYPs. Expression and activity changes include significant repression of Cyp2a and Cyp2b members with corresponding drops in 6α- and 16β-testosterone hydroxylase activity. Further, the ratio of 6α-/15α-hydroxylase activity, a biomarker of sexual dimorphism in the liver, indicates masculinization of female CAR-null mice, suggesting a role for CAR in the regulation of sexually dimorphic liver CYP profiles. The loss of Cyp3a causes fewer changes than CAR. Nevertheless, there are compensatory changes including gender-specific increases in Cyp2a and Cyp2b. Cyp2a and Cyp2b were down-regulated in CAR-null mice, suggesting activation of CAR and potentially PXR following loss of the Cyp3a members. However, the loss of Cyp2b causes few changes in hepatic CYP transcript levels and almost no significant compensatory changes in protein expression or activity with the possible exception of 6α-hydroxylase activity. This lack of a compensatory response in the Cyp2b9/10/13-null mice is probably due to low CYP2B hepatic expression, especially in male mice. Overall, compensatory and regulatory CYP changes followed the order CAR-null > Cyp3a-null > Cyp2b-null mice.
ERIC Educational Resources Information Center
McCabe, Declan J.; Knight, Evelyn J.
2016-01-01
Since being introduced by Connor and Simberloff in response to Diamond's assembly rules, null model analysis has been a controversial tool in community ecology. Despite being commonly used in the primary literature, null model analysis has not featured prominently in general textbooks. Complexity of approaches along with difficulty in interpreting…
Alignment of optical system components using an ADM beam through a null assembly
NASA Technical Reports Server (NTRS)
Hayden, Joseph E. (Inventor); Olczak, Eugene G. (Inventor)
2010-01-01
A system for testing an optical surface includes a rangefinder configured to emit a light beam and a null assembly located between the rangefinder and the optical surface. The null assembly is configured to receive and to reflect the emitted light beam toward the optical surface. The light beam reflected from the null assembly is further reflected back from the optical surface toward the null assembly as a return light beam. The rangefinder is configured to measure a distance to the optical surface using the return light beam.
Interpreting null results from measurements with uncertain correlations: an info-gap approach.
Ben-Haim, Yakov
2011-01-01
Null events—not detecting a pernicious agent—are the basis for declaring the agent is absent. Repeated nulls strengthen confidence in the declaration. However, correlations between observations are difficult to assess in many situations and introduce uncertainty in interpreting repeated nulls. We quantify uncertain correlations using an info-gap model, which is an unbounded family of nested sets of possible probabilities. An info-gap model is nonprobabilistic and entails no assumption about a worst case. We then evaluate the robustness, to uncertain correlations, of estimates of the probability of a null event. This is then the basis for evaluating a nonprobabilistic robustness-based confidence interval for the probability of a null. © 2010 Society for Risk Analysis.
Online gaming in the context of social anxiety.
Lee, Bianca W; Leeson, Peter R C
2015-06-01
In 2014, over 23 million individuals were playing massive multiplayer online role-playing games (MMORPGs). In light of the framework provided by Davis's (2001) cognitive-behavioral model of pathological Internet use, social anxiety, expressions of true self, and perceived in-game and face-to-face social support were examined as predictors of Generalized Problematic Internet Use Scale (GPIUS) scores and hours spent playing MMORPGs per week. Data were collected from adult MMORPG players via an online survey (N = 626). Using structural equation modeling, the hypothesized model was tested on 1 half of the sample (N = 313) and then retested on the other half of the sample. The results indicated that the hypothesized model fit the data well in both samples. Specifically, expressing true self in game, higher levels of social anxiety, larger numbers of in-game social supports, and fewer supportive face-to-face relationships were significant predictors of higher GPIUS scores, and the number of in-game supports was significantly associated with time spent playing. The current study provides clinicians and researchers with a deeper understanding of MMORPG use by being the first to apply, test, and replicate a theory-driven model across 2 samples of MMORPG players. In addition, the present findings suggest that a psychometric measure of MMORPG usage is more indicative of players' psychological and social well-being than is time spent playing these games. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
Abu-Amero, Khaled K; Al-Boudari, Olayan M; Mohamed, Gamal H; Dzimiri, Nduna
2006-01-01
Background The association of the deletion in GSTT1 and GSTM1 genes with coronary artery disease (CAD) among smokers is controversial. In addition, no such investigation has previously been conducted among Arabs. Methods We genotyped 1054 CAD patients and 762 controls for GSTT1 and GSTM1 deletion by multiplex polymerase chain reaction. Both CAD and controls were Saudi Arabs. Results In the control group (n = 762), 82.3% had the T wild M wildgenotype, 9% had the Twild M null, 2.4% had the Tnull M wild and 6.3% had the Tnull M null genotype. Among the CAD group (n = 1054), 29.5% had the Twild M wild genotype, 26.6% (p < .001) had the Twild M null, 8.3% (p < .001) had the Tnull M wild and 35.6% (p < .001) had the Tnull M null genotype, indicating a significant association of the Twild M null, Tnull M wild and Tnull M null genotypes with CAD. Univariate analysis also showed that smoking, age, hypercholesterolemia and hypertriglyceridemia, diabetes mellitus, family history of CAD, hypertension and obesity are all associated with CAD, whereas gender and myocardial infarction are not. Binary logistic regression for smoking and genotypes indicated that only M null and Tnullare interacting with smoking. However, further subgroup analysis stratifying the data by smoking status suggested that genotype-smoking interactions have no effect on the development of CAD. Conclusion GSTT1 and GSTM1 null-genotypes are risk factor for CAD independent of genotype-smoking interaction. PMID:16620396
Qualification of a Null Lens Using Image-Based Phase Retrieval
NASA Technical Reports Server (NTRS)
Bolcar, Matthew R.; Aronstein, David L.; Hill, Peter C.; Smith, J. Scott; Zielinski, Thomas P.
2012-01-01
In measuring the figure error of an aspheric optic using a null lens, the wavefront contribution from the null lens must be independently and accurately characterized in order to isolate the optical performance of the aspheric optic alone. Various techniques can be used to characterize such a null lens, including interferometry, profilometry and image-based methods. Only image-based methods, such as phase retrieval, can measure the null-lens wavefront in situ - in single-pass, and at the same conjugates and in the same alignment state in which the null lens will ultimately be used - with no additional optical components. Due to the intended purpose of a Dull lens (e.g., to null a large aspheric wavefront with a near-equal-but-opposite spherical wavefront), characterizing a null-lens wavefront presents several challenges to image-based phase retrieval: Large wavefront slopes and high-dynamic-range data decrease the capture range of phase-retrieval algorithms, increase the requirements on the fidelity of the forward model of the optical system, and make it difficult to extract diagnostic information (e.g., the system F/#) from the image data. In this paper, we present a study of these effects on phase-retrieval algorithms in the context of a null lens used in component development for the Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission. Approaches for mitigation are also discussed.
Ghosh, Soma; Sur, Surojit; Yerram, Sashidhar R.; Rago, Carlo; Bhunia, Anil K.; Hossain, M. Zulfiquer; Paun, Bogdan C.; Ren, Yunzhao R.; Iacobuzio-Donahue, Christine A.; Azad, Nilofer A.; Kern, Scott E.
2014-01-01
Large-magnitude numerical distinctions (>10-fold) among drug responses of genetically contrasting cancers were crucial for guiding the development of some targeted therapies. Similar strategies brought epidemiological clues and prevention goals for genetic diseases. Such numerical guides, however, were incomplete or low magnitude for Fanconi anemia pathway (FANC) gene mutations relevant to cancer in FANC-mutation carriers (heterozygotes). We generated a four-gene FANC-null cancer panel, including the engineering of new PALB2/FANCN-null cancer cells by homologous recombination. A characteristic matching of FANCC-null, FANCG-null, BRCA2/FANCD1-null, and PALB2/FANCN-null phenotypes was confirmed by uniform tumor regression on single-dose cross-linker therapy in mice and by shared chemical hypersensitivities to various inter-strand cross-linking agents and γ-radiation in vitro. Some compounds, however, had contrasting magnitudes of sensitivity; a strikingly high (19- to 22-fold) hypersensitivity was seen among PALB2-null and BRCA2-null cells for the ethanol metabolite, acetaldehyde, associated with widespread chromosomal breakage at a concentration not producing breaks in parental cells. Because FANC-defective cancer cells can share or differ in their chemical sensitivities, patterns of selective hypersensitivity hold implications for the evolutionary understanding of this pathway. Clinical decisions for cancer-relevant prevention and management of FANC-mutation carriers could be modified by expanded studies of high-magnitude sensitivities. PMID:24200853
Null conformal Killing-Yano tensors and Birkhoff theorem
NASA Astrophysics Data System (ADS)
Ferrando, Joan Josep; Sáez, Juan Antonio
2016-04-01
We study the space-times admitting a null conformal Killing-Yano tensor whose divergence defines a Killing vector. We analyze the similarities and differences with the recently studied non null case (Ferrando and Sáez in Gen Relativ Gravit 47:1911, 2015). The results by Barnes concerning the Birkhoff theorem for the case of null orbits are analyzed and generalized.
NASA Technical Reports Server (NTRS)
Goepfert, T. M.; McCarthy, M.; Kittrell, F. S.; Stephens, C.; Ullrich, R. L.; Brinkley, B. R.; Medina, D.
2000-01-01
Mammary epithelial cells from p53 null mice have been shown recently to exhibit an increased risk for tumor development. Hormonal stimulation markedly increased tumor development in p53 null mammary cells. Here we demonstrate that mammary tumors arising in p53 null mammary cells are highly aneuploid, with greater than 70% of the tumor cells containing altered chromosome number and a mean chromosome number of 56. Normal mammary cells of p53 null genotype and aged less than 14 wk do not exhibit aneuploidy in primary cell culture. Significantly, the hormone progesterone, but not estrogen, increases the incidence of aneuploidy in morphologically normal p53 null mammary epithelial cells. Such cells exhibited 40% aneuploidy and a mean chromosome number of 54. The increase in aneuploidy measured in p53 null tumor cells or hormonally stimulated normal p53 null cells was not accompanied by centrosome amplification. These results suggest that normal levels of progesterone can facilitate chromosomal instability in the absence of the tumor suppressor gene, p53. The results support the emerging hypothesis based both on human epidemiological and animal model studies that progesterone markedly enhances mammary tumorigenesis.
Memon, Mushtaq A.; Anway, Matthew D.; Covert, Trevor R.; Uzumcu, Mehmet; Skinner, Michael K.
2008-01-01
The role transforming growth factor beta (TGFb) isoforms TGFb1, TGFb2 and TGFb3 have in the regulation of embryonic gonadal development was investigated with the use of null-mutant (i.e. knockout) mice for each of the TGFb isoforms. Late embryonic gonadal development was investigated because homozygote TGFb null-mutant mice generally die around birth, with some embryonic loss as well. In the testis, the TGFb1 null-mutant mice had a decrease in the number of germ cells at birth, postnatal day 0 (P0). In the testis, the TGFb2 null-mutant mice had a decrease in the number of seminiferous cords at embryonic day 15 (E15). In the ovary, the TGFb2 null-mutant mice had an increase in the number of germ cells at P0. TGFb isoforms appear to have a role in gonadal development, but interactions between the isoforms is speculated to compensate in the different TGFb isoform null-mutant mice. PMID:18790002
NASA Astrophysics Data System (ADS)
Al-Sarrani, Nauaf
The purpose of this study was to obtain Science faculty concerns and professional development needs to adopt blended learning in their teaching at Taibah University. To answer these two research questions the survey instrument was designed to collect quantitative and qualitative data from close-ended and open-ended questions. The participants' general characteristics were first presented, then the quantitative measures were presented as the results of the null hypotheses. The data analysis for research question one revealed a statistically significant difference in the participants' concerns in adopting BL by their gender sig = .0015. The significances were found in stages one (sig = .000) and stage five (sig = .006) for female faculty. Therefore, null hypothesis 1.1 was rejected (There are no statistically significant differences between science faculty's gender and their concerns in adopting BL). The data analysis indicated also that there were no relationships between science faculty's age, academic rank, nationality, country of graduation and years of teaching experience and their concerns in adopting BL in their teaching, so the null hypotheses 1.2-7 were accepted (There are no statistically significant differences between Science faculty's age and their concerns in adopting BL, there are no statistically significant differences between Science faculty's academic rank and their concerns in adopting BL, there are no statistically significant differences between Science faculty's nationality and their concerns in adopting BL, there are no statistically significant differences between Science faculty's content area and their concerns in adopting BL, there are no statistically significant differences between Science faculty's country of graduation and their concerns in adopting BL and there are no statistically significant differences between Science faculty's years of teaching experience and their concerns in adopting BL). The data analyses for research question two revealed that there was a statistically significant difference between science faculty's use of technology in teaching by department and their attitudes towards technology integration in the Science curriculum. Lambda MANOVA test result was sig =.019 at the alpha = .05 level. Follow up ANOVA result indicated that Chemistry department was significant in the use of computer-based technology (sig =.049) and instructional technology use (sig =.041). Therefore, null hypothesis 2.1 was rejected (There are no statistically significant differences between science faculty's attitudes towards technology integration in the Science curriculum and faculty's use of technology in teaching by department). The data also revealed that there was no statistically significant difference (p<.05) between science faculty's use of technology in teaching by department and their instructional technology use on pedagogy. Therefore, null hypothesis 2.2 was accepted (There are no statistically significant differences between science faculty's perceptions of the effects of faculty IT use on pedagogy and faculty's use of technology in teaching by department). The data also revealed that there was a statistically significant difference between science faculty's use of technology in teaching by department and their professional development needs in adopting BL. Lambda MANOVA test result was .007 at the alpha = .05 level. The follow up ANOVA results showed that the value of significance of Science faculty's professional development needs for adopting BL was smaller than .05 in the Chemistry department with sig =.001 in instructional technology use. Therefore, null hypothesis 2.3 was rejected (There are no statistically significant differences between Science faculty's perceptions of technology professional development needs and faculty's use of technology in teaching by department). Qualitative measures included analyzing data based on answers to three open-ended questions, numbers thirty-six, seventy-four, and seventy-five. These three questions were on blended learning concerns comments (question 36, which had 10 units), professional development activities, support, or incentive requested (question 74, which had 28 units), and the most important professional development activities, support, or incentive (question 75, which had 37 units). These questions yielded 75 units, 23 categories and 8 themes that triangulated with the quantitative data. These 8 themes were then combined to obtain overall themes for all qualitative questions in the study. The two most important themes were "Professional development" with three categories; Professional development through workshops (10 units), Workshops (10 units), Professional development (5 units) and the second overall theme was "Technical support" with two categories: Internet connectivity (4 units), and Technical support (4 units). Finally, based on quantitative and qualitative data, the summary, conclusions, and recommendations for Taibah University regarding faculty adoption of BL in teaching were presented. The recommendations for future studies focused on Science faculty Level of Use and technology use in Saudi universities.
Savitz, D A
1993-01-01
Epidemiologic research concerning electric and magnetic fields in relation to cancer has focused on the potential etiologic roles of residential exposure on childhood cancer and occupational exposure on adult leukemia and brain cancer. Future residential studies must concentrate on exposure assessment that is enhanced by developing models of historical exposure, assessment of the relation between magnetic fields and wire codes, and consideration of alternate exposure indices. Study design issues deserving attention include possible biases in random digit dialing control selection, consideration of the temporal course of exposure and disease, and acquisition of the necessary information to assess the potential value of ecologic studies. Highest priorities are comprehensive evaluation of exposure patterns and sources and examination of the sociology and geography of residential wire codes. Future occupational studies should also concentrate on improved exposure assessment with increased attention to nonutility worker populations and development of historical exposure indicators that are superior to job titles alone. Potential carcinogens in the workplace that could act as confounders need to be more carefully examined. The temporal relation between exposure and disease and possible effect modification by other workplace agents should be incorporated into future studies. The most pressing need is for measurement of exposure patterns in a variety of worker populations and performance of traditional epidemiologic evaluations of cancer occurrence. The principal source of bias toward the null is nondifferential misclassification of exposure with improvements expected to enhance any true etiologic association that is present. Biases away from the null might include biased control selection in residential studies and chemical carcinogens acting as confounders in occupational studies. PMID:8206046
Wang, Yuanjia; Chen, Huaihou
2012-01-01
Summary We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 108 simulations) and asymptotic approximation may be unreliable and conservative. PMID:23020801
Wang, Yuanjia; Chen, Huaihou
2012-12-01
We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10(8) simulations) and asymptotic approximation may be unreliable and conservative. © 2012, The International Biometric Society.
Efficient Plastid Transformation in Arabidopsis.
Yu, Qiguo; Lutz, Kerry Ann; Maliga, Pal
2017-09-01
Plastid transformation is routine in tobacco ( Nicotiana tabacum ) but 100-fold less frequent in Arabidopsis ( Arabidopsis thaliana ), preventing its use in plastid biology. A recent study revealed that null mutations in ACC2 , encoding a plastid-targeted acetyl-coenzyme A carboxylase, cause hypersensitivity to spectinomycin. We hypothesized that plastid transformation efficiency should increase in the acc2 background, because when ACC2 is absent, fatty acid biosynthesis becomes dependent on translation of the plastid-encoded ACC β-carboxylase subunit. We bombarded ACC2 -defective Arabidopsis leaves with a vector carrying a selectable spectinomycin resistance ( aadA ) gene and gfp , encoding the green fluorescence protein GFP. Spectinomycin-resistant clones were identified as green cell clusters on a spectinomycin medium. Plastid transformation was confirmed by GFP accumulation from the second open reading frame of a polycistronic messenger RNA, which would not be translated in the cytoplasm. We obtained one to two plastid transformation events per bombarded sample in spectinomycin-hypersensitive Slavice and Columbia acc2 knockout backgrounds, an approximately 100-fold enhanced plastid transformation frequency. Slavice and Columbia are accessions in which plant regeneration is uncharacterized or difficult to obtain. A practical system for Arabidopsis plastid transformation will be obtained by creating an ACC2 null background in a regenerable Arabidopsis accession. The recognition that the duplicated ACCase in Arabidopsis is an impediment to plastid transformation provides a rational template to implement plastid transformation in related recalcitrant crops. © 2017 American Society of Plant Biologists. All Rights Reserved.
Complexity of CNC transcription factors as revealed by gene targeting of the Nrf3 locus.
Derjuga, Anna; Gourley, Tania S; Holm, Teresa M; Heng, Henry H Q; Shivdasani, Ramesh A; Ahmed, Rafi; Andrews, Nancy C; Blank, Volker
2004-04-01
Cap'n'collar (CNC) family basic leucine zipper transcription factors play crucial roles in the regulation of mammalian gene expression and development. To determine the in vivo function of the CNC protein Nrf3 (NF-E2-related factor 3), we generated mice deficient in this transcription factor. We performed targeted disruption of two Nrf3 exons coding for CNC homology, basic DNA-binding, and leucine zipper dimerization domains. Nrf3 null mice developed normally and revealed no obvious phenotypic differences compared to wild-type animals. Nrf3(-/-) mice were fertile, and gross anatomy as well as behavior appeared normal. The mice showed normal age progression and did not show any apparent additional phenotype during their life span. We observed no differences in various blood parameters and chemistry values. We infected wild-type and Nrf3(-/-) mice with acute lymphocytic choriomeningitis virus and found no differences in these animals with respect to their number of virus-specific CD8 and CD4 T cells as well as their B-lymphocyte response. To determine whether the mild phenotype of Nrf3 null animals is due to functional redundancy, we generated mice deficient in multiple CNC factors. Contrary to our expectations, an absence of Nrf3 does not seem to cause additional lethality in compound Nrf3(-/-)/Nrf2(-/-) and Nrf3(-/-)/p45(-/-) mice. We hypothesize that the role of Nrf3 in vivo may become apparent only after appropriate challenge to the mice.
Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.
Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani
2007-03-01
Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that if understood could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer if an interesting collection of genes is 'enriched' for a Consortium particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine if conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
MeCP2 Affects Skeletal Muscle Growth and Morphology through Non Cell-Autonomous Mechanisms.
Conti, Valentina; Gandaglia, Anna; Galli, Francesco; Tirone, Mario; Bellini, Elisa; Campana, Lara; Kilstrup-Nielsen, Charlotte; Rovere-Querini, Patrizia; Brunelli, Silvia; Landsberger, Nicoletta
2015-01-01
Rett syndrome (RTT) is an autism spectrum disorder mainly caused by mutations in the X-linked MECP2 gene and affecting roughly 1 out of 10.000 born girls. Symptoms range in severity and include stereotypical movement, lack of spoken language, seizures, ataxia and severe intellectual disability. Notably, muscle tone is generally abnormal in RTT girls and women and the Mecp2-null mouse model constitutively reflects this disease feature. We hypothesized that MeCP2 in muscle might physiologically contribute to its development and/or homeostasis, and conversely its defects in RTT might alter the tissue integrity or function. We show here that a disorganized architecture, with hypotrophic fibres and tissue fibrosis, characterizes skeletal muscles retrieved from Mecp2-null mice. Alterations of the IGF-1/Akt/mTOR pathway accompany the muscle phenotype. A conditional mouse model selectively depleted of Mecp2 in skeletal muscles is characterized by healthy muscles that are morphologically and molecularly indistinguishable from those of wild-type mice raising the possibility that hypotonia in RTT is mainly, if not exclusively, mediated by non-cell autonomous effects. Our results suggest that defects in paracrine/endocrine signaling and, in particular, in the GH/IGF axis appear as the major cause of the observed muscular defects. Remarkably, this is the first study describing the selective deletion of Mecp2 outside the brain. Similar future studies will permit to unambiguously define the direct impact of MeCP2 on tissue dysfunctions.
Vairamani, Kanimozhi; Wang, Hong-Sheng; Medvedovic, Mario; Lorenz, John N; Shull, Gary E
2017-08-04
Loss of the AE3 Cl - /HCO 3 - exchanger (Slc4a3) in mice causes an impaired cardiac force-frequency response and heart failure under some conditions but the mechanisms are not known. To better understand the functions of AE3, we performed RNA Seq analysis of AE3-null and wild-type mouse hearts and evaluated the data with respect to three hypotheses (CO 2 disposal, facilitation of Na + -loading, and recovery from an alkaline load) that have been proposed for its physiological functions. Gene Ontology and PubMatrix analyses of differentially expressed genes revealed a hypoxia response and changes in vasodilation and angiogenesis genes that strongly support the CO 2 disposal hypothesis. Differential expression of energy metabolism genes, which indicated increased glucose utilization and decreased fatty acid utilization, were consistent with adaptive responses to perturbations of O 2 /CO 2 balance in AE3-null myocytes. Given that the myocardium is an obligate aerobic tissue and consumes large amounts of O 2 , the data suggest that loss of AE3, which has the potential to extrude CO 2 in the form of HCO 3 - , impairs O 2 /CO 2 balance in cardiac myocytes. These results support a model in which the AE3 Cl - /HCO 3 - exchanger, coupled with parallel Cl - and H + -extrusion mechanisms and extracellular carbonic anhydrase, is responsible for active transport-mediated disposal of CO 2 .
Does Short-Term Hunger Increase Trust and Trustworthiness in a High Trust Society?
Rantapuska, Elias; Freese, Riitta; Jääskeläinen, Iiro P.; Hytönen, Kaisa
2017-01-01
We build on the social heuristics hypothesis, the literature on the glucose model of self-control, and recent challenges on these hypotheses to investigate whether individuals exhibit a change in degree of trust and reciprocation after consumption of a meal. We induce short-term manipulation of hunger followed by the trust game and a decision on whether to leave personal belongings in an unlocked and unsupervised room. Our results are inconclusive. While, we report hungry individuals trusting and reciprocating more than those who have just consumed a meal in a high trust society, we fail to reject the null with small number of observations (N = 101) and experimental sessions (N = 8). In addition, we find no evidence of short-term hunger having an impact on charitable giving or decisions in public good game. PMID:29163315
Lahr, Daniel J. G.; Laughinghouse, H. Dail; Oliverio, Angela; Gao, Feng; Katz, Laura A.
2014-01-01
Microscopy has revealed a tremendous diversity of bacterial and eukaryotic forms. More recent molecular analyses show discordance in estimates of biodiversity based on morphological analyses. Moreover, phylogenetic analyses of the diversity of microbial forms have revealed evidence of convergence at scales as large as interdomain – i.e. convergent forms shared between bacteria and eukaryotes. Here, we highlight examples of such discordance, focusing on exemplary lineages such as testate amoebae, ciliates and cyanobacteria, which have long histories of morphological study. We discuss examples in two categories: 1) morphologically identical (or highly similar) individuals that are genetically distinct and 2) morphologically distinct individuals that are genetically distinct. We argue that hypotheses about discordance can be tested using the concept of neutral morphologies, or more broadly neutral phenotypes, as a null hypothesis. PMID:25156897
On the Interpretation and Use of Mediation: Multiple Perspectives on Mediation Analysis
Agler, Robert; De Boeck, Paul
2017-01-01
Mediation analysis has become a very popular approach in psychology, and it is one that is associated with multiple perspectives that are often at odds, often implicitly. Explicitly discussing these perspectives and their motivations, advantages, and disadvantages can help to provide clarity to conversations and research regarding the use and refinement of mediation models. We discuss five such pairs of perspectives on mediation analysis, their associated advantages and disadvantages, and their implications: with vs. without a mediation hypothesis, specific effects vs. a global model, directness vs. indirectness of causation, effect size vs. null hypothesis testing, and hypothesized vs. alternative explanations. Discussion of the perspectives is facilitated by a small simulation study. Some philosophical and linguistic considerations are briefly discussed, as well as some other perspectives we do not develop here. PMID:29187828
Natural frequencies improve Bayesian reasoning in simple and complex inference tasks
Hoffrage, Ulrich; Krauss, Stefan; Martignon, Laura; Gigerenzer, Gerd
2015-01-01
Representing statistical information in terms of natural frequencies rather than probabilities improves performance in Bayesian inference tasks. This beneficial effect of natural frequencies has been demonstrated in a variety of applied domains such as medicine, law, and education. Yet all the research and applications so far have been limited to situations where one dichotomous cue is used to infer which of two hypotheses is true. Real-life applications, however, often involve situations where cues (e.g., medical tests) have more than one value, where more than two hypotheses (e.g., diseases) are considered, or where more than one cue is available. In Study 1, we show that natural frequencies, compared to information stated in terms of probabilities, consistently increase the proportion of Bayesian inferences made by medical students in four conditions—three cue values, three hypotheses, two cues, or three cues—by an average of 37 percentage points. In Study 2, we show that teaching natural frequencies for simple tasks with one dichotomous cue and two hypotheses leads to a transfer of learning to complex tasks with three cue values and two cues, with a proportion of 40 and 81% correct inferences, respectively. Thus, natural frequencies facilitate Bayesian reasoning in a much broader class of situations than previously thought. PMID:26528197
Seafood prices reveal impacts of a major ecological disturbance
Smith, Martin D.; Oglend, Atle; Kirkpatrick, A. Justin; Asche, Frank; Bennear, Lori S.; Craig, J. Kevin; Nance, James M.
2017-01-01
Coastal hypoxia (dissolved oxygen ≤ 2 mg/L) is a growing problem worldwide that threatens marine ecosystem services, but little is known about economic effects on fisheries. Here, we provide evidence that hypoxia causes economic impacts on a major fishery. Ecological studies of hypoxia and marine fauna suggest multiple mechanisms through which hypoxia can skew a population’s size distribution toward smaller individuals. These mechanisms produce sharp predictions about changes in seafood markets. Hypoxia is hypothesized to decrease the quantity of large shrimp relative to small shrimp and increase the price of large shrimp relative to small shrimp. We test these hypotheses using time series of size-based prices. Naive quantity-based models using treatment/control comparisons in hypoxic and nonhypoxic areas produce null results, but we find strong evidence of the hypothesized effects in the relative prices: Hypoxia increases the relative price of large shrimp compared with small shrimp. The effects of fuel prices provide supporting evidence. Empirical models of fishing effort and bioeconomic simulations explain why quantifying effects of hypoxia on fisheries using quantity data has been inconclusive. Specifically, spatial-dynamic feedbacks across the natural system (the fish stock) and human system (the mobile fishing fleet) confound “treated” and “control” areas. Consequently, analyses of price data, which rely on a market counterfactual, are able to reveal effects of the ecological disturbance that are obscured in quantity data. Our results are an important step toward quantifying the economic value of reduced upstream nutrient loading in the Mississippi Basin and are broadly applicable to other coupled human-natural systems. PMID:28137850
POWER-ENHANCED MULTIPLE DECISION FUNCTIONS CONTROLLING FAMILY-WISE ERROR AND FALSE DISCOVERY RATES.
Peña, Edsel A; Habiger, Joshua D; Wu, Wensong
2011-02-01
Improved procedures, in terms of smaller missed discovery rates (MDR), for performing multiple hypotheses testing with weak and strong control of the family-wise error rate (FWER) or the false discovery rate (FDR) are developed and studied. The improvement over existing procedures such as the Šidák procedure for FWER control and the Benjamini-Hochberg (BH) procedure for FDR control is achieved by exploiting possible differences in the powers of the individual tests. Results signal the need to take into account the powers of the individual tests and to have multiple hypotheses decision functions which are not limited to simply using the individual p -values, as is the case, for example, with the Šidák, Bonferroni, or BH procedures. They also enhance understanding of the role of the powers of individual tests, or more precisely the receiver operating characteristic (ROC) functions of decision processes, in the search for better multiple hypotheses testing procedures. A decision-theoretic framework is utilized, and through auxiliary randomizers the procedures could be used with discrete or mixed-type data or with rank-based nonparametric tests. This is in contrast to existing p -value based procedures whose theoretical validity is contingent on each of these p -value statistics being stochastically equal to or greater than a standard uniform variable under the null hypothesis. Proposed procedures are relevant in the analysis of high-dimensional "large M , small n " data sets arising in the natural, physical, medical, economic and social sciences, whose generation and creation is accelerated by advances in high-throughput technology, notably, but not limited to, microarray technology.
Seafood prices reveal impacts of a major ecological disturbance.
Smith, Martin D; Oglend, Atle; Kirkpatrick, A Justin; Asche, Frank; Bennear, Lori S; Craig, J Kevin; Nance, James M
2017-02-14
Coastal hypoxia (dissolved oxygen ≤ 2 mg/L) is a growing problem worldwide that threatens marine ecosystem services, but little is known about economic effects on fisheries. Here, we provide evidence that hypoxia causes economic impacts on a major fishery. Ecological studies of hypoxia and marine fauna suggest multiple mechanisms through which hypoxia can skew a population's size distribution toward smaller individuals. These mechanisms produce sharp predictions about changes in seafood markets. Hypoxia is hypothesized to decrease the quantity of large shrimp relative to small shrimp and increase the price of large shrimp relative to small shrimp. We test these hypotheses using time series of size-based prices. Naive quantity-based models using treatment/control comparisons in hypoxic and nonhypoxic areas produce null results, but we find strong evidence of the hypothesized effects in the relative prices: Hypoxia increases the relative price of large shrimp compared with small shrimp. The effects of fuel prices provide supporting evidence. Empirical models of fishing effort and bioeconomic simulations explain why quantifying effects of hypoxia on fisheries using quantity data has been inconclusive. Specifically, spatial-dynamic feedbacks across the natural system (the fish stock) and human system (the mobile fishing fleet) confound "treated" and "control" areas. Consequently, analyses of price data, which rely on a market counterfactual, are able to reveal effects of the ecological disturbance that are obscured in quantity data. Our results are an important step toward quantifying the economic value of reduced upstream nutrient loading in the Mississippi Basin and are broadly applicable to other coupled human-natural systems.
Theories of the origin and evolution of the giant planets
NASA Technical Reports Server (NTRS)
Pollack, J. B.; Bodenheimer, P.
1989-01-01
Following the accretion of solids and gases in the solar nebula, the giant planets contracted to their present sizes over the age of the solar system. It is presently hypothesized that this contraction was rapid, but not hydrodynamic; at a later stage, a nebular disk out of which the regular satellites formed may have been spun out of the outer envelope of the contracting giant planets due to a combination of total angular momentum conservation and the outward transfer of specific angular momentum in the envelope. If these hypotheses are true, the composition of the irregular satellites directly reflects the composition of planetesimals from which the giant planets formed, while the composition of the regular satellites is indicative of the composition of the less volatile components of the outer envelopes of the giant planets.
Accelerating Translational Research through Open Science: The Neuro Experiment.
Gold, E Richard
2016-12-01
Translational research is often afflicted by a fundamental problem: a limited understanding of disease mechanisms prevents effective targeting of new treatments. Seeking to accelerate research advances and reimagine its role in the community, the Montreal Neurological Institute (Neuro) announced in the spring of 2016 that it is launching a five-year experiment during which it will adopt Open Science-open data, open materials, and no patenting-across the institution. The experiment seeks to examine two hypotheses. The first is whether the Neuro's Open Science initiative will attract new private partners. The second hypothesis is that the Neuro's institution-based approach will draw companies to the Montreal region, where the Neuro is based, leading to the creation of a local knowledge hub. This article explores why these hypotheses are likely to be true and describes the Neuro's approach to exploring them.
Magnetoacoustic Waves in a Stratified Atmosphere with a Magnetic Null Point
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarr, Lucas A.; Linton, Mark; Leake, James, E-mail: lucas.tarr.ctr@nrl.navy.mil
2017-03-01
We perform nonlinear MHD simulations to study the propagation of magnetoacoustic waves from the photosphere to the low corona. We focus on a 2D system with a gravitationally stratified atmosphere and three photospheric concentrations of magnetic flux that produce a magnetic null point with a magnetic dome topology. We find that a single wavepacket introduced at the lower boundary splits into multiple secondary wavepackets. A portion of the packet refracts toward the null owing to the varying Alfvén speed. Waves incident on the equipartition contour surrounding the null, where the sound and Alfvén speeds coincide, partially transmit, reflect, and mode-convertmore » between branches of the local dispersion relation. Approximately 15.5% of the wavepacket’s initial energy ( E {sub input}) converges on the null, mostly as a fast magnetoacoustic wave. Conversion is very efficient: 70% of the energy incident on the null is converted to slow modes propagating away from the null, 7% leaves as a fast wave, and the remaining 23% (0.036 E {sub input}) is locally dissipated. The acoustic energy leaving the null is strongly concentrated along field lines near each of the null’s four separatrices. The portion of the wavepacket that refracts toward the null, and the amount of current accumulation, depends on the vertical and horizontal wavenumbers and the centroid position of the wavepacket as it crosses the photosphere. Regions that refract toward or away from the null do not simply coincide with regions of open versus closed magnetic field or regions of particular field orientation. We also model wavepacket propagation using a WKB method and find that it agrees qualitatively, though not quantitatively, with the results of the numerical simulation.« less
OBSERVATION OF MAGNETIC RECONNECTION AT A 3D NULL POINT ASSOCIATED WITH A SOLAR ERUPTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, J. Q.; Yang, K.; Cheng, X.
Magnetic null has long been recognized as a special structure serving as a preferential site for magnetic reconnection (MR). However, the direct observational study of MR at null-points is largely lacking. Here, we show the observations of MR around a magnetic null associated with an eruption that resulted in an M1.7 flare and a coronal mass ejection. The Geostationary Operational Environmental Satellites X-ray profile of the flare exhibited two peaks at ∼02:23 UT and ∼02:40 UT on 2012 November 8, respectively. Based on the imaging observations, we find that the first and also primary X-ray peak was originated from MRmore » in the current sheet (CS) underneath the erupting magnetic flux rope (MFR). On the other hand, the second and also weaker X-ray peak was caused by MR around a null point located above the pre-eruption MFR. The interaction of the null point and the erupting MFR can be described as a two-step process. During the first step, the erupting and fast expanding MFR passed through the null point, resulting in a significant displacement of the magnetic field surrounding the null. During the second step, the displaced magnetic field started to move back, resulting in a converging inflow and subsequently the MR around the null. The null-point reconnection is a different process from the current sheet reconnection in this flare; the latter is the cause of the main peak of the flare, while the former is the cause of the secondary peak of the flare and the conspicuous high-lying cusp structure.« less
An improved null model for assessing the net effects of multiple stressors on communities.
Thompson, Patrick L; MacLennan, Megan M; Vinebrooke, Rolf D
2018-01-01
Ecological stressors (i.e., environmental factors outside their normal range of variation) can mediate each other through their interactions, leading to unexpected combined effects on communities. Determining whether the net effect of stressors is ecologically surprising requires comparing their cumulative impact to a null model that represents the linear combination of their individual effects (i.e., an additive expectation). However, we show that standard additive and multiplicative null models that base their predictions on the effects of single stressors on community properties (e.g., species richness or biomass) do not provide this linear expectation, leading to incorrect interpretations of antagonistic and synergistic responses by communities. We present an alternative, the compositional null model, which instead bases its predictions on the effects of stressors on individual species, and then aggregates them to the community level. Simulations demonstrate the improved ability of the compositional null model to accurately provide a linear expectation of the net effect of stressors. We simulate the response of communities to paired stressors that affect species in a purely additive fashion and compare the relative abilities of the compositional null model and two standard community property null models (additive and multiplicative) to predict these linear changes in species richness and community biomass across different combinations (both positive, negative, or opposite) and intensities of stressors. The compositional model predicts the linear effects of multiple stressors under almost all scenarios, allowing for proper classification of net effects, whereas the standard null models do not. Our findings suggest that current estimates of the prevalence of ecological surprises on communities based on community property null models are unreliable, and should be improved by integrating the responses of individual species to the community level as does our compositional null model. © 2017 John Wiley & Sons Ltd.
High frequency generation in the corona: Resonant cavities
NASA Astrophysics Data System (ADS)
Santamaria, I. C.; Van Doorsselaere, T.
2018-03-01
Aims: Null points are prominent magnetic field singularities in which the magnetic field strength strongly decreases in very small spatial scales. Around null points, predicted to be ubiquitous in the solar chromosphere and corona, the wave behavior changes considerably. Null points are also responsible for driving very energetic phenomena, and for contributing to chromospheric and coronal heating. In previous works we demonstrated that slow magneto-acoustic shock waves were generated in the chromosphere propagate through the null point, thereby producing a train of secondary shocks escaping along the field lines. A particular combination of the shock wave speeds generates waves at a frequency of 80 MHz. The present work aims to investigate this high frequency region around a coronal null point to give a plausible explanation to its generation at that particular frequency. Methods: We carried out a set of two-dimensional numerical simulations of wave propagation in the neighborhood of a null point located in the corona. We varied both the amplitude of the driver and the atmospheric properties to investigate the sensitivity of the high frequency waves to these parameters. Results: We demonstrate that the wave frequency is sensitive to the atmospheric parameters in the corona, but it is independent of the strength of the driver. Thus, the null point behaves as a resonant cavity generating waves at specific frequencies that depend on the background equilibrium model. Moreover, we conclude that the high frequency wave train generated at the null point is not necessarily a result of the interaction between the null point and a shock wave. This wave train can be also developed by the interaction between the null point and fast acoustic-like magneto-acoustic waves, that is, this interaction within the linear regime.
Kager, Leo; Bruce, Lesley J; Zeitlhofer, Petra; Flatt, Joanna F; Maia, Tabita M; Ribeiro, M Leticia; Fahrner, Bernhard; Fritsch, Gerhard; Boztug, Kaan; Haas, Oskar A
2017-03-01
We describe the second patient with anionic exchanger 1/band 3 null phenotype (band 3 null VIENNA ), which was caused by a novel nonsense mutation c.1430C>A (p.Ser477X) in exon 12 of SLC4A1. We also update on the previous band 3 null COIMBRA patient, thereby elucidating the physiological implications of total loss of AE1/band 3. Besides transfusion-dependent severe hemolytic anemia and complete distal renal tubular acidosis, dyserythropoiesis was identified in the band 3 null VIENNA patient, suggesting a role for band 3 in erythropoiesis. Moreover, we also, for the first time, report that long-term survival is possible in band 3 null patients. © 2016 Wiley Periodicals, Inc.
On the Penrose inequality along null hypersurfaces
NASA Astrophysics Data System (ADS)
Mars, Marc; Soria, Alberto
2016-06-01
The null Penrose inequality, i.e. the Penrose inequality in terms of the Bondi energy, is studied by introducing a functional on surfaces and studying its properties along a null hypersurface Ω extending to past null infinity. We prove a general Penrose-type inequality which involves the limit at infinity of the Hawking energy along a specific class of geodesic foliations called Geodesic Asymptotically Bondi (GAB), which are shown to always exist. Whenever this foliation approaches large spheres, this inequality becomes the null Penrose inequality and we recover the results of Ludvigsen-Vickers (1983 J. Phys. A: Math. Gen. 16 3349-53) and Bergqvist (1997 Class. Quantum Grav. 14 2577-83). By exploiting further properties of the functional along general geodesic foliations, we introduce an approach to the null Penrose inequality called the Renormalized Area Method and find a set of two conditions which imply the validity of the null Penrose inequality. One of the conditions involves a limit at infinity and the other a restriction on the spacetime curvature along the flow. We investigate their range of applicability in two particular but interesting cases, namely the shear-free and vacuum case, where the null Penrose inequality is known to hold from the results by Sauter (2008 PhD Thesis Zürich ETH), and the case of null shells propagating in the Minkowski spacetime. Finally, a general inequality bounding the area of the quasi-local black hole in terms of an asymptotic quantity intrinsic of Ω is derived.
Ghosh, Soma; Sur, Surojit; Yerram, Sashidhar R; Rago, Carlo; Bhunia, Anil K; Hossain, M Zulfiquer; Paun, Bogdan C; Ren, Yunzhao R; Iacobuzio-Donahue, Christine A; Azad, Nilofer A; Kern, Scott E
2014-01-01
Large-magnitude numerical distinctions (>10-fold) among drug responses of genetically contrasting cancers were crucial for guiding the development of some targeted therapies. Similar strategies brought epidemiological clues and prevention goals for genetic diseases. Such numerical guides, however, were incomplete or low magnitude for Fanconi anemia pathway (FANC) gene mutations relevant to cancer in FANC-mutation carriers (heterozygotes). We generated a four-gene FANC-null cancer panel, including the engineering of new PALB2/FANCN-null cancer cells by homologous recombination. A characteristic matching of FANCC-null, FANCG-null, BRCA2/FANCD1-null, and PALB2/FANCN-null phenotypes was confirmed by uniform tumor regression on single-dose cross-linker therapy in mice and by shared chemical hypersensitivities to various inter-strand cross-linking agents and γ-radiation in vitro. Some compounds, however, had contrasting magnitudes of sensitivity; a strikingly high (19- to 22-fold) hypersensitivity was seen among PALB2-null and BRCA2-null cells for the ethanol metabolite, acetaldehyde, associated with widespread chromosomal breakage at a concentration not producing breaks in parental cells. Because FANC-defective cancer cells can share or differ in their chemical sensitivities, patterns of selective hypersensitivity hold implications for the evolutionary understanding of this pathway. Clinical decisions for cancer-relevant prevention and management of FANC-mutation carriers could be modified by expanded studies of high-magnitude sensitivities. Copyright © 2014 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olshevsky, Vyacheslav; Lapenta, Giovanni; Divin, Andrey
We use kinetic particle-in-cell and MHD simulations supported by an observational data set to investigate magnetic reconnection in clusters of null points in space plasma. The magnetic configuration under investigation is driven by fast adiabatic flux rope compression that dissipates almost half of the initial magnetic field energy. In this phase powerful currents are excited producing secondary instabilities, and the system is brought into a state of “intermittent turbulence” within a few ion gyro-periods. Reconnection events are distributed all over the simulation domain and energy dissipation is rather volume-filling. Numerous spiral null points interconnected via their spines form null linesmore » embedded into magnetic flux ropes; null point pairs demonstrate the signatures of torsional spine reconnection. However, energy dissipation mainly happens in the shear layers formed by adjacent flux ropes with oppositely directed currents. In these regions radial null pairs are spontaneously emerging and vanishing, associated with electron streams and small-scale current sheets. The number of spiral nulls in the simulation outweighs the number of radial nulls by a factor of 5–10, in accordance with Cluster observations in the Earth's magnetosheath. Twisted magnetic fields with embedded spiral null points might indicate the regions of major energy dissipation for future space missions such as the Magnetospheric Multiscale Mission.« less
Nichols, Matthew; Elustondo, Pia A; Warford, Jordan; Thirumaran, Aruloli; Pavlov, Evgeny V; Robertson, George S
2017-08-01
The effects of global mitochondrial calcium (Ca 2+ ) uniporter (MCU) deficiency on hypoxic-ischemic (HI) brain injury, neuronal Ca 2+ handling, bioenergetics and hypoxic preconditioning (HPC) were examined. Forebrain mitochondria isolated from global MCU nulls displayed markedly reduced Ca 2+ uptake and Ca 2+ -induced opening of the membrane permeability transition pore. Despite evidence that these effects should be neuroprotective, global MCU nulls and wild-type (WT) mice suffered comparable HI brain damage. Energetic stress enhanced glycolysis and depressed Complex I activity in global MCU null, relative to WT, cortical neurons. HI reduced forebrain NADH levels more in global MCU nulls than WT mice suggesting that increased glycolytic consumption of NADH suppressed Complex I activity. Compared to WT neurons, pyruvate dehydrogenase (PDH) was hyper-phosphorylated in MCU nulls at several sites that lower the supply of substrates for the tricarboxylic acid cycle. Elevation of cytosolic Ca 2+ with glutamate or ionomycin decreased PDH phosphorylation in MCU null neurons suggesting the use of alternative mitochondrial Ca 2+ transport. Under basal conditions, global MCU nulls showed similar increases of Ca 2+ handling genes in the hippocampus as WT mice subjected to HPC. We propose that long-term adaptations, common to HPC, in global MCU nulls compromise resistance to HI brain injury and disrupt HPC.
Shocks and currents in stratified atmospheres with a magnetic null point
NASA Astrophysics Data System (ADS)
Tarr, Lucas A.; Linton, Mark
2017-08-01
We use the resistive MHD code LARE (Arber et al 2001) to inject a compressive MHD wavepacket into a stratified atmosphere that has a single magnetic null point, as recently described in Tarr et al 2017. The 2.5D simulation represents a slice through a small ephemeral region or area of plage. The strong gradients in field strength and connectivity related to the presence of the null produce substantially different dynamics compared to the more slowly varying fields typically used in simple sunspot models. The wave-null interaction produces a fast mode shock that collapses the null into a current sheet and generates a set of outward propagating (from the null) slow mode shocks confined to field lines near each separatrix. A combination of oscillatory reconnection and shock dissipation ultimately raise the plasma's internal energy at the null and along each separatrix by 25-50% above the background. The resulting pressure gradients must be balanced by Lorentz forces, so that the final state has contact discontinuities along each separatrix and a persistent current at the null. The simulation demonstrates that fast and slow mode waves localize currents to the topologically important locations of the field, just as their Alfvenic counterparts do, and also illustrates the necessity of treating waves and reconnection as coupled phenomena.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Umansky, M. V.; Ryutov, D. D.
Reduced MHD equations are used for studying toroidally symmetric plasma dynamics near the divertor null point. Numerical solution of these equations exhibits a plasma vortex localized at the null point with the time-evolution defined by interplay of the curvature drive, magnetic restoring force, and dissipation. Convective motion is easier to achieve for a second-order null (snowflake) divertor than for a regular x-point configuration, and the size of the convection zone in a snowflake configuration grows with plasma pressure at the null point. In conclusion, the trends in simulations are consistent with tokamak experiments which indicate the presence of enhanced transportmore » at the null point.« less
NASA Astrophysics Data System (ADS)
Fragnelli, Vito; Patrone, Fioravante; Torre, Anna
2006-02-01
The lexicographic order is not representable by a real-valued function, contrary to many other orders or preorders. So, standard tools and results for well-posed minimum problems cannot be used. We prove that under suitable hypotheses it is however possible to guarantee the well-posedness of a lexicographic minimum over a compact or convex set. This result allows us to prove that some game theoretical solution concepts, based on lexicographic order are well-posed: in particular, this is true for the nucleolus.
Current progress on TPFI nulling architectures at Jet Propulsion Laboratory
NASA Technical Reports Server (NTRS)
Gappinger, Robert O.; Wallace, J. Kent; Bartos, Randall D.; Macdonald, Daniel R.; Brown, Kenneth A.
2005-01-01
Infrared interferometric nulling is a promising technology for exoplanet detection. Nulling research for the Terrestrial Planet Finder Interferometer has been exploring a variety of interferometer architectures at the Jet Propulsion Laboratory (JPL).
Cardiomyocyte-specific desmin rescue of desmin null cardiomyopathy excludes vascular involvement.
Weisleder, Noah; Soumaka, Elisavet; Abbasi, Shahrzad; Taegtmeyer, Heinrich; Capetanaki, Yassemi
2004-01-01
Mice deficient in desmin, the muscle-specific member of the intermediate filament gene family, display defects in all muscle types and particularly in the myocardium. Desmin null hearts develop cardiomyocyte hypertrophy and dilated cardiomyopathy (DCM) characterized by extensive myocyte cell death, calcific fibrosis and multiple ultrastructural defects. Several lines of evidence suggest impaired vascular function in desmin null animals. To determine whether altered capillary function or an intrinsic cardiomyocyte defect is responsible for desmin null DCM, transgenic mice were generated to rescue desmin expression specifically to cardiomyocytes. Desmin rescue mice display a wild-type cardiac phenotype with no fibrosis or calcification in the myocardium and normalization of coronary flow. Cardiomyocyte ultrastructure is also restored to normal. Markers of hypertrophy upregulated in desmin null hearts return to wild-type levels in desmin rescue mice. Working hearts were perfused to assess coronary flow and cardiac power. Restoration of a wild-type cardiac phenotype in a desmin null background by expression of desmin specifically within cardiomyocyte indicates that defects in the desmin null heart are due to an intrinsic cardiomyocytes defect rather than compromised coronary circulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lees, J.R.
This study was a systematic replication of a study by Stagliano (1981). Additional hypotheses concerning pretest, student major, and student section variance were tested. Achievement in energy knowledge and conservation attitudes attained by (a) lecture-discussion enriched with the Energy-Environment Simulator and (b) lecture-discussion methods of instruction were measured. Energy knowledge was measured on the Energy Knowledge Assessment Test (EKAT), and attitudes were measured on the Youth Energy Survey (YES), the Lecture-discussion simulation (LDS) used a two hour out-of-class activity in debriefing. The population consisted of 142 college student volunteers randomly selected, and assigned to one of two groups of 71more » students for each treatment. Stagliano used three groups (n = 35), one group receiving an energy-game treatment. Both studies used the pretest-posttest true experimental design. The present study included 28 hypotheses, eight of which were found to be significant. Stagliano used 12 hypotheses, all of which were rejected. The present study hypothesized that students who received the LDS treatment would obtain significantly higher scores on the EKAT and the YES instruments. Results showed that significance was found (alpha level .05) on the EKAT and also found on the YES total subscale when covaried for effects of pretest, student major, and student section. When covarying the effects of pretest scores only, significance was found on the EKAT. All YES hypotheses were rejected.« less
Specifications for Managed Strings, Second Edition
2010-05-01
const char * cstr , const size_t maxsize, const char *charset); 10 | CMU/SEI-2010-TR-018 Runtime-Constraints s shall not be a null pointer...strcreate_m function creates a managed string, referenced by s, given a conventional string cstr (which may be null or empty). maxsize specifies the...characters to those in the null-terminated byte string cstr (which may be empty). If charset is a null pointer, no restricted character set is defined. If
Continuous development of current sheets near and away from magnetic nulls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Sanjay; Bhattacharyya, R.
2016-04-15
The presented computations compare the strength of current sheets which develop near and away from the magnetic nulls. To ensure the spontaneous generation of current sheets, the computations are performed congruently with Parker's magnetostatic theorem. The simulations evince current sheets near two dimensional and three dimensional magnetic nulls as well as away from them. An important finding of this work is in the demonstration of comparative scaling of peak current density with numerical resolution, for these different types of current sheets. The results document current sheets near two dimensional magnetic nulls to have larger strength while exhibiting a stronger scalingmore » than the current sheets close to three dimensional magnetic nulls or away from any magnetic null. The comparative scaling points to a scenario where the magnetic topology near a developing current sheet is important for energetics of the subsequent reconnection.« less
Experimental evaluation of achromatic phase shifters for mid-infrared starlight suppression.
Gappinger, Robert O; Diaz, Rosemary T; Ksendzov, Alexander; Lawson, Peter R; Lay, Oliver P; Liewer, Kurt M; Loya, Frank M; Martin, Stefan R; Serabyn, Eugene; Wallace, James K
2009-02-10
Phase shifters are a key component of nulling interferometry, one of the potential routes to enabling the measurement of faint exoplanet spectra. Here, three different achromatic phase shifters are evaluated experimentally in the mid-infrared, where such nulling interferometers may someday operate. The methods evaluated include the use of dispersive glasses, a through-focus field inversion, and field reversals on reflection from antisymmetric flat-mirror periscopes. All three approaches yielded deep, broadband, mid-infrared nulls, but the deepest broadband nulls were obtained with the periscope architecture. In the periscope system, average null depths of 4x10(-5) were obtained with a 25% bandwidth, and 2x10(-5) with a 20% bandwidth, at a central wavelength of 9.5 mum. The best short term nulls at 20% bandwidth were approximately 9x10(-6), in line with error budget predictions and the limits of the current generation of hardware.
Light cone structure near null infinity of the Kerr metric
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai Shan; Shang Yu; Graduate School of Chinese Academy of Sciences, Beijing, 100080
2007-02-15
Motivated by our attempt to understand the question of angular momentum of a relativistic rotating source carried away by gravitational waves, in the asymptotic regime near future null infinity of the Kerr metric, a family of null hypersurfaces intersecting null infinity in shearfree (good) cuts are constructed by means of asymptotic expansion of the eikonal equation. The geometry of the null hypersurfaces as well as the asymptotic structure of the Kerr metric near null infinity are studied. To the lowest order in angular momentum, the Bondi-Sachs form of the Kerr metric is worked out. The Newman-Unti formalism is then furthermore » developed, with which the Newman-Penrose constants of the Kerr metric are computed and shown to be zero. Possible physical implications of the vanishing of the Newman-Penrose constants of the Kerr metric are also briefly discussed.« less
Biostatistics Series Module 2: Overview of Hypothesis Testing.
Hazra, Avijit; Gogtay, Nithya
2016-01-01
Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. While this may be of utility in highlighting different aspects of the problem, merely reapplying different tests to the same issue in the hope of finding a P < 0.05 is a wrong use of statistics. Finally, it is becoming the norm that an estimate of the size of any effect, expressed with its 95% confidence interval, is required for meaningful interpretation of results. A large study is likely to have a small (and therefore "statistically significant") P value, but a "real" estimate of the effect would be provided by the 95% confidence interval. If the intervals overlap between two interventions, then the difference between them is not so clear-cut even if P < 0.05. The two approaches are now considered complementary to one another.
Biostatistics Series Module 2: Overview of Hypothesis Testing
Hazra, Avijit; Gogtay, Nithya
2016-01-01
Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test. While this may be of utility in highlighting different aspects of the problem, merely reapplying different tests to the same issue in the hope of finding a P < 0.05 is a wrong use of statistics. Finally, it is becoming the norm that an estimate of the size of any effect, expressed with its 95% confidence interval, is required for meaningful interpretation of results. A large study is likely to have a small (and therefore “statistically significant”) P value, but a “real” estimate of the effect would be provided by the 95% confidence interval. If the intervals overlap between two interventions, then the difference between them is not so clear-cut even if P < 0.05. The two approaches are now considered complementary to one another. PMID:27057011
Mohanty, S; Jermyn, K A; Early, A; Kawata, T; Aubry, L; Ceccarelli, A; Schaap, P; Williams, J G; Firtel, R A
1999-08-01
Dd-STATa is a structural and functional homologue of the metazoan STAT (Signal Transducer and Activator of Transcription) proteins. We show that Dd-STATa null cells exhibit several distinct developmental phenotypes. The aggregation of Dd-STATa null cells is delayed and they chemotax slowly to a cyclic AMP source, suggesting a role for Dd-STATa in these early processes. In Dd-STATa null strains, slug-like structures are formed but they have an aberrant pattern of gene expression. In such slugs, ecmB/lacZ, a marker that is normally specific for cells on the stalk cell differentiation pathway, is expressed throughout the prestalk region. Stalk cell differentiation in Dictyostelium has been proposed to be under negative control, mediated by repressor elements present in the promoters of stalk cell-specific genes. Dd-STATa binds these repressor elements in vitro and the ectopic expression of ecmB/lacZ in the null strain provides in vivo evidence that Dd-STATa is the repressor protein that regulates commitment to stalk cell differentiation. Dd-STATa null cells display aberrant behavior in a monolayer assay wherein stalk cell differentiation is induced using the stalk cell morphogen DIF. The ecmB gene, a general marker for stalk cell differentiation, is greatly overinduced by DIF in Dd-STATa null cells. Also, Dd-STATa null cells are hypersensitive to DIF for expression of ST/lacZ, a marker for the earliest stages in the differentiation of one of the stalk cell sub-types. We suggest that both these manifestations of DIF hypersensitivity in the null strain result from the balance between activation and repression of the promoter elements being tipped in favor of activation when the repressor is absent. Paradoxically, although Dd-STATa null cells are hypersensitive to the inducing effects of DIF and readily form stalk cells in monolayer assay, the Dd-STATa null cells show little or no terminal stalk cell differentiation within the slug. Dd-STATa null slugs remain developmentally arrested for several days before forming very small spore masses supported by a column of apparently undifferentiated cells. Thus, complete stalk cell differentiation appears to require at least two events: a commitment step, whereby the repression exerted by Dd-STATa is lifted, and a second step that is blocked in a Dd-STATa null organism. This latter step may involve extracellular cAMP, a known repressor of stalk cell differentiation, because Dd-STATa null cells are abnormally sensitive to the inhibitory effects of extracellular cyclic AMP.
Inman, Melissa; Perng, Guey-Chuen; Henderson, Gail; Ghiasi, Homayon; Nesburn, Anthony B.; Wechsler, Steven L.; Jones, Clinton
2001-01-01
The latency-associated transcript (LAT) is the only abundant herpes simplex virus type 1 (HSV-1) transcript expressed during latency. In the rabbit eye model, LAT null mutants do not reactivate efficiently from latency. We recently demonstrated that the LAT null mutant dLAT2903 induces increased levels of apoptosis in trigeminal ganglia of infected rabbits compared to LAT+ strains (G.-C. Perng, C. Jones, J. Ciacci-Zarella, M. Stone, G. Henderson, A. Yokht, S. M. Slanina, F. M. Hoffman, H. Ghiasi, A. B. Nesburn, and C. S. Wechsler, Science 287:1500–1503, 2000).The same study also demonstrated that a plasmid expressing LAT nucleotides 301 to 2659 enhanced cell survival of transfected cells after induction of apoptosis. Consequently, we hypothesized that LAT enhances spontaneous reactivation in part, because it promotes survival of infected neurons. Here we report on the ability of plasmids expressing different portions of the 5′ end of LAT to promote cell survival after induction of apoptosis. A plasmid expressing the first 1.5 kb of LAT (LAT nucleotides 1 to 1499) promoted cell survival in neuro-2A (mouse neuronal) and CV-1 (monkey fibroblast) cells. A plasmid expressing just the first 811 nucleotides of LAT promoted cell survival less efficiently. Plasmids expressing the first 661 nucleotides or less of LAT did not promote cell survival. We previously showed that a mutant expressing just the first 1.5 kb of LAT has wild-type spontaneous reactivation in rabbits, and a mutant expressing just the first 811 nucleotides of LAT has a reactivation frequency higher than that of dLAT2903 but lower than that of wild-type virus. In addition, mutants reported here for the first time, expressing just the first 661 or 76 nucleotides of LAT, had spontaneous reactivation indistinguishable from that of the LAT null mutant dLAT2903. In summary, these studies provide evidence that there is a functional relationship between the ability of LAT to promote cell survival and its ability to enhance spontaneous reactivation. PMID:11264353
The origin of nulls mode changes and timing noise in pulsars
NASA Astrophysics Data System (ADS)
Jones, P. B.
A solvable polar cap model obtained previously has normal states which may be associated with radio emission and null states. The solutions cannot be time-independent; the neutron star surface temperature T and mean surface nuclear charge Z are both functions of time. The normal and null states, and the transitions between them, form closed cycles in the T-Z plane. Normal-null transitions can occur inside a fraction of the area on the neutron star surface intersected by open magnetic flux lines. The fraction increases with pulsar period and becomes unity when the pulsar nears extinction. Frequency noise, mode changes, and pulse nulls have a common explanation in the transitions.
The origin of nulls, mode changes and timing noise in pulsars
NASA Astrophysics Data System (ADS)
Jones, P. B.
1982-09-01
A solvable polar cap model obtained previously has normal states which may be associated with radio emission, and null states. The solutions cannot be time-independent; the neutron star surface temperature T and mean surface nuclear charge Z are both functions of time. The normal and null states and the transitions between them, form closed cycles in the T-Z plane. Normal-null transitions can occur inside a fraction of the area of the neutron star surface intersected by open magnetic flux lines. The fraction increases with pulsar period and becomes unity when the pulsar nears extinction. Frequency noise, mode changes and pulse nulls have a common explanation in the transitions.
Moses, Tally
2015-05-01
Better understanding of the individual and environmental factors that promote adolescents' use of more or less adaptive coping strategies with mental illness stigma would inform interventions designed to bolster youth resilience. This cross-sectional study draws on data from research on adolescents' well-being after discharge from a first psychiatric hospitalization to explore the relationships between anticipated coping in reaction to a hypothetical social stigma scenario, and various factors conceptualized as 'coping resource' and 'coping vulnerability' factors. Focusing on coping strategies also identified in the companion article, we hypothesize that primary and secondary control engagement coping would relate to more coping resource and less coping vulnerability factors, and the opposite would be true for disengagement, aggression/confrontation and efforts to disconfirm stereotypes. Data were elicited from interviews with 102 adolescents within 7 days of discharge. Hypothesized coping resource factors included social resources, optimistic illness perceptions, better hospital experiences and higher self-esteem. Vulnerability factors included more previous stigma experiences, desire for concealment of treatment, more contingent self-worth, higher symptom levels and higher anticipated stress. Multivariate ordinary least squares (OLS) regression was used to analyze associations between coping strategy endorsement and correlates. Although some coping correlates 'behaved' contrary to expectations, for the most part, our hypotheses were confirmed. As expected, youth anticipating reacting to the stigmatizing situation with greater disengagement, aggression/confrontation or efforts to disconfirm stenotypes rated significantly lower on 'coping resources' such as self-esteem and higher on vulnerability factors such as symptom severity. The opposite was true for youth who anticipated exercising more primary and secondary control engagement coping. This study begins to identify factors that promote more and less adaptive coping strategies among youth at high risk for social stigma. Some factors that can be modified in the shorter term point to useful directions for clinical interventions. © The Author(s) 2014.
Measurement Via Optical Near-Nulling and Subaperture Stitching
NASA Technical Reports Server (NTRS)
Forbes, Greg; De Vries, Gary; Murphy, Paul; Brophy, Chris
2012-01-01
A subaperture stitching interferometer system provides near-nulling of a subaperture wavefront reflected from an object of interest over a portion of a surface of the object. A variable optical element located in the radiation path adjustably provides near-nulling to facilitate stitching of subaperture interferograms, creating an interferogram representative of the entire surface of interest. This enables testing of aspheric surfaces without null optics customized for each surface prescription. The surface shapes of objects such as lenses and other precision components are often measured with interferometry. However, interferometers have a limited capture range, and thus the test wavefront cannot be too different from the reference or the interference cannot be analyzed. Furthermore, the performance of the interferometer is usually best when the test and reference wavefronts are nearly identical (referred to as a null condition). Thus, it is necessary when performing such measurements to correct for known variations in shape to ensure that unintended variations are within the capture range of the interferometer and accurately measured. This invention is a system for nearnulling within a subaperture stitching interferometer, although in principle, the concept can be employed by wavefront measuring gauges other than interferometers. The system employs a light source for providing coherent radiation of a subaperture extent. An object of interest is placed to modify the radiation (e.g., to reflect or pass the radiation), and a variable optical element is located to interact with, and nearly null, the affected radiation. A detector or imaging device is situated to obtain interference patterns in the modified radiation. Multiple subaperture interferograms are taken and are stitched, or joined, to provide an interferogram representative of the entire surface of the object of interest. The primary aspect of the invention is the use of adjustable corrective optics in the context of subaperture stitching near-nulling interferometry, wherein a complex surface is analyzed via multiple, separate, overlapping interferograms. For complex surfaces, the problem of managing the identification and placement of corrective optics becomes even more pronounced, to the extent that in most cases the null corrector optics are specific to the particular asphere prescription and no others (i.e. another asphere requires completely different null correction optics). In principle, the near-nulling technique does not require subaperture stitching at all. Building a near-null system that is practically useful relies on two key features: simplicity and universality. If the system is too complex, it will be difficult to calibrate and model its manufacturing errors, rendering it useless as a precision metrology tool and/or prohibitively expensive. If the system is not applicable to a wide range of test parts, then it does not provide significant value over conventional null-correction technology. Subaperture stitching enables simpler and more universal near-null systems to be effective, because a fraction of a surface is necessarily less complex than the whole surface (excepting the extreme case of a fractal surface description). The technique of near-nulling can significantly enhance aspheric subaperture stitching capability by allowing the interferometer to capture a wider range of aspheres. More over, subaperture stitching is essential to a truly effective near-nulling system, since looking at a fraction of the surface keeps the wavefront complexity within the capability of a relatively simple nearnull apparatus. Furthermore, by reducing the subaperture size, the complexity of the measured wavefront can be reduced until it is within the capability of the near-null design.
Terrestrial Planet Finder Interferometer Technology Status and Plans
NASA Technical Reports Server (NTRS)
Lawson, Perter R.; Ahmed, A.; Gappinger, R. O.; Ksendzov, A.; Lay, O. P.; Martin, S. R.; Peters, R. D.; Scharf, D. P.; Wallace, J. K.; Ware, B.
2006-01-01
A viewgraph presentation on the technology status and plans for Terrestrial Planet Finder Interferometer is shown. The topics include: 1) The Navigator Program; 2) TPF-I Project Overview; 3) Project Organization; 4) Technology Plan for TPF-I; 5) TPF-I Testbeds; 6) Nulling Error Budget; 7) Nulling Testbeds; 8) Nulling Requirements; 9) Achromatic Nulling Testbed; 10) Single Mode Spatial Filter Technology; 11) Adaptive Nuller Testbed; 12) TPF-I: Planet Detection Testbed (PDT); 13) Planet Detection Testbed Phase Modulation Experiment; and 14) Formation Control Testbed.
Coupling Spatiotemporal Community Assembly Processes to Changes in Microbial Metabolism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Emily B.; Crump, Alex R.; Resch, Charles T.
Community assembly processes govern shifts in species abundances in response to environmental change, yet our understanding of assembly remains largely decoupled from ecosystem function. Here, we test hypotheses regarding assembly and function across space and time using hyporheic microbial communities as a model system. We pair sampling of two habitat types through hydrologic fluctuation with null modeling and multivariate statistics. We demonstrate that dual selective pressures assimilate to generate compositional changes at distinct timescales among habitat types, resulting in contrasting associations of Betaproteobacteria and Thaumarchaeota with selection and with seasonal changes in aerobic metabolism. Our results culminate in a conceptualmore » model in which selection from contrasting environments regulates taxon abundance and ecosystem function through time, with increases in function when oscillating selection opposes stable selective pressures. Our model is applicable within both macrobial and microbial ecology and presents an avenue for assimilating community assembly processes into predictions of ecosystem function.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tu Qisheng; Valverde, Paloma; Chen, Jake
Osterix (Osx) is a zinc-finger-containing transcription factor that is expressed in osteoblasts of all endochondral and membranous bones. In Osx null mice osteoblast differentiation is impaired and bone formation is absent. In this study, we hypothesized that overexpression of Osx in murine bone marrow stromal cells (BMSC) would be able to enhance their osteoblastic differentiation and mineralization in vitro. Retroviral transduction of Osx in BMSC cultured in non-differentiating medium did not affect expression of Runx2/Cbfa1, another key transcription factor of osteoblast differentiation, but induced an increase in the expression of other markers associated with the osteoblastic lineage including alkaline phosphatase,more » bone sialoprotein, osteocalcin, and osteopontin. Retroviral transduction of Osx in BMSC also increased their proliferation, alkaline phosphatase activity, and ability to form bone nodules. These events occurred without significant changes in the expression of {alpha}1(II) procollagen or lipoprotein lipase, which are markers of chondrogenic and adipogenic differentiation, respectively.« less
From Discovery to Justification: Outline of an Ideal Research Program in Empirical Psychology
Witte, Erich H.; Zenker, Frank
2017-01-01
The gold standard for an empirical science is the replicability of its research results. But the estimated average replicability rate of key-effects that top-tier psychology journals report falls between 36 and 39% (objective vs. subjective rate; Open Science Collaboration, 2015). So the standard mode of applying null-hypothesis significance testing (NHST) fails to adequately separate stable from random effects. Therefore, NHST does not fully convince as a statistical inference strategy. We argue that the replicability crisis is “home-made” because more sophisticated strategies can deliver results the successful replication of which is sufficiently probable. Thus, we can overcome the replicability crisis by integrating empirical results into genuine research programs. Instead of continuing to narrowly evaluate only the stability of data against random fluctuations (discovery context), such programs evaluate rival hypotheses against stable data (justification context). PMID:29163256
Taoist Tai Chi® and Memory Intervention for Individuals with Mild Cognitive Impairment.
Fogarty, Jennifer N; Murphy, Kelly J; McFarlane, Bruce; Montero-Odasso, Manuel; Wells, Jennie; Troyer, Angela K; Trinh, Daniel; Gutmanis, Iris; Hansen, Kevin T
2016-04-01
It was hypothesized that a combined Taoist Tai Chi (TTC) and a memory intervention program (MIP) would be superior to a MIP alone in improving everyday memory behaviors in individuals with amnestic mild cognitive impairment (aMCI). A secondary hypothesis was that TTC would improve cognition, self-reported health status, gait, and balance. A total of 48 individuals were randomly assigned to take part in MIP + TTC or MIP alone. The TTC intervention consisted of twenty 90 min sessions. Outcome measures were given at baseline, and after 10 and 22 weeks. Both groups significantly increased their memory strategy knowledge and use, ratings of physical health, processing speed, everyday memory, and visual attention. No preferential benefit was found for individuals in the MIP + TTC group on cognition, gait, or balance measures. Contrary to expectations, TTC exercise did not specifically improve cognition or physical mobility. Explanations for null findings are explored.
Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2013-01-01
A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypotheses that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.
Behavioral determinants of occupational exposure to chemical agents.
Meijman, T F; Ulenbelt, P; Lumens, M E; Herber, R F
1996-01-01
In the demand-control model (see T. Theorell & R. A. Karasek, 1996), it is hypothesized that workers in active jobs (high demands-high decision latitude) can exert effective coping strategies when confronted with environmental stessors. Thus, when exposed to similar levels of a chemical agent, lower concentrations of this agent in blood could be expected in these workers in comparison with workers in passive jobs. This theory was tested in 2 studies of lead-exposed workers: 18 male Caucasian workers from an electric accumulatory factory and 18 male Caucasian workers from a lead smelting factory. The results did not follow the hypothesized outcomes. In the work environment of the workers in active jobs, lower concentrations of lead in air were measured, but higher levels of lead in blood were observed in these workers. The opposite was true of workers in passive jobs. Differences in hygienic behavior at work may explain these unexpected results.
Bias neglect: a blind spot in the evaluation of scientific results.
Strickland, Brent; Mercier, Hugo
2014-01-01
Experimenter bias occurs when scientists' hypotheses influence their results, even if involuntarily. Meta-analyses have suggested that in some domains, such as psychology, up to a third of the studies could be unreliable due to such biases. A series of experiments demonstrates that while people are aware of the possibility that scientists can be more biased when the conclusions of their experiments fit their initial hypotheses, they robustly fail to appreciate that they should also be more sceptical of such results. This is true even when participants read descriptions of studies that have been shown to be biased. Moreover, participants take other sources of bias-such as financial incentives-into account, showing that this bias neglect may be specific to theory-driven hypothesis testing. In combination with a common style of scientific reporting, bias neglect could lead the public to accept premature conclusions.
Surfactant-Associated Protein A Provides Critical Immunoprotection in Neonatal Mice▿
George, Caroline L. S.; Goss, Kelli L.; Meyerholz, David K.; Lamb, Fred S.; Snyder, Jeanne M.
2008-01-01
The collectins surfactant-associated protein A (SP-A) and SP-D are components of innate immunity that are present before birth. Both proteins bind pathogens and assist in clearing infection. The significance of SP-A and SP-D as components of the neonatal immune system has not been investigated. To determine the role of SP-A and SP-D in neonatal immunity, wild-type, SP-A null, and SP-D null mice were bred in a bacterium-laden environment (corn dust bedding) or in a semisterile environment (cellulose fiber bedding). When reared in the corn dust bedding, SP-A null pups had significant mortality (P < 0.001) compared to both wild-type and SP-D null pups exposed to the same environment. The mortality of the SP-A null pups was associated with significant gastrointestinal tract pathology but little lung pathology. Moribund SP-A null newborn mice exhibited Bacillus sp. and Enterococcus sp. peritonitis. When the mother or newborn produced SP-A, newborn survival was significantly improved (P < 0.05) compared to the results when there was a complete absence of SP-A in both the mother and the pup. Significant sources of SP-A likely to protect a newborn include the neonatal lung and gastrointestinal tract but not the lactating mammary tissue of the mother. Furthermore, exogenous SP-A delivered by mouth to newborn SP-A null pups with SP-A null mothers improved newborn survival in the corn dust environment. Therefore, a lack of SP-D did not affect newborn survival, while SP-A produced by either the mother or the pup or oral exogenous SP-A significantly reduced newborn mortality associated with environmentally induced infection in SP-A null newborns. PMID:17967856
Simón, Oihane; Williams, Trevor; Asensio, Aaron C.; Ros, Sarhay; Gaya, Andrea; Caballero, Primitivo; Possee, Robert D.
2008-01-01
The genome of Spodoptera frugiperda multiple nucleopolyhedrovirus (NPV) was inserted into a bacmid (Sfbac) and used to produce a mutant lacking open reading frame 29 (Sf29null). Sf29null bacmid DNA was able to generate an infection in S. frugiperda. Approximately six times less DNA was present in occlusion bodies (OBs) produced by the Sf29null bacmid in comparison to viruses containing this gene. This reduction in DNA content was consistent with fewer virus particles being packaged within Sf29null bacmid OBs, as determined by fractionation of dissolved polyhedra and comparison of occlusion-derived virus (ODV) infectivity in cell culture. DNA from Sfbac, Sf29null, or Sf29null-repair, in which the gene deletion had been repaired, were equally infectious when used to transfect S. frugiperda. All three viruses produced similar numbers of OBs, although those from Sf29null were 10-fold less infectious than viruses with the gene. Insects infected with Sf29null bacmid died ∼24 h later than positive controls, consistent with the reduced virus particle content of Sf29null OBs. Transcripts from Sf29 were detected in infected insects 12 h prior to those from the polyhedrin gene. Homologs to Sf29 were present in other group II NPVs, and similar sequences were present in entomopoxviruses. Analysis of the Sf29 predicted protein sequence revealed signal peptide and transmembrane domains, but the presence of 12 potential N-glycosylation sites suggest that it is not an ODV envelope protein. Other motifs, including zinc-binding and threonine-rich regions, suggest degradation and adhesion functions. We conclude that Sf29 is a viral factor that determines the number of ODVs occluded in each OB. PMID:18550678
Mota, Linda C.; Hernandez, Juan P.
2010-01-01
Constitutive androgen receptor (CAR) is activated by several chemicals and in turn regulates multiple detoxification genes. Our research demonstrates that parathion is one of the most potent, environmentally relevant CAR activators with an EC50 of 1.43 μM. Therefore, animal studies were conducted to determine whether CAR was activated by parathion in vivo. Surprisingly, CAR-null mice, but not wild-type (WT) mice, showed significant parathion-induced toxicity. However, parathion did not induce Cyp2b expression, suggesting that parathion is not a CAR activator in vivo, presumably because of its short half-life. CAR expression is also associated with the expression of several drug-metabolizing cytochromes P450 (P450). CAR-null mice demonstrate lower expression of Cyp2b9, Cyp2b10, Cyp2c29, and Cyp3a11 primarily, but not exclusively in males. Therefore, we incubated microsomes from untreated WT and CAR-null mice with parathion in the presence of esterase inhibitors to determine whether CAR-null mice show perturbed P450-mediated parathion metabolism compared with that in WT mice. The metabolism of parathion to paraoxon and p-nitrophenol (PNP) was reduced in CAR-null mice with male CAR-null mice showing reduced production of both paraoxon and PNP, and female CAR-null mice showing reduced production of only PNP. Overall, the data indicate that CAR-null mice metabolize parathion slower than WT mice. These results provide a potential mechanism for increased sensitivity of individuals with lower CAR activity such as newborns to parathion and potentially other chemicals due to decreased metabolic capacity. PMID:20573718
McFarland, Daniel C; Holland, Jimmie; Holcombe, Randall F
2015-07-01
The demand for hematologists and oncologists is not being met. We hypothesized that an inpatient hematology-oncology ward rotation would increase residents' interest. Potential reasons mitigating interest were explored and included differences in physician distress, empathy, resilience, and patient death experiences. Agreement with the statement "I am interested in pursuing a career/fellowship in hematology and oncology" was rated by residents before and after a hematology-oncology rotation, with 0 = not true at all, 1 = rarely true, 2 = sometimes true, 3 = often true, and 4 = true nearly all the time. House staff rotating on a hematology-oncology service from November 2013 to October 2014 also received questionnaires before and after their rotations containing the Connors-Davidson Resilience Scale, the Impact of Events Scale-Revised, the Interpersonal Reactivity Index, demographic information, and number of dying patients cared for and if a sense of meaning was derived from that experience. Fifty-six residents completed both before- and after-rotation questionnaires (response rate, 58%). The mean interest score was 1.43 initially and decreased to 1.24 after the rotation (P = .301). Female residents' mean score was 1.13 initially and dropped to 0.81 after the rotation (P = .04). Male residents' mean score was 1.71 initially and 1.81 after the rotation (P = .65). Decreased hematology-oncology interest correlated with decreased empathy; male interest decrease correlated with decreased resilience. An inpatient hematology-oncology ward rotation does not lead to increased interest and, for some residents, may lead to decreased interest in the field. Encouraging outpatient hematology-oncology rotations and the cultivation of resilience, empathy, and meaning regarding death experiences may increase resident interest. Copyright © 2015 by American Society of Clinical Oncology.
Current Structure and Nonideal Behavior at Magnetic Null Points in the Turbulent Magnetosheath
NASA Technical Reports Server (NTRS)
Wendel, D. E.; Adrian, M. L.
2013-01-01
The Poincaré index indicates that the Cluster spacecraft tetrahedron entraps a number of 3-D magnetic nulls during an encounter with the turbulent magnetosheath. Previous researchers have found evidence for reconnection at one of the many filamentary current layers observed by Cluster in this region. We find that many of the entrained nulls are also associated with strong currents. We dissect the current structure of a pair of spiral nulls that may be topologically connected. At both nulls, we find a strong current along the spine, accompanied by a somewhat more modest current perpendicular to the spine that tilts the fan toward the axis of the spine. The current along the fan is comparable to the that along the spine. At least one of the nulls manifests a rotational flow pattern in the fan plane that is consistent with torsional spine reconnection as predicted by theory. These results emphasize the importance of examining the magnetic topology in interpreting the nature of currents and reconnection in 3-D turbulence.
Manfroi, Silvia; Scarcello, Antonio; Pagliaro, Pasqualepaolo
2015-10-01
Molecular genetic studies on Duffy blood group antigens have identified mutations underlying rare FY*Null and FY*X alleles. FY*Null has a high frequency in Blacks, especially from sub-Saharan Africa, while its frequency is not defined in Caucasians. FY*X allele, associated with Fy(a-b+w) phenotype, has a frequency of 2-3.5% in Caucasian people while it is absent in Blacks. During the project of extensive blood group genotyping in patients affected by hemoglobinopathies, we identified FY*X/FY*Null and FY*A/FY*Null genotypes in a Caucasian thalassemic family from Sardinia. We speculate on the frequency of FY*X and FY*Null alleles in Caucasian and Black people; further, we focused on the association of FY*X allele with weak Fyb antigen expression on red blood cells and its identification performing high sensitivity serological typing methods or genotyping. Copyright © 2015 Elsevier Ltd. All rights reserved.
Position sensor for a fuel injection element in an internal combustion engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fulkerson, D.E.; Geske, M.L.
1987-08-18
This patent describes an electronic circuit for dynamically sensing and processing signals representative of changes in a magnet field, the circuit comprising: means for sensing a change in a magnetic field external to the circuit and providing an output representative of the change; circuit means electronically coupled with the output of the sensing means for providing an output indicating the presence of the magnetic field change; and a nulling circuit coupled with the output of the sensing means and across the indicating circuit means for nulling the electronic circuit responsive to the sensing means output, to thereby avoid ambient magneticmore » fields temperature and process variations, and wherein the nulling circuit comprises a capacitor coupled to the output of the nulling circuit, means for charging and discharging the capacitor responsive to any imbalance in the input to the nulling circuit, and circuit means coupling the capacitor with the output of the sensing means for nulling any imbalance during the charging or discharging of the capacitor.« less
Naked singularity resolution in cylindrical collapse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurita, Yasunari; Yukawa Institute for Theoretical Physics, Kyoto University, Kyoto, 606-8502; Nakao, Ken-ichi
In this paper, we study the gravitational collapse of null dust in cylindrically symmetric spacetime. The naked singularity necessarily forms at the symmetry axis. We consider the situation in which null dust is emitted again from the naked singularity formed by the collapsed null dust and investigate the backreaction by this emission for the naked singularity. We show a very peculiar but physically important case in which the same amount of null dust as that of the collapsed one is emitted from the naked singularity as soon as the ingoing null dust hits the symmetry axis and forms the nakedmore » singularity. In this case, although this naked singularity satisfies the strong curvature condition by Krolak (limiting focusing condition), geodesics which hit the singularity can be extended uniquely across the singularity. Therefore, we may say that the collapsing null dust passes through the singularity formed by itself and then leaves for infinity. Finally, the singularity completely disappears and the flat spacetime remains.« less
Evaluation of null-point detection methods on simulation data
NASA Astrophysics Data System (ADS)
Olshevsky, Vyacheslav; Fu, Huishan; Vaivads, Andris; Khotyaintsev, Yuri; Lapenta, Giovanni; Markidis, Stefano
2014-05-01
We model the measurements of artificial spacecraft that resemble the configuration of CLUSTER propagating in the particle-in-cell simulation of turbulent magnetic reconnection. The simulation domain contains multiple isolated X-type null-points, but the majority are O-type null-points. Simulations show that current pinches surrounded by twisted fields, analogous to laboratory pinches, are formed along the sequences of O-type nulls. In the simulation, the magnetic reconnection is mainly driven by the kinking of the pinches, at spatial scales of several ion inertial lentghs. We compute the locations of magnetic null-points and detect their type. When the satellites are separated by the fractions of ion inertial length, as it is for CLUSTER, they are able to locate both the isolated null-points, and the pinches. We apply the method to the real CLUSTER data and speculate how common are pinches in the magnetosphere, and whether they play a dominant role in the dissipation of magnetic energy.
Ioannidis, John P. A.
2017-01-01
A typical rule that has been used for the endorsement of new medications by the Food and Drug Administration is to have two trials, each convincing on its own, demonstrating effectiveness. “Convincing” may be subjectively interpreted, but the use of p-values and the focus on statistical significance (in particular with p < .05 being coined significant) is pervasive in clinical research. Therefore, in this paper, we calculate with simulations what it means to have exactly two trials, each with p < .05, in terms of the actual strength of evidence quantified by Bayes factors. Our results show that different cases where two trials have a p-value below .05 have wildly differing Bayes factors. Bayes factors of at least 20 in favor of the alternative hypothesis are not necessarily achieved and they fail to be reached in a large proportion of cases, in particular when the true effect size is small (0.2 standard deviations) or zero. In a non-trivial number of cases, evidence actually points to the null hypothesis, in particular when the true effect size is zero, when the number of trials is large, and when the number of participants in both groups is low. We recommend use of Bayes factors as a routine tool to assess endorsement of new medications, because Bayes factors consistently quantify strength of evidence. Use of p-values may lead to paradoxical and spurious decision-making regarding the use of new medications. PMID:28273140
Jurkowska, Halina; Niewiadomski, Julie; Hirschberger, Lawrence L.; Roman, Heather B.; Mazor, Kevin M.; Liu, Xiaojing; Locasale, Jason W.; Park, Eunkyue
2016-01-01
The cysteine dioxygenase (Cdo1)-null and the cysteine sulfinic acid decarboxylase (Csad)-null mouse are not able to synthesize hypotaurine/taurine by the cysteine/cysteine sulfinate pathway and have very low tissue taurine levels. These mice provide excellent models for studying the effects of taurine on biological processes. Using these mouse models, we identified betaine:homocysteine methyltransferase (BHMT) as a protein whose in vivo expression is robustly regulated by taurine. BHMT levels are low in liver of both Cdo1-null and Csad-null mice, but are restored to wild-type levels by dietary taurine supplementation. A lack of BHMT activity was indicated by an increase in the hepatic betaine level. In contrast to observations in liver of Cdo1-null and Csad-null mice, BHMT was not affected by taurine supplementation of primary hepatocytes from these mice. Likewise, CSAD abundance was not affected by taurine supplementation of primary hepatocytes, although it was robustly upregulated in liver of Cdo1-null and Csad-null mice and lowered to wild-type levels by dietary taurine supplementation. The mechanism by which taurine status affects hepatic CSAD and BHMT expression appears to be complex and to require factors outside of hepatocytes. Within the liver, mRNA abundance for both CSAD and BHMT was upregulated in parallel with protein levels, indicating regulation of BHMT and CSAD mRNA synthesis or degradation. PMID:26481005
Timko, C Alix; Hormes, Julia M; Chubski, Janice
2012-06-01
Adherence to a vegetarian diet has been hypothesized to be a factor in the onset and maintenance of disordered eating behavior; however, evidence to support this assumption has been largely mixed. The two studies presented here sought to address the causes of inconsistent findings in previous research, including: small samples of true vegetarians, lack of appropriate operational definitions of "vegetarianism", and uncertainty about the appropriateness of existing assessments of eating behaviors for semi-vegetarians. Study 1 assessed eating behaviors in the largest samples of confirmed true vegetarians and vegans surveyed to date, and compared them to semi-vegetarians and omnivores. Semi-vegetarians reported the highest levels of eating-related pathology; true vegetarians and vegans appeared to be healthiest in regards to weight and eating. Study 2 examined differences between semi-vegetarians and omnivores in terms of restraint and disordered eating and found little evidence for more eating-related pathology in semi-vegetarians, compared to omnivores. Semi-vegetarians' higher scores on traditional assessments of eating behaviors appeared artificially inflated by ratings of items assessing avoidance of specific food items which should be considered normative in the context of a vegetarian diet. Findings shed light on the sources of inconsistencies in prior research on eating behaviors in vegetarians and suggest that semi-vegetarianism - as opposed to true vegetarianism or veganism - is the most likely related to disordered eating. Copyright © 2012 Elsevier Ltd. All rights reserved.
Epidemiologic studies of electric and magnetic fields and cancer: Strategies for extending knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savitz, D.A.
1993-12-01
Epidemiologic research concerning electric and magnetic fields in relation to cancer has focused on the potential etiologic roles of residential exposure on childhood cancer and occupational exposure on adult leukemia and brain cancer. Future residential studies must concentrate on exposure assessment that is enhanced by developing models of historical exposure, assessment of the relation between magnetic fields and wire codes, and consideration of alternate exposure indices. Study design issues deserving attention include possible biases in random digit dialing control selection, consideration of the temporal course of exposure and disease, and acquisition of the necessary information to assess the potential valuemore » of ecologic studies. Highest priorities are comprehensive evaluation of exposure patterns and sources and examination of the sociology and geography of residential wire codes. Future occupational studies should also concentrate on improved exposure assessment with increased attention to nonutility worker populations and development of historical exposure indicators that are superior to job titles alone. Potential carcinogens in the workplace that could act as confounders need to be more carefully examined. The temporal relation between exposure and disease and possible effect modification by other workplace agents should be incorporated into future studies. The most pressing need is for measurement of exposure patterns in a variety of worker populations and performance of traditional epidemiologic evaluations of cancer occurrence. The principal source of bias toward the null is nondifferential misclassification of exposure with improvements expected to enhance any true etiologic association that is present. Biases away from the null might include biased control selection in residential studies and chemical carcinogens acting as confounders in occupational studies. 51 refs., 1 tab.« less
Interpreting observational studies: why empirical calibration is needed to correct p-values
Schuemie, Martijn J; Ryan, Patrick B; DuMouchel, William; Suchard, Marc A; Madigan, David
2014-01-01
Often the literature makes assertions of medical product effects on the basis of ‘ p < 0.05’. The underlying premise is that at this threshold, there is only a 5% probability that the observed effect would be seen by chance when in reality there is no effect. In observational studies, much more than in randomized trials, bias and confounding may undermine this premise. To test this premise, we selected three exemplar drug safety studies from literature, representing a case–control, a cohort, and a self-controlled case series design. We attempted to replicate these studies as best we could for the drugs studied in the original articles. Next, we applied the same three designs to sets of negative controls: drugs that are not believed to cause the outcome of interest. We observed how often p < 0.05 when the null hypothesis is true, and we fitted distributions to the effect estimates. Using these distributions, we compute calibrated p-values that reflect the probability of observing the effect estimate under the null hypothesis, taking both random and systematic error into account. An automated analysis of scientific literature was performed to evaluate the potential impact of such a calibration. Our experiment provides evidence that the majority of observational studies would declare statistical significance when no effect is present. Empirical calibration was found to reduce spurious results to the desired 5% level. Applying these adjustments to literature suggests that at least 54% of findings with p < 0.05 are not actually statistically significant and should be reevaluated. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:23900808
A system for 3D representation of burns and calculation of burnt skin area.
Prieto, María Felicidad; Acha, Begoña; Gómez-Cía, Tomás; Fondón, Irene; Serrano, Carmen
2011-11-01
In this paper a computer-based system for burnt surface area estimation (BAI), is presented. First, a 3D model of a patient, adapted to age, weight, gender and constitution is created. On this 3D model, physicians represent both burns as well as burn depth allowing the burnt surface area to be automatically calculated by the system. Each patient models as well as photographs and burn area estimation can be stored. Therefore, these data can be included in the patient's clinical records for further review. Validation of this system was performed. In a first experiment, artificial known sized paper patches were attached to different parts of the body in 37 volunteers. A panel of 5 experts diagnosed the extent of the patches using the Rule of Nines. Besides, our system estimated the area of the "artificial burn". In order to validate the null hypothesis, Student's t-test was applied to collected data. In addition, intraclass correlation coefficient (ICC) was calculated and a value of 0.9918 was obtained, demonstrating that the reliability of the program in calculating the area is of 99%. In a second experiment, the burnt skin areas of 80 patients were calculated using BAI system and the Rule of Nines. A comparison between these two measuring methods was performed via t-Student test and ICC. The hypothesis of null difference between both measures is only true for deep dermal burns and the ICC is significantly different, indicating that the area estimation calculated by applying classical techniques can result in a wrong diagnose of the burnt surface. Copyright © 2011 Elsevier Ltd and ISBI. All rights reserved.
NASA Astrophysics Data System (ADS)
Will, Clifford M.; Wiseman, Alan G.
1996-10-01
We derive the gravitational waveform and gravitational-wave energy flux generated by a binary star system of compact objects (neutron stars or black holes), accurate through second post-Newtonian order (O[(v/c)4]=O[(Gm/rc2)2]) beyond the lowest-order quadrupole approximation. We cast the Einstein equations into the form of a flat-spacetime wave equation together with a harmonic gauge condition, and solve it formally as a retarded integral over the past null cone of the chosen field point. The part of this integral that involves the matter sources and the near-zone gravitational field is evaluated in terms of multipole moments using standard techniques; the remainder of the retarded integral, extending over the radiation zone, is evaluated in a novel way. The result is a manifestly convergent and finite procedure for calculating gravitational radiation to arbitrary orders in a post-Newtonian expansion. Through second post-Newtonian order, the radiation is also shown to propagate toward the observer along true null rays of the asymptotically Schwarzschild spacetime, despite having been derived using flat-spacetime wave equations. The method cures defects that plagued previous ``brute-force'' slow-motion approaches to the generation of gravitational radiation, and yields results that agree perfectly with those recently obtained by a mixed post-Minkowskian post-Newtonian method. We display explicit formulas for the gravitational waveform and the energy flux for two-body systems, both in arbitrary orbits and in circular orbits. In an appendix, we extend the formalism to bodies with finite spatial extent, and derive the spin corrections to the waveform and energy loss.
The distribution of genetic variance across phenotypic space and the response to selection.
Blows, Mark W; McGuigan, Katrina
2015-05-01
The role of adaptation in biological invasions will depend on the availability of genetic variation for traits under selection in the new environment. Although genetic variation is present for most traits in most populations, selection is expected to act on combinations of traits, not individual traits in isolation. The distribution of genetic variance across trait combinations can be characterized by the empirical spectral distribution of the genetic variance-covariance (G) matrix. Empirical spectral distributions of G from a range of trait types and taxa all exhibit a characteristic shape; some trait combinations have large levels of genetic variance, while others have very little genetic variance. In this study, we review what is known about the empirical spectral distribution of G and show how it predicts the response to selection across phenotypic space. In particular, trait combinations that form a nearly null genetic subspace with little genetic variance respond only inconsistently to selection. We go on to set out a framework for understanding how the empirical spectral distribution of G may differ from the random expectations that have been developed under random matrix theory (RMT). Using a data set containing a large number of gene expression traits, we illustrate how hypotheses concerning the distribution of multivariate genetic variance can be tested using RMT methods. We suggest that the relative alignment between novel selection pressures during invasion and the nearly null genetic subspace is likely to be an important component of the success or failure of invasion, and for the likelihood of rapid adaptation in small populations in general. © 2014 John Wiley & Sons Ltd.
Hodge, Greg; Holmes, Mark; Jersmann, Hubertus; Reynolds, Paul N; Hodge, Sandra
2014-05-15
We have shown that chronic obstructive pulmonary disease (COPD) is associated with increased production of pro-inflammatory cytokines and the cytotoxic mediator, granzyme B by peripheral blood steroid resistant CD28nullCD137 + CD8+ T cells and granzyme B by NKT-like and NK cells. We hypothesized that we could target these pro-inflammatory/cytotoxic lymphocytes by inhibiting co-stimulation through CD137. Isolated PBMC from patients with COPD and healthy controls were stimulated with phytohaemagglutinin (PHA) ± blocking anti-CD137 ± 10(-6) M methylprednislone (MP) (±stimulatory anti-CD137 ± control antibodies). Pro-inflammatory cytokine profiles and expression of granzyme B, by T, NKT-like CD28 ± subsets and NK cells were determined using flow cytometry. There was a significant decrease in the percentage of T, NKT-like subsets and NK cells producing IFNγ, TNFα and granzyme B in all subjects in the presence of anti-CD137 blocking antibody compared with PHA alone (eg, 60% decrease in CD8 + granzyme B + cells) or MP. Stimulatory anti-CD137 was associated with an increase in the percentage of pro-inflammatory/cytotoxic cells. The inhibitory effect of anti-CD137 on IFNγ, TNFα and granzyme B production by CD28null cells was greater than by CD28+ cells. Blocking CD137 expression is associated with downregulation of IFNγ, TNFα and granzyme B by CD8+ T and NKT-like and NK cells. Targeting CD137 may have novel therapeutic implications for patients with COPD.
Eichmann, Anne; Corbel, Catherine; Nataf, Valérie; Vaigot, Pierre; Bréant, Christiane; Le Douarin, Nicole M.
1997-01-01
The existence of a common precursor for endothelial and hemopoietic cells, termed the hemangioblast, has been postulated since the beginning of the century. Recently, deletion of the endothelial-specific vascular endothelial growth factor receptor 2 (VEGFR2) by gene targeting has shown that both endothelial and hemopoietic cells are absent in homozygous null mice. This observation suggested that VEGFR2 could be expressed by the hemangioblast and essential for its further differentiation along both lineages. However, it was not possible to exclude the hypothesis that hemopoietic failure was a secondary effect resulting from the absence of an endothelial cell microenvironment. To distinguish between these two hypotheses, we have produced a mAb directed against the extracellular domain of avian VEGFR2 and isolated VEGFR2+ cells from the mesoderm of chicken embryos at the gastrulation stage. We have found that in clonal cultures, a VEGFR2+ cell gives rise to either a hemopoietic or an endothelial cell colony. The developmental decision appears to be regulated by the binding of two different VEGFR2 ligands. Thus, endothelial differentiation requires VEGF, whereas hemopoietic differentiation occurs in the absence of VEGF and is significantly reduced by soluble VEGFR2, showing that this process could be mediated by a second, yet unidentified, VEGFR2 ligand. These observations thus suggest strongly that in the absence of the VEGFR2 gene product, the precursors of both hemopoietic and vascular endothelial lineages cannot survive. These cells therefore might be the initial targets of the VEGFR2 null mutation. PMID:9144204
Holsclaw, Julie Korda; Sekelsky, Jeff
2017-05-01
DNA double-strand breaks (DSBs) pose a serious threat to genomic integrity. If unrepaired, they can lead to chromosome fragmentation and cell death. If repaired incorrectly, they can cause mutations and chromosome rearrangements. DSBs are repaired using end-joining or homology-directed repair strategies, with the predominant form of homology-directed repair being synthesis-dependent strand annealing (SDSA). SDSA is the first defense against genomic rearrangements and information loss during DSB repair, making it a vital component of cell health and an attractive target for chemotherapeutic development. SDSA has also been proposed to be the primary mechanism for integration of large insertions during genome editing with CRISPR/Cas9. Despite the central role for SDSA in genome stability, little is known about the defining step: annealing. We hypothesized that annealing during SDSA is performed by the annealing helicase SMARCAL1, which can anneal RPA-coated single DNA strands during replication-associated DNA damage repair. We used unique genetic tools in Drosophila melanogaster to test whether the fly ortholog of SMARCAL1, Marcal1, mediates annealing during SDSA. Repair that requires annealing is significantly reduced in Marcal1 null mutants in both synthesis-dependent and synthesis-independent (single-strand annealing) assays. Elimination of the ATP-binding activity of Marcal1 also reduced annealing-dependent repair, suggesting that the annealing activity requires translocation along DNA. Unlike the null mutant, however, the ATP-binding defect mutant showed reduced end joining, shedding light on the interaction between SDSA and end-joining pathways. Copyright © 2017 by the Genetics Society of America.
De Luca, Chiara; Chung Sheun Thai, Jeffrey; Raskovic, Desanka; Cesareo, Eleonora; Caccamo, Daniela; Trukhanov, Arseny
2014-01-01
Growing numbers of “electromagnetic hypersensitive” (EHS) people worldwide self-report severely disabling, multiorgan, non-specific symptoms when exposed to low-dose electromagnetic radiations, often associated with symptoms of multiple chemical sensitivity (MCS) and/or other environmental “sensitivity-related illnesses” (SRI). This cluster of chronic inflammatory disorders still lacks validated pathogenetic mechanism, diagnostic biomarkers, and management guidelines. We hypothesized that SRI, not being merely psychogenic, may share organic determinants of impaired detoxification of common physic-chemical stressors. Based on our previous MCS studies, we tested a panel of 12 metabolic blood redox-related parameters and of selected drug-metabolizing-enzyme gene polymorphisms, on 153 EHS, 147 MCS, and 132 control Italians, confirming MCS altered (P < 0.05–0.0001) glutathione-(GSH), GSH-peroxidase/S-transferase, and catalase erythrocyte activities. We first described comparable—though milder—metabolic pro-oxidant/proinflammatory alterations in EHS with distinctively increased plasma coenzyme-Q10 oxidation ratio. Severe depletion of erythrocyte membrane polyunsaturated fatty acids with increased ω6/ω3 ratio was confirmed in MCS, but not in EHS. We also identified significantly (P = 0.003) altered distribution-versus-control of the CYP2C19∗1/∗2 SNP variants in EHS, and a 9.7-fold increased risk (OR: 95% C.I. = 1.3–74.5) of developing EHS for the haplotype (null)GSTT1 + (null)GSTM1 variants. Altogether, results on MCS and EHS strengthen our proposal to adopt this blood metabolic/genetic biomarkers' panel as suitable diagnostic tool for SRI. PMID:24812443
Abad-Grau, Mara M; Medina-Medina, Nuria; Montes-Soldado, Rosana; Matesanz, Fuencisla; Bafna, Vineet
2012-01-01
Multimarker Transmission/Disequilibrium Tests (TDTs) are very robust association tests to population admixture and structure which may be used to identify susceptibility loci in genome-wide association studies. Multimarker TDTs using several markers may increase power by capturing high-degree associations. However, there is also a risk of spurious associations and power reduction due to the increase in degrees of freedom. In this study we show that associations found by tests built on simple null hypotheses are highly reproducible in a second independent data set regardless the number of markers. As a test exhibiting this feature to its maximum, we introduce the multimarker 2-Groups TDT (mTDT(2G)), a test which under the hypothesis of no linkage, asymptotically follows a χ2 distribution with 1 degree of freedom regardless the number of markers. The statistic requires the division of parental haplotypes into two groups: disease susceptibility and disease protective haplotype groups. We assessed the test behavior by performing an extensive simulation study as well as a real-data study using several data sets of two complex diseases. We show that mTDT(2G) test is highly efficient and it achieves the highest power among all the tests used, even when the null hypothesis is tested in a second independent data set. Therefore, mTDT(2G) turns out to be a very promising multimarker TDT to perform genome-wide searches for disease susceptibility loci that may be used as a preprocessing step in the construction of more accurate genetic models to predict individual susceptibility to complex diseases.
Abad-Grau, Mara M.; Medina-Medina, Nuria; Montes-Soldado, Rosana; Matesanz, Fuencisla; Bafna, Vineet
2012-01-01
Multimarker Transmission/Disequilibrium Tests (TDTs) are very robust association tests to population admixture and structure which may be used to identify susceptibility loci in genome-wide association studies. Multimarker TDTs using several markers may increase power by capturing high-degree associations. However, there is also a risk of spurious associations and power reduction due to the increase in degrees of freedom. In this study we show that associations found by tests built on simple null hypotheses are highly reproducible in a second independent data set regardless the number of markers. As a test exhibiting this feature to its maximum, we introduce the multimarker -Groups TDT ( ), a test which under the hypothesis of no linkage, asymptotically follows a distribution with degree of freedom regardless the number of markers. The statistic requires the division of parental haplotypes into two groups: disease susceptibility and disease protective haplotype groups. We assessed the test behavior by performing an extensive simulation study as well as a real-data study using several data sets of two complex diseases. We show that test is highly efficient and it achieves the highest power among all the tests used, even when the null hypothesis is tested in a second independent data set. Therefore, turns out to be a very promising multimarker TDT to perform genome-wide searches for disease susceptibility loci that may be used as a preprocessing step in the construction of more accurate genetic models to predict individual susceptibility to complex diseases. PMID:22363405
Chronic treatment with fibrates elevates superoxide dismutase in adult mouse brain microvessels
Wang, Guangming; Liu, Xiaowei; Guo, Qingmin; Namura, Shobu
2010-01-01
Fibrates are activators of peroxisome proliferator-activated receptor (PPAR) α. Pretreatment with fibrates has been shown to protect brain against ischemia in mice. We hypothesized that fibrates elevate superoxide dismutase (SOD) levels in the brain microvessels (BMV). BMV were isolated from male C57BL/6 and PPARα null mice that had been treated with fenofibrate or gemfibrozil for 7 days. To examine the effect of discontinuation of fenofibrate, another animal group treated with fenofibrate was examined on post-discontinuation day 3 (D-3). To examine whether SOD elevations attenuate oxidative stress in the ischemic brain, separate animals treated with fenofibrate for 7 days were subjected to 60 minutes focal ischemia on post-discontinuation day 0 (D-0) or D-3. Fenofibrate (30 mg/kg) increased mRNA levels of all three isoforms of SOD and activity level in BMV on D-0 but these effects were not detected on D-3. The elevations were not detected in PPARα null mice. SOD levels were also elevated by gemfibrozil (30 mg/kg). Fenofibrate significantly reduced superoxide production and protein oxidation in the ischemic brain at 30 minutes after reperfusion. Fenofibrate reduced infarct size measured at 24 hours after reperfusion on D-0; however, the infarct reduction was not seen when ischemia was induced on D-3. These findings suggest that fibrates elevate SOD in BMV through PPARα, which contributes to the infarct reduction, at least in part. Further studies are needed to establish the link between the SOD elevations and the brain protection by fibrates against ischemia. PMID:20813100
The Role of Pancreatic Preproglucagon in Glucose Homeostasis in Mice.
Chambers, Adam P; Sorrell, Joyce E; Haller, April; Roelofs, Karen; Hutch, Chelsea R; Kim, Ki-Suk; Gutierrez-Aguilar, Ruth; Li, Bailing; Drucker, Daniel J; D'Alessio, David A; Seeley, Randy J; Sandoval, Darleen A
2017-04-04
Glucagon-like peptide 1 (GLP-1) is necessary for normal gluco-regulation, and it has been widely presumed that this function reflects the actions of GLP-1 released from enteroendocrine L cells. To test the relative importance of intestinal versus pancreatic sources of GLP-1 for physiological regulation of glucose, we administered a GLP-1R antagonist, exendin-[9-39] (Ex9), to mice with tissue-specific reactivation of the preproglucagon gene (Gcg). Ex9 impaired glucose tolerance in wild-type mice but had no impact on Gcg-null or GLP-1R KO mice, suggesting that Ex9 is a true and specific GLP-1R antagonist. Unexpectedly, Ex-9 had no effect on blood glucose in mice with restoration of intestinal Gcg. In contrast, pancreatic reactivation of Gcg fully restored the effect of Ex9 to impair both oral and i.p. glucose tolerance. These findings suggest an alternative model whereby islet GLP-1 also plays an important role in regulating glucose homeostasis. Copyright © 2017 Elsevier Inc. All rights reserved.
Re-examination of Oostenbroek et al. (2016): evidence for neonatal imitation of tongue protrusion.
Meltzoff, Andrew N; Murray, Lynne; Simpson, Elizabeth; Heimann, Mikael; Nagy, Emese; Nadel, Jacqueline; Pedersen, Eric J; Brooks, Rechele; Messinger, Daniel S; Pascalis, Leonardo De; Subiaul, Francys; Paukner, Annika; Ferrari, Pier F
2017-09-27
The meaning, mechanism, and function of imitation in early infancy have been actively discussed since Meltzoff and Moore's (1977) report of facial and manual imitation by human neonates. Oostenbroek et al. (2016) claim to challenge the existence of early imitation and to counter all interpretations so far offered. Such claims, if true, would have implications for theories of social-cognitive development. Here we identify 11 flaws in Oostenbroek et al.'s experimental design that biased the results toward null effects. We requested and obtained the authors' raw data. Contrary to the authors' conclusions, new analyses reveal significant tongue-protrusion imitation at all four ages tested (1, 3, 6, and 9 weeks old). We explain how the authors missed this pattern and offer five recommendations for designing future experiments. Infant imitation raises fundamental issues about action representation, social learning, and brain-behavior relations. The debate about the origins and development of imitation reflects its importance to theories of developmental science. © 2017 John Wiley & Sons Ltd.
Carryover negligibility and relevance in bioequivalence studies.
Ocaña, Jordi; Sanchez O, Maria P; Carrasco, Josep L
2015-01-01
The carryover effect is a recurring issue in the pharmaceutical field. It may strongly influence the final outcome of an average bioequivalence study. Testing a null hypothesis of zero carryover is useless: not rejecting it does not guarantee the non-existence of carryover, and rejecting it is not informative of the true degree of carryover and its influence on the validity of the final outcome of the bioequivalence study. We propose a more consistent approach: even if some carryover is present, is it enough to seriously distort the study conclusions or is it negligible? This is the central aim of this paper, which focuses on average bioequivalence studies based on 2 × 2 crossover designs and on the main problem associated with carryover: type I error inflation. We propose an equivalence testing approach to these questions and suggest reasonable negligibility or relevance limits for carryover. Finally, we illustrate this approach on some real datasets. Copyright © 2015 John Wiley & Sons, Ltd.
Oh, Youkeun K.; Kreinbrink, Jennifer L.; Wojtys, Edward M.; Ashton-Miller, James A.
2011-01-01
Anterior cruciate ligament (ACL) injuries most frequently occur under the large loads associated with a unipedal jump landing involving a cutting or pivoting maneuver. We tested the hypotheses that internal tibial torque would increase the anteromedial (AM) bundle ACL relative strain and strain rate more than would the corresponding external tibial torque under the large impulsive loads associated with such landing maneuvers. Twelve cadaveric female knees [mean (SD) age: 65.0 (10.5) years] were tested. Pretensioned quadriceps, hamstring and gastrocnemius muscle-tendon unit forces maintained an initial knee flexion angle of 15°. A compound impulsive test load (compression, flexion moment and internal or external tibial torque) was applied to the distal tibia while recording the 3-D knee loads and tibofemoral kinematics. AM-ACL relative strain was measured using a 3mm DVRT. In this repeated measures experiment, the Wilcoxon Signed-Rank test was used to test the null hypotheses with p<0.05 considered significant. The mean (± SD) peak AM-ACL relative strains were 5.4±3.7 % and 3.1±2.8 % under internal and external tibial torque, respectively. The corresponding mean (± SD) peak AM-ACL strain rates reached 254.4±160.1 %/sec and 179.4±109.9 %/sec, respectively. The hypotheses were supported in that the normalized mean peak AM-ACL relative strain and strain rate were 70% and 42% greater under internal than external tibial torque, respectively (p=0.023, p=0.041). We conclude that internal tibial torque is a potent stressor of the ACL because it induces a considerably (70%) larger peak strain in the AM-ACL than does a corresponding external tibial torque. PMID:22025178
Pronouns in Catalan: Information, Discourse and Strategy
ERIC Educational Resources Information Center
Mayol, Laia
2009-01-01
This thesis investigates the variation between null and overt pronouns in subject position in Catalan, a null subject language. I argue that null and overt subject pronouns are two resources that speakers efficiently deploy to signal their intended interpretation regarding antecedent choice or semantic meaning, and that communicative agents…
Background: Simulation studies have previously demonstrated that time-series analyses using smoothing splines correctly model null health-air pollution associations. Methods: We repeatedly simulated season, meteorology and air quality for the metropolitan area of Atlanta from cyc...
Complex Fuzzy Set-Valued Complex Fuzzy Measures and Their Properties
Ma, Shengquan; Li, Shenggang
2014-01-01
Let F*(K) be the set of all fuzzy complex numbers. In this paper some classical and measure-theoretical notions are extended to the case of complex fuzzy sets. They are fuzzy complex number-valued distance on F*(K), fuzzy complex number-valued measure on F*(K), and some related notions, such as null-additivity, pseudo-null-additivity, null-subtraction, pseudo-null-subtraction, autocontionuous from above, autocontionuous from below, and autocontinuity of the defined fuzzy complex number-valued measures. Properties of fuzzy complex number-valued measures are studied in detail. PMID:25093202
Comparison null imaging ellipsometry using polarization rotator
NASA Astrophysics Data System (ADS)
Park, Sungmo; Kim, Eunsung; Kim, Jiwon; An, Ilsin
2018-05-01
In this study, two-reflection imaging ellipsometry is carried out to compare the changes in polarization states between two samples. By using a polarization rotator, the parallel and perpendicular components of polarization are easily switched between the two samples being compared. This leads to an intensity image consisting of null and off-null points depending on the difference in optical characteristics between the two samples. This technique does not require any movement of optical elements for nulling and can be used to detect defects or surface contamination for quality control of samples.
NASA Astrophysics Data System (ADS)
LaBombard, B.; Kuang, A. Q.; Brunner, D.; Faust, I.; Mumgaard, R.; Reinke, M. L.; Terry, J. L.; Howard, N.; Hughes, J. W.; Chilenski, M.; Lin, Y.; Marmar, E.; Rice, J. E.; Rodriguez-Fernandez, P.; Wallace, G.; Whyte, D. G.; Wolfe, S.; Wukitch, S.
2017-07-01
The impurity screening response of the high-field side (HFS) scrape-off layer (SOL) to localized nitrogen injection is investigated on Alcator C-Mod for magnetic equilibria spanning lower-single-null, double-null and upper-single-null configurations under otherwise identical plasma conditions. L-mode, EDA H-mode and I-mode discharges are investigated. HFS impurity screening is found to depend on magnetic flux balance and the direction of B × \
Increased scientific rigor will improve reliability of research and effectiveness of management
Sells, Sarah N.; Bassing, Sarah B.; Barker, Kristin J.; Forshee, Shannon C.; Keever, Allison; Goerz, James W.; Mitchell, Michael S.
2018-01-01
Rigorous science that produces reliable knowledge is critical to wildlife management because it increases accurate understanding of the natural world and informs management decisions effectively. Application of a rigorous scientific method based on hypothesis testing minimizes unreliable knowledge produced by research. To evaluate the prevalence of scientific rigor in wildlife research, we examined 24 issues of the Journal of Wildlife Management from August 2013 through July 2016. We found 43.9% of studies did not state or imply a priori hypotheses, which are necessary to produce reliable knowledge. We posit that this is due, at least in part, to a lack of common understanding of what rigorous science entails, how it produces more reliable knowledge than other forms of interpreting observations, and how research should be designed to maximize inferential strength and usefulness of application. Current primary literature does not provide succinct explanations of the logic behind a rigorous scientific method or readily applicable guidance for employing it, particularly in wildlife biology; we therefore synthesized an overview of the history, philosophy, and logic that define scientific rigor for biological studies. A rigorous scientific method includes 1) generating a research question from theory and prior observations, 2) developing hypotheses (i.e., plausible biological answers to the question), 3) formulating predictions (i.e., facts that must be true if the hypothesis is true), 4) designing and implementing research to collect data potentially consistent with predictions, 5) evaluating whether predictions are consistent with collected data, and 6) drawing inferences based on the evaluation. Explicitly testing a priori hypotheses reduces overall uncertainty by reducing the number of plausible biological explanations to only those that are logically well supported. Such research also draws inferences that are robust to idiosyncratic observations and unavoidable human biases. Offering only post hoc interpretations of statistical patterns (i.e., a posteriorihypotheses) adds to uncertainty because it increases the number of plausible biological explanations without determining which have the greatest support. Further, post hocinterpretations are strongly subject to human biases. Testing hypotheses maximizes the credibility of research findings, makes the strongest contributions to theory and management, and improves reproducibility of research. Management decisions based on rigorous research are most likely to result in effective conservation of wildlife resources.
ON THE NATURE OF RECONNECTION AT A SOLAR CORONAL NULL POINT ABOVE A SEPARATRIX DOME
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pontin, D. I.; Priest, E. R.; Galsgaard, K., E-mail: dpontin@maths.dundee.ac.uk
2013-09-10
Three-dimensional magnetic null points are ubiquitous in the solar corona and in any generic mixed-polarity magnetic field. We consider magnetic reconnection at an isolated coronal null point whose fan field lines form a dome structure. Using analytical and computational models, we demonstrate several features of spine-fan reconnection at such a null, including the fact that substantial magnetic flux transfer from one region of field line connectivity to another can occur. The flux transfer occurs across the current sheet that forms around the null point during spine-fan reconnection, and there is no separator present. Also, flipping of magnetic field lines takesmore » place in a manner similar to that observed in the quasi-separatrix layer or slip-running reconnection.« less
Are eikonal quasinormal modes linked to the unstable circular null geodesics?
NASA Astrophysics Data System (ADS)
Konoplya, R. A.; Stuchlík, Z.
2017-08-01
In Cardoso et al. [6] it was claimed that quasinormal modes which any stationary, spherically symmetric and asymptotically flat black hole emits in the eikonal regime are determined by the parameters of the circular null geodesic: the real and imaginary parts of the quasinormal mode are multiples of the frequency and instability timescale of the circular null geodesics respectively. We shall consider asymptotically flat black hole in the Einstein-Lovelock theory, find analytical expressions for gravitational quasinormal modes in the eikonal regime and analyze the null geodesics. Comparison of the both phenomena shows that the expected link between the null geodesics and quasinormal modes is violated in the Einstein-Lovelock theory. Nevertheless, the correspondence exists for a number of other cases and here we formulate its actual limits.
The data-driven null models for information dissemination tree in social networks
NASA Astrophysics Data System (ADS)
Zhang, Zhiwei; Wang, Zhenyu
2017-10-01
For the purpose of detecting relatedness and co-occurrence between users, as well as the distribution features of nodes in spreading path of a social network, this paper explores topological characteristics of information dissemination trees (IDT) that can be employed indirectly to probe the information dissemination laws within social networks. Hence, three different null models of IDT are presented in this article, including the statistical-constrained 0-order IDT null model, the random-rewire-broken-edge 0-order IDT null model and the random-rewire-broken-edge 2-order IDT null model. These null models firstly generate the corresponding randomized copy of an actual IDT; then the extended significance profile, which is developed by adding the cascade ratio of information dissemination path, is exploited not only to evaluate degree correlation of two nodes associated with an edge, but also to assess the cascade ratio of different length of information dissemination paths. The experimental correspondences of the empirical analysis for several SinaWeibo IDTs and Twitter IDTs indicate that the IDT null models presented in this paper perform well in terms of degree correlation of nodes and dissemination path cascade ratio, which can be better to reveal the features of information dissemination and to fit the situation of real social networks.
Cobb, Laura K; Appel, Lawrence J; Franco, Manuel; Jones-Smith, Jessica C; Nur, Alana; Anderson, Cheryl A M
2015-07-01
To examine the relationship between local food environments and obesity and assess the quality of studies reviewed. Systematic keyword searches identified studies from US and Canada that assessed the relationship of obesity to local food environments. We applied a quality metric based on design, exposure and outcome measurement, and analysis. We identified 71 studies representing 65 cohorts. Overall, study quality was low; 60 studies were cross-sectional. Associations between food outlet availability and obesity were predominantly null. Among non-null associations, we saw a trend toward inverse associations between supermarket availability and obesity (22 negative, 4 positive, 67 null) and direct associations between fast food and obesity (29 positive, 6 negative, 71 null) in adults. We saw direct associations between fast food availability and obesity in lower income children (12 positive, 7 null). Indices including multiple food outlets were most consistently associated with obesity in adults (18 expected, 1 not expected, 17 null). Limiting to higher quality studies did not affect results. Despite the large number of studies, we found limited evidence for associations between local food environments and obesity. The predominantly null associations should be interpreted cautiously due to the low quality of available studies. © 2015 The Obesity Society.
Prum, Richard O
2010-11-01
The Fisher-inspired, arbitrary intersexual selection models of Lande (1981) and Kirkpatrick (1982), including both stable and unstable equilibrium conditions, provide the appropriate null model for the evolution of traits and preferences by intersexual selection. Like the Hardy–Weinberg equilibrium, the Lande–Kirkpatrick (LK) mechanism arises as an intrinsic consequence of genetic variation in trait and preference in the absence of other evolutionary forces. The LK mechanism is equivalent to other intersexual selection mechanisms in the absence of additional selection on preference and with additional trait-viability and preference-viability correlations equal to zero. The LK null model predicts the evolution of arbitrary display traits that are neither honest nor dishonest, indicate nothing other than mating availability, and lack any meaning or design other than their potential to correspond to mating preferences. The current standard for demonstrating an arbitrary trait is impossible to meet because it requires proof of the null hypothesis. The LK null model makes distinct predictions about the evolvability of traits and preferences. Examples of recent intersexual selection research document the confirmationist pitfalls of lacking a null model. Incorporation of the LK null into intersexual selection will contribute to serious examination of the extent to which natural selection on preferences shapes signals.
Viewing condition dependence of the gaze-evoked nystagmus in Arnold Chiari type 1 malformation.
Ghasia, Fatema F; Gulati, Deepak; Westbrook, Edward L; Shaikh, Aasef G
2014-04-15
Saccadic eye movements rapidly shift gaze to the target of interest. Once the eyes reach a given target, the brainstem ocular motor integrator utilizes feedback from various sources to assure steady gaze. One of such sources is cerebellum whose lesion can impair neural integration leading to gaze-evoked nystagmus. The gaze evoked nystagmus is characterized by drifts moving the eyes away from the target and a null position where the drifts are absent. The extent of impairment in the neural integration for two opposite eccentricities might determine the location of the null position. Eye in the orbit position might also determine the location of the null. We report this phenomenon in a patient with Arnold Chiari type 1 malformation who had intermittent esotropia and horizontal gaze-evoked nystagmus with a shift in the null position. During binocular viewing, the null was shifted to the right. During monocular viewing, when the eye under cover drifted nasally (secondary to the esotropia), the null of the gaze-evoked nystagmus reorganized toward the center. We speculate that the output of the neural integrator is altered from the bilateral conflicting eye in the orbit position secondary to the strabismus. This could possibly explain the reorganization of the location of the null position. Copyright © 2014 Elsevier B.V. All rights reserved.
Sachs' free data in real connection variables
NASA Astrophysics Data System (ADS)
De Paoli, Elena; Speziale, Simone
2017-11-01
We discuss the Hamiltonian dynamics of general relativity with real connection variables on a null foliation, and use the Newman-Penrose formalism to shed light on the geometric meaning of the various constraints. We identify the equivalent of Sachs' constraint-free initial data as projections of connection components related to null rotations, i.e. the translational part of the ISO(2) group stabilising the internal null direction soldered to the hypersurface. A pair of second-class constraints reduces these connection components to the shear of a null geodesic congruence, thus establishing equivalence with the second-order formalism, which we show in details at the level of symplectic potentials. A special feature of the first-order formulation is that Sachs' propagating equations for the shear, away from the initial hypersurface, are turned into tertiary constraints; their role is to preserve the relation between connection and shear under retarded time evolution. The conversion of wave-like propagating equations into constraints is possible thanks to an algebraic Bianchi identity; the same one that allows one to describe the radiative data at future null infinity in terms of a shear of a (non-geodesic) asymptotic null vector field in the physical spacetime. Finally, we compute the modification to the spin coefficients and the null congruence in the presence of torsion.
Cobb, Laura K; Appel, Lawrence J; Franco, Manuel; Jones-Smith, Jessica C; Nur, Alana; Anderson, Cheryl AM
2015-01-01
Objective To examine the relationship between local food environments and obesity and assess the quality of studies reviewed. Methods Systematic keyword searches identified studies from US and Canada that assessed the relationship of obesity to local food environments. We applied a quality metric based on design, exposure and outcome measurement, and analysis. Results We identified 71 studies representing 65 cohorts. Overall, study quality was low; 60 studies were cross-sectional. Associations between food outlet availability and obesity were predominantly null. Among non-null associations, we saw a trend toward inverse associations between supermarket availability and obesity (22 negative, 4 positive, 67 null) and direct associations between fast food and obesity (29 positive, 6 negative, 71 null) in adults. We saw direct associations between fast food availability and obesity in lower income children (12 positive, 7 null). Indices including multiple food outlets were most consistently associated with obesity in adults (18 expected, 1 not expected, 17 null). Limiting to higher quality studies did not affect results. Conclusions Despite the large number of studies, we found limited evidence for associations between local food environments and obesity. The predominantly null associations should be interpreted cautiously due to the low quality of available studies. PMID:26096983
Gregorich, Steven E
2006-11-01
Comparative public health research makes wide use of self-report instruments. For example, research identifying and explaining health disparities across demographic strata may seek to understand the health effects of patient attitudes or private behaviors. Such personal attributes are difficult or impossible to observe directly and are often best measured by self-reports. Defensible use of self-reports in quantitative comparative research requires not only that the measured constructs have the same meaning across groups, but also that group comparisons of sample estimates (eg, means and variances) reflect true group differences and are not contaminated by group-specific attributes that are unrelated to the construct of interest. Evidence for these desirable properties of measurement instruments can be established within the confirmatory factor analysis (CFA) framework; a nested hierarchy of hypotheses is tested that addresses the cross-group invariance of the instrument's psychometric properties. By name, these hypotheses include configural, metric (or pattern), strong (or scalar), and strict factorial invariance. The CFA model and each of these hypotheses are described in nontechnical language. A worked example and technical appendices are included.
Montazerhodjat, Vahid; Chaudhuri, Shomesh E; Sargent, Daniel J; Lo, Andrew W
2017-09-14
Randomized clinical trials (RCTs) currently apply the same statistical threshold of alpha = 2.5% for controlling for false-positive results or type 1 error, regardless of the burden of disease or patient preferences. Is there an objective and systematic framework for designing RCTs that incorporates these considerations on a case-by-case basis? To apply Bayesian decision analysis (BDA) to cancer therapeutics to choose an alpha and sample size that minimize the potential harm to current and future patients under both null and alternative hypotheses. We used the National Cancer Institute (NCI) Surveillance, Epidemiology, and End Results (SEER) database and data from the 10 clinical trials of the Alliance for Clinical Trials in Oncology. The NCI SEER database was used because it is the most comprehensive cancer database in the United States. The Alliance trial data was used owing to the quality and breadth of data, and because of the expertise in these trials of one of us (D.J.S.). The NCI SEER and Alliance data have already been thoroughly vetted. Computations were replicated independently by 2 coauthors and reviewed by all coauthors. Our prior hypothesis was that an alpha of 2.5% would not minimize the overall expected harm to current and future patients for the most deadly cancers, and that a less conservative alpha may be necessary. Our primary study outcomes involve measuring the potential harm to patients under both null and alternative hypotheses using NCI and Alliance data, and then computing BDA-optimal type 1 error rates and sample sizes for oncology RCTs. We computed BDA-optimal parameters for the 23 most common cancer sites using NCI data, and for the 10 Alliance clinical trials. For RCTs involving therapies for cancers with short survival times, no existing treatments, and low prevalence, the BDA-optimal type 1 error rates were much higher than the traditional 2.5%. For cancers with longer survival times, existing treatments, and high prevalence, the corresponding BDA-optimal error rates were much lower, in some cases even lower than 2.5%. Bayesian decision analysis is a systematic, objective, transparent, and repeatable process for deciding the outcomes of RCTs that explicitly incorporates burden of disease and patient preferences.
Gochfeld, Michael; Jeitner, Christian; Burke, Sean; Volz, Conrad D.; Snigaroff, Ronald; Snigaroff, Daniel; Shukla, Tara; Shukla, Sheila
2014-01-01
Levels of mercury and other contaminants should be lower in birds nesting on isolated oceanic islands and at high latitudes without any local or regional sources of contamination, compared to more urban and industrialized temperate regions. We examined concentrations of arsenic, cadmium, chromium, lead, manganese, mercury and selenium in the eggs, and the feathers of fledgling and adult glaucous-winged gulls (Larus glaucescens) nesting in breeding colonies on Adak, Amchitka, and Kiska Islands in the Aleutian Chain of Alaska in the Bering Sea/North Pacific. We tested the following null hypotheses: 1) There were no differences in metal levels among eggs and feathers of adult and fledgling glaucous-winged gulls, 2) There were no differences in metal levels among gulls nesting near the three underground nuclear test sites (Long Shot 1965, Milrow 1969, Cannikin 1971) on Amchitka, 3) There were no differences in metal levels among the three islands, and 4) There were no gender-related differences in metal levels. All four null hypotheses were rejected at the 0.05 level, although there were few differences among the three test sites on Amchitka. Eggs had the lowest levels of cadmium, lead, and mercury, and the feathers of adults had the lowest levels of selenium. Comparing only adults and fledglings, adults had higher levels of cadmium, chromium, lead and mercury, and fledglings had higher levels of arsenic, manganese and selenium. There were few consistent interisland differences, although levels were generally lower for eggs and feathers from gulls on Amchitka compared to the other islands. Arsenic was higher in both adult feathers and eggs from Amchitka compared to Adak, and chromium and lead were higher in adult feathers and eggs from Adak compared to Amchitka. Mercury and arsenic, and chromium and manganese levels were significantly correlated in the feathers of both adult and fledgling gulls. The feathers of males had significantly higher levels of chromium and manganese than did females. The levels of most metals in feathers are below those known to be associated with adverse effects in the gulls or their predators. However, levels of mercury in some gull eggs are within a range suggesting that several eggs should not be eaten in one day by sensitive humans. PMID:18626778
Montazerhodjat, Vahid; Chaudhuri, Shomesh E.; Sargent, Daniel J.
2017-01-01
Importance Randomized clinical trials (RCTs) currently apply the same statistical threshold of alpha = 2.5% for controlling for false-positive results or type 1 error, regardless of the burden of disease or patient preferences. Is there an objective and systematic framework for designing RCTs that incorporates these considerations on a case-by-case basis? Objective To apply Bayesian decision analysis (BDA) to cancer therapeutics to choose an alpha and sample size that minimize the potential harm to current and future patients under both null and alternative hypotheses. Data Sources We used the National Cancer Institute (NCI) Surveillance, Epidemiology, and End Results (SEER) database and data from the 10 clinical trials of the Alliance for Clinical Trials in Oncology. Study Selection The NCI SEER database was used because it is the most comprehensive cancer database in the United States. The Alliance trial data was used owing to the quality and breadth of data, and because of the expertise in these trials of one of us (D.J.S.). Data Extraction and Synthesis The NCI SEER and Alliance data have already been thoroughly vetted. Computations were replicated independently by 2 coauthors and reviewed by all coauthors. Main Outcomes and Measures Our prior hypothesis was that an alpha of 2.5% would not minimize the overall expected harm to current and future patients for the most deadly cancers, and that a less conservative alpha may be necessary. Our primary study outcomes involve measuring the potential harm to patients under both null and alternative hypotheses using NCI and Alliance data, and then computing BDA-optimal type 1 error rates and sample sizes for oncology RCTs. Results We computed BDA-optimal parameters for the 23 most common cancer sites using NCI data, and for the 10 Alliance clinical trials. For RCTs involving therapies for cancers with short survival times, no existing treatments, and low prevalence, the BDA-optimal type 1 error rates were much higher than the traditional 2.5%. For cancers with longer survival times, existing treatments, and high prevalence, the corresponding BDA-optimal error rates were much lower, in some cases even lower than 2.5%. Conclusions and Relevance Bayesian decision analysis is a systematic, objective, transparent, and repeatable process for deciding the outcomes of RCTs that explicitly incorporates burden of disease and patient preferences. PMID:28418507
1993-03-01
in the study. 2 The highest ratio was 21.3:1 for Pilots (1310). Among the hypothesized reasons for low utilization rates are that the officers...34DoD Compliance" by gender reveal that women officers are utilized in an appropriate subspecialty at a rate that is close to 100 percent. This is true...NFOs (1320) with 94.4 percent. Analysis by gender revealed that women accounted for one of every seven officers in the 1985 cohort. Female officers were
NASA Astrophysics Data System (ADS)
Filatov, Alexei Vladimirovich
2002-09-01
Using electromagnetic forces to suspend rotating objects (rotors) without mechanical contact is often an appealing technical solution. Magnetic suspensions are typically required to have adequate load capacity and stiffness, and low rotational loss. Other desired features include low price, high reliability and manufacturability. With recent advances in permanent-magnet materials, the required forces can often be obtained by simply using the interaction between permanent magnets. While a magnetic bearing based entirely on permanent magnets could be expected to be inexpensive, reliable and easy to manufacture, a fundamental physical principle known as Earnshaw's theorem maintains that this type of suspension cannot be statically stable. Therefore, some other physical mechanisms must be included. One such mechanism employs the interaction between a conductor and a nonuniform magnetic field in relative motion. Its advantages include simplicity, reliability, wide range of operating temperature and system autonomy (no external wiring and power supplies are required). The disadvantages of the earlier embodiments were high rotational loss, low stiffness and load capacity. This dissertation proposes a novel type of magnetic bearing stabilized by the field-conductor interaction. One of the advantages of this bearing is that no electric field, E, develops in the conductor during the rotor rotation when the system is in no-load equilibrium. Because of this we refer to it as the Null-E Bearing. Null-E Bearings have potential for lower rotational loss and higher load capacity and stiffness than other bearings utilizing the field-conductor interaction. Their performance is highly insensitive to manufacturing inaccuracies. The Null-E Bearing in its basic form can be augmented with supplementary electronics to improve its performance. Depending on the degree of the electronics involvement, a variety of magnetic bearings can be developed ranging from a completely passive to an active magnetic bearing of a novel type. This dissertation contains theoretical analysis of the Null-E Bearing operation, including derivation of the stability conditions and estimation of some of the rotational losses. The validity of the theoretical conclusions has been demonstrated by building and testing a prototype in which non-contact suspension of a 3.2-kg rotor is achieved at spin speeds above 18 Hz.
Dai, Mei; Liou, Benjamin; Swope, Brittany; Wang, Xiaohong; Zhang, Wujuan; Inskeep, Venette; Grabowski, Gregory A; Sun, Ying; Pan, Dao
2016-01-01
To study the neuronal deficits in neuronopathic Gaucher Disease (nGD), the chronological behavioral profiles and the age of onset of brain abnormalities were characterized in a chronic nGD mouse model (9V/null). Progressive accumulation of glucosylceramide (GC) and glucosylsphingosine (GS) in the brain of 9V/null mice were observed at as early as 6 and 3 months of age for GC and GS, respectively. Abnormal accumulation of α-synuclein was present in the 9V/null brain as detected by immunofluorescence and Western blot analysis. In a repeated open-field test, the 9V/null mice (9 months and older) displayed significantly less environmental habituation and spent more time exploring the open-field than age-matched WT group, indicating the onset of short-term spatial memory deficits. In the marble burying test, the 9V/null group had a shorter latency to initiate burying activity at 3 months of age, whereas the latency increased significantly at ≥12 months of age; 9V/null females buried significantly more marbles to completion than the WT group, suggesting an abnormal response to the instinctive behavior and an abnormal activity in non-associative anxiety-like behavior. In the conditional fear test, only the 9V/null males exhibited a significant decrease in response to contextual fear, but both genders showed less response to auditory-cued fear compared to age- and gender-matched WT at 12 months of age. These results indicate hippocampus-related emotional memory defects. Abnormal gait emerged in 9V/null mice with wider front-paw and hind-paw widths, as well as longer stride in a gender-dependent manner with different ages of onset. Significantly higher liver- and spleen-to-body weight ratios were detected in 9V/null mice with different ages of onsets. These data provide temporal evaluation of neurobehavioral dysfunctions and brain pathology in 9V/null mice that can be used for experimental designs to evaluate novel therapies for nGD.
Dai, Mei; Liou, Benjamin; Swope, Brittany; Wang, Xiaohong; Zhang, Wujuan; Inskeep, Venette; Grabowski, Gregory A.; Sun, Ying; Pan, Dao
2016-01-01
To study the neuronal deficits in neuronopathic Gaucher Disease (nGD), the chronological behavioral profiles and the age of onset of brain abnormalities were characterized in a chronic nGD mouse model (9V/null). Progressive accumulation of glucosylceramide (GC) and glucosylsphingosine (GS) in the brain of 9V/null mice were observed at as early as 6 and 3 months of age for GC and GS, respectively. Abnormal accumulation of α-synuclein was present in the 9V/null brain as detected by immunofluorescence and Western blot analysis. In a repeated open-field test, the 9V/null mice (9 months and older) displayed significantly less environmental habituation and spent more time exploring the open-field than age-matched WT group, indicating the onset of short-term spatial memory deficits. In the marble burying test, the 9V/null group had a shorter latency to initiate burying activity at 3 months of age, whereas the latency increased significantly at ≥12 months of age; 9V/null females buried significantly more marbles to completion than the WT group, suggesting an abnormal response to the instinctive behavior and an abnormal activity in non-associative anxiety-like behavior. In the conditional fear test, only the 9V/null males exhibited a significant decrease in response to contextual fear, but both genders showed less response to auditory-cued fear compared to age- and gender-matched WT at 12 months of age. These results indicate hippocampus-related emotional memory defects. Abnormal gait emerged in 9V/null mice with wider front-paw and hind-paw widths, as well as longer stride in a gender-dependent manner with different ages of onset. Significantly higher liver- and spleen-to-body weight ratios were detected in 9V/null mice with different ages of onsets. These data provide temporal evaluation of neurobehavioral dysfunctions and brain pathology in 9V/null mice that can be used for experimental designs to evaluate novel therapies for nGD. PMID:27598339
ElAlfy, Mohsen Saleh; Adly, Amira Abdel Moneam; Ebeid, Fatma Soliman ElSayed; Eissa, Deena Samir; Ismail, Eman Abdel Rahman; Mohammed, Yasser Hassan; Ahmed, Manar Elsayed; Saad, Aya Sayed
2018-06-20
Sickle cell disease (SCD) is associated with alterations in immune phenotypes. CD4 + CD28 null T lymphocytes have pro-inflammatory functions and are linked to vascular diseases. To assess the percentage of CD4 + CD28 null T lymphocytes, natural killer cells (NK), and IFN-gamma levels, we compared 40 children and adolescents with SCD with 40 healthy controls and evaluated their relation to disease severity and response to therapy. Patients with SCD steady state were studied, focusing on history of frequent vaso-occlusive crisis, hydroxyurea therapy, and IFN-gamma levels. Analysis of CD4 + CD28 null T lymphocytes and NK cells was done by flow cytometry. Liver and cardiac iron overload were assessed. CD4 + CD28 null T lymphocytes, NK cells, and IFN-gamma levels were significantly higher in patients than controls. Patients with history of frequent vaso-occlusive crisis and those with vascular complications had higher percentage of CD4 + CD28 null T lymphocytes and IFN-gamma while levels were significantly lower among hydroxyurea-treated patients. CD4 + CD28 null T lymphocytes were positively correlated to transfusional iron input while these cells and IFN-gamma were negatively correlated to cardiac T2* and duration of hydroxyurea therapy. NK cells were correlated to HbS and indirect bilirubin. Increased expression of CD4 + CD28 null T lymphocytes highlights their role in immune dysfunction and pathophysiology of SCD complications.
Influence of Choice of Null Network on Small-World Parameters of Structural Correlation Networks
Hosseini, S. M. Hadi; Kesler, Shelli R.
2013-01-01
In recent years, coordinated variations in brain morphology (e.g., volume, thickness) have been employed as a measure of structural association between brain regions to infer large-scale structural correlation networks. Recent evidence suggests that brain networks constructed in this manner are inherently more clustered than random networks of the same size and degree. Thus, null networks constructed by randomizing topology are not a good choice for benchmarking small-world parameters of these networks. In the present report, we investigated the influence of choice of null networks on small-world parameters of gray matter correlation networks in healthy individuals and survivors of acute lymphoblastic leukemia. Three types of null networks were studied: 1) networks constructed by topology randomization (TOP), 2) networks matched to the distributional properties of the observed covariance matrix (HQS), and 3) networks generated from correlation of randomized input data (COR). The results revealed that the choice of null network not only influences the estimated small-world parameters, it also influences the results of between-group differences in small-world parameters. In addition, at higher network densities, the choice of null network influences the direction of group differences in network measures. Our data suggest that the choice of null network is quite crucial for interpretation of group differences in small-world parameters of structural correlation networks. We argue that none of the available null models is perfect for estimation of small-world parameters for correlation networks and the relative strengths and weaknesses of the selected model should be carefully considered with respect to obtained network measures. PMID:23840672
Visual and Plastic Arts in Teaching Literacy: Null Curricula?
ERIC Educational Resources Information Center
Wakeland, Robin Gay
2010-01-01
Visual and plastic arts in contemporary literacy instruction equal null curricula. Studies show that painting and sculpture facilitate teaching reading and writing (literacy), yet such pedagogy has not been formally adopted into USA curriculum. An example of null curriculum can be found in late 19th - early 20th century education the USA…
A Critique of One-Tailed Hypothesis Test Procedures in Business and Economics Statistics Textbooks.
ERIC Educational Resources Information Center
Liu, Tung; Stone, Courtenay C.
1999-01-01
Surveys introductory business and economics statistics textbooks and finds that they differ over the best way to explain one-tailed hypothesis tests: the simple null-hypothesis approach or the composite null-hypothesis approach. Argues that the composite null-hypothesis approach contains methodological shortcomings that make it more difficult for…
Targeted mutant models are common in mechanistic toxicology experiments investigating the absorption, metabolism, distribution, or elimination (ADME) of chemicals from individuals. Key models include those for xenosensing transcription factors and cytochrome P450s (CYP). Here we ...
A Null Space Control of Two Wheels Driven Mobile Manipulator Using Passivity Theory
NASA Astrophysics Data System (ADS)
Shibata, Tsuyoshi; Murakami, Toshiyuki
This paper describes a control strategy of null space motion of a two wheels driven mobile manipulator. Recently, robot is utilized in various industrial fields and it is preferable for the robot manipulator to have multiple degrees of freedom motion. Several studies of kinematics for null space motion have been proposed. However stability analysis of null space motion is not enough. Furthermore, these approaches apply to stable systems, but they do not apply unstable systems. Then, in this research, base of manipulator equips with two wheels driven mobile robot. This robot is called two wheels driven mobile manipulator, which becomes unstable system. In the proposed approach, a control design of null space uses passivity based stabilizing. A proposed controller is decided so that closed-loop system of robot dynamics satisfies passivity. This is passivity based control. Then, control strategy is that stabilizing of the robot system applies to work space observer based approach and null space control while keeping end-effector position. The validity of the proposed approach is verified by simulations and experiments of two wheels driven mobile manipulator.
Yoo, Min Heui; Kim, Tae-Youn; Yoon, Young Hee; Koh, Jae-Young
2016-01-01
To investigate the role of synaptic zinc in the ASD pathogenesis, we examined zinc transporter 3 (ZnT3) null mice. At 4–5 weeks of age, male but not female ZnT3 null mice exhibited autistic-like behaviors. Cortical volume and neurite density were significantly greater in male ZnT3 null mice than in WT mice. In male ZnT3 null mice, consistent with enhanced neurotrophic stimuli, the level of BDNF as well as activity of MMP-9 was increased. Consistent with known roles for MMPs in BDNF upregulation, 2.5-week treatment with minocycline, an MMP inhibitor, significantly attenuated BDNF levels as well as megalencephaly and autistic-like behaviors. Although the ZnT3 null state removed synaptic zinc, it rather increased free zinc in the cytosol of brain cells, which appeared to increase MMP-9 activity and BDNF levels. The present results suggest that zinc dyshomeostasis during the critical period of brain development may be a possible contributing mechanism for ASD. PMID:27352957
Yoo, Min Heui; Kim, Tae-Youn; Yoon, Young Hee; Koh, Jae-Young
2016-06-29
To investigate the role of synaptic zinc in the ASD pathogenesis, we examined zinc transporter 3 (ZnT3) null mice. At 4-5 weeks of age, male but not female ZnT3 null mice exhibited autistic-like behaviors. Cortical volume and neurite density were significantly greater in male ZnT3 null mice than in WT mice. In male ZnT3 null mice, consistent with enhanced neurotrophic stimuli, the level of BDNF as well as activity of MMP-9 was increased. Consistent with known roles for MMPs in BDNF upregulation, 2.5-week treatment with minocycline, an MMP inhibitor, significantly attenuated BDNF levels as well as megalencephaly and autistic-like behaviors. Although the ZnT3 null state removed synaptic zinc, it rather increased free zinc in the cytosol of brain cells, which appeared to increase MMP-9 activity and BDNF levels. The present results suggest that zinc dyshomeostasis during the critical period of brain development may be a possible contributing mechanism for ASD.
Making inference from wildlife collision data: inferring predator absence from prey strikes
Hosack, Geoffrey R.; Barry, Simon C.
2017-01-01
Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response functions, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application. PMID:28243534
NASA Astrophysics Data System (ADS)
Armal, S.; Devineni, N.; Khanbilvardi, R.
2017-12-01
This study presents a systematic analysis for identifying and attributing trends in the annual frequency of extreme rainfall events across the contiguous United States to climate change and climate variability modes. A Bayesian multilevel model is developed for 1,244 stations simultaneously to test the null hypothesis of no trend and verify two alternate hypotheses: Trend can be attributed to changes in global surface temperature anomalies, or to a combination of cyclical climate modes with varying quasi-periodicities and global surface temperature anomalies. The Bayesian multilevel model provides the opportunity to pool information across stations and reduce the parameter estimation uncertainty, hence identifying the trends better. The choice of the best alternate hypotheses is made based on Watanabe-Akaike Information Criterion, a Bayesian pointwise predictive accuracy measure. Statistically significant time trends are observed in 742 of the 1,244 stations. Trends in 409 of these stations can be attributed to changes in global surface temperature anomalies. These stations are predominantly found in the Southeast and Northeast climate regions. The trends in 274 of these stations can be attributed to the El Nino Southern Oscillations, North Atlantic Oscillation, Pacific Decadal Oscillation and Atlantic Multi-Decadal Oscillation along with changes in global surface temperature anomalies. These stations are mainly found in the Northwest, West and Southwest climate regions.
Divertor with a third-order null of the poloidal field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryutov, D. D.; Umansky, M. V.
2013-09-15
A concept and preliminary feasibility analysis of a divertor with the third-order poloidal field null is presented. The third-order null is the point where not only the field itself but also its first and second spatial derivatives are zero. In this case, the separatrix near the null-point has eight branches, and the number of strike-points increases from 2 (as in the standard divertor) to six. It is shown that this magnetic configuration can be created by a proper adjustment of the currents in a set of three divertor coils. If the currents are somewhat different from the required values, themore » configuration becomes that of three closely spaced first-order nulls. Analytic approach, suitable for a quick orientation in the problem, is used. Potential advantages and disadvantages of this configuration are briefly discussed.« less
Castilla-Ortega, Estela; Pavón, Francisco Javier; Sánchez-Marín, Laura; Estivill-Torrús, Guillermo; Pedraza, Carmen; Blanco, Eduardo; Suárez, Juan; Santín, Luis; Rodríguez de Fonseca, Fernando; Serrano, Antonia
2016-04-01
Lysophosphatidic acid species (LPA) are lipid bioactive signaling molecules that have been recently implicated in the modulation of emotional and motivational behaviors. The present study investigates the consequences of either genetic deletion or pharmacological blockade of lysophosphatidic acid receptor-1 (LPA1) in alcohol consumption. The experiments were performed in alcohol-drinking animals by using LPA1-null mice and administering the LPA1 receptor antagonist Ki16425 in both mice and rats. In the two-bottle free choice paradigm, the LPA1-null mice preferred the alcohol more than their wild-type counterparts. Whereas the male LPA1-null mice displayed this higher preference at all doses tested, the female LPA1-null mice only consumed more alcohol at 6% concentration. The male LPA1-null mice were then further characterized, showing a notably increased ethanol drinking after a deprivation period and a reduced sleep time after acute ethanol administration. In addition, LPA1-null mice were more anxious than the wild-type mice in the elevated plus maze test. For the pharmacological experiments, the acute administration of the antagonist Ki16425 consistently increased ethanol consumption in both wild-type mice and rats; while it did not modulate alcohol drinking in the LPA1-null mice and lacked intrinsic rewarding properties and locomotor effects in a conditioned place preference paradigm. In addition, LPA1-null mice exhibited a marked reduction on the expression of glutamate-transmission-related genes in the prefrontal cortex similar to those described in alcohol-exposed rodents. Results suggest a relevant role for the LPA/LPA1 signaling system in alcoholism. In addition, the LPA1-null mice emerge as a new model for genetic vulnerability to excessive alcohol drinking. The pharmacological manipulation of LPA1 receptor arises as a new target for the study and treatment of alcoholism. Copyright © 2015 Elsevier Ltd. All rights reserved.
On Nulling, Drifting, and Their Interactions in PSRs J1741-0840 and J1840-0840
NASA Astrophysics Data System (ADS)
Gajjar, V.; Yuan, J. P.; Yuen, R.; Wen, Z. G.; Liu, Z. Y.; Wang, N.
2017-12-01
We report detailed investigation of nulling and drifting behavior of two pulsars PSRs J1741-0840 and J1840-0840 observed from the Giant Meterwave Radio Telescope at 625 MHz. PSR J1741-0840 was found to show a nulling fraction (NF) of around 30% ± 5% while PSR J1840-0840 was shown to have an NF of around 50% ± 6%. We measured drifting behavior from different profile components in PSR J1840-0840 for the first time with the leading component showing drifting with 13.5 ± 0.7 periods while the weak trailing component showed drifting of around 18 ± 1 periods. Large nulling hampers accuracy of these quantities derived using standard Fourier techniques. A more accurate comparison was drawn from driftband slopes, measured after sub-pulse modeling. These measurements revealed interesting sporadic and irregular drifting behavior in both pulsars. We conclude that the previously reported different drifting periodicities in the trailing component of PSR J1741-0840 is likely due to the spread in these driftband slopes. We also find that both components of PSR J1840-0840 show similar driftband slopes within the uncertainties. Unique nulling-drifting interaction is identified in PSR J1840-0840 where, on most occasions, the pulsar tends to start nulling after what appears to be the end of a driftband. Similarly, when the pulsar switches back to an emission phase, on most occasions it starts at the beginning of a new driftband in both components. Such behaviors have not been detected in any other pulsars to our knowledge. We also found that PSR J1741-0840 seems to have no memory of its previous burst phase while PSR J1840-0840 clearly exhibits memory of its previous state even after longer nulls for both components. We discuss possible explanations for these intriguing nulling-drifting interactions seen in both pulsars based on various pulsar nulling models.
Rosales, Corina; Patel, Niket; Gillard, Baiba K.; Yelamanchili, Dedipya; Yang, Yaliu; Courtney, Harry S.; Santos, Raul D.; Gotto, Antonio M.; Pownall, Henry J.
2016-01-01
The reaction of Streptococcal serum opacity factor (SOF) against plasma high-density lipoproteins (HDL) produces a large cholesteryl ester-rich microemulsion (CERM), a smaller neo HDL that is apolipoprotein (apo) AI-poor, and lipid-free apo AI. SOF is active vs. both human and mouse plasma HDL. In vivo injection of SOF into mice reduces plasma cholesterol ~40% in 3 hours while forming the same products observed in vitro, but at different ratios. Previous studies supported the hypothesis that labile apo AI is required for the SOF reaction vs. HDL. Here we further tested that hypothesis by studies of SOF against HDL from apo AI-null mice. When injected into apo AI-null mice, SOF reduced plasma cholesterol ~35% in three hours. The reaction of SOF vs. apo AI-null HDL in vitro produced a CERM and neo HDL, but no lipid-free apo. Moreover, according to the rate of CERM formation, the extent and rate of the SOF reaction vs. apo AI-null mouse HDL was less than that against wild-type (WT) mouse HDL. Chaotropic perturbation studies using guanidine hydrochloride showed that apo AI-null HDL was more stable than WT HDL. Human apo AI added to apo AI-null HDL was quantitatively incorporated, giving reconstituted HDL. Both SOF and guanidine hydrochloride displaced apo AI from the reconstituted HDL. These results support the conclusion that apo AI-null HDL is more stable than WT HDL because it lacks apo AI, a labile protein that is readily displaced by physico-chemical and biochemical perturbations. Thus, apo AI-null HDL is less SOF-reactive than WT HDL. The properties of apo AI-null HDL can be partially restored to those of WT HDL by the spontaneous incorporation of human apo AI. It remains to be determined what other HDL functions are affected by apo AI deletion. PMID:25790332
Adaptive Nulling for Interferometric Detection of Planets
NASA Technical Reports Server (NTRS)
Lay, Oliver P.; Peters, Robert D.
2010-01-01
An adaptive-nulling method has been proposed to augment the nulling-optical- interferometry method of detection of Earth-like planets around distant stars. The method is intended to reduce the cost of building and aligning the highly precise optical components and assemblies needed for nulling. Typically, at the mid-infrared wavelengths used for detecting planets orbiting distant stars, a star is millions of times brighter than an Earth-sized planet. In order to directly detect the light from the planet, it is necessary to remove most of the light coming from the star. Nulling interferometry is one way to suppress the light from the star without appreciably suppressing the light from the planet. In nulling interferometry in its simplest form, one uses two nominally identical telescopes aimed in the same direction and separated laterally by a suitable distance. The light collected by the two telescopes is processed through optical trains and combined on a detector. The optical trains are designed such that the electric fields produced by an on-axis source (the star) are in anti-phase at the detector while the electric fields from the planet, which is slightly off-axis, combine in phase, so that the contrast ratio between the star and the planet is greatly decreased. If the electric fields from the star are exactly equal in amplitude and opposite in phase, then the star is effectively nulled out. Nulling is effective only if it is complete in the sense that it occurs simultaneously in both polarization states and at all wavelengths of interest. The need to ensure complete nulling translates to extremely tight demands upon the design and fabrication of the complex optical trains: The two telescopes must be highly symmetric, the reflectivities of the many mirrors in the telescopes and other optics must be carefully tailored, the optical coatings must be extremely uniform, sources of contamination must be minimized, optical surfaces must be nearly ideal, and alignments must be extremely precise. Satisfaction of all of these requirements entails substantial cost.
NASA Astrophysics Data System (ADS)
Stahle, D.; Griffin, D.; Cleaveland, M.; Fye, F.; Meko, D.; Cayan, D.; Dettinger, M.; Redmond, K.
2007-05-01
A new network of 36 moisture sensitive tree-ring chronologies has been developed in and near the drainage basins of the Sacramento and San Joaquin Rivers. The network is based entirely on blue oak (Quercus douglasii), which is a California endemic found from the lower forest border up into the mixed conifer zone in the Coast Ranges, Sierra Nevada, and Cascades. These blue oak tree-ring chronologies are highly correlated with winter-spring precipitation totals, Sacramento and San Joaquin streamflow, and with seasonal variations in salinity and null zone position in San Francisco Bay. Null zone is the non-tidal bottom water location where density-driven salinity and river-driven freshwater currents balance (zero flow). It is the area of highest turbidity, water residence time, sediment accumulation, and net primary productivity in the estuary. Null zone position is measured by the distance from the Golden Gate of the 2 per mil bottom water isohaline and is primarily controlled by discharge from the Sacramento and San Joaquin Rivers (and ultimately by winter-spring precipitation). The location of the null zone is an estuarine habitat indicator, a policy variable used for ecosystem management, and can have a major impact on biological resources in the San Francisco estuary. Precipitation-sensitive blue oak chronologies can be used to estimate null zone position based on the strong biogeophysical interaction among terrestrial, aquatic, and estuarine ecosystems, orchestrated by precipitation. The null zone reconstruction is 626-years long and provides a unique long term perspective on the interannual to decadal variability of this important estuarine habitat indicator. Consecutive two-year droughts (or longer) allow the null zone to shrink into the confined upper reaches of Suisun Bay, causing a dramatic reduction in phytoplankton production and favoring colonization of the estuary by marine biota. The reconstruction indicates an approximate 10 year recurrence interval between these consecutive two-year droughts and null zone maxima. Composite analyses of the Palmer drought index over North America indicate that the drought and wetness regimes associated with maxima and minima in reconstructed null zone position are largely restricted to the California sector. Composite analyses of the 20th century global sea surface temperature (SST) field indicate that wet years over central California with good oak growth, high flows, and a seaward position for the null zone (minima) are associated with warm El Nino conditions and a "Pineapple Express" SST pattern. The composite SST pattern is not as strong during dry years with poor growth, low flows, and a landward position of the null zone (maxima), but the composite warm SST anomaly in the eastern North Pacific during maxima would be consistent with a persistent ridge and drought over western North America.
Earthquake likelihood model testing
Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.
2007-01-01
INTRODUCTIONThe Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits.Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions.In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative.Statistical testing of hypotheses is a common task and a wide range of possible testing procedures exist. Jolliffe and Stephenson (2003) present different forecast verifications from atmospheric science, among them likelihood testing of probability forecasts and testing the occurrence of binary events. Testing binary events requires that for each forecasted event, the spatial, temporal and magnitude limits be given. Although major earthquakes can be considered binary events, the models within the RELM project express their forecasts on a spatial grid and in 0.1 magnitude units; thus the results are a distribution of rates over space and magnitude. These forecasts can be tested with likelihood tests.In general, likelihood tests assume a valid null hypothesis against which a given hypothesis is tested. The outcome is either a rejection of the null hypothesis in favor of the test hypothesis or a nonrejection, meaning the test hypothesis cannot outperform the null hypothesis at a given significance level. Within RELM, there is no accepted null hypothesis and thus the likelihood test needs to be expanded to allow comparable testing of equipollent hypotheses.To test models against one another, we require that forecasts are expressed in a standard format: the average rate of earthquake occurrence within pre-specified limits of hypocentral latitude, longitude, depth, magnitude, time period, and focal mechanisms. Focal mechanisms should either be described as the inclination of P-axis, declination of P-axis, and inclination of the T-axis, or as strike, dip, and rake angles. Schorlemmer and Gerstenberger (2007, this issue) designed classes of these parameters such that similar models will be tested against each other. These classes make the forecasts comparable between models. Additionally, we are limited to testing only what is precisely defined and consistently reported in earthquake catalogs. Therefore it is currently not possible to test such information as fault rupture length or area, asperity location, etc. Also, to account for data quality issues, we allow for location and magnitude uncertainties as well as the probability that an event is dependent on another event.As we mentioned above, only models with comparable forecasts can be tested against each other. Our current tests are designed to examine grid-based models. This requires that any fault-based model be adapted to a grid before testing is possible. While this is a limitation of the testing, it is an inherent difficulty in any such comparative testing. Please refer to appendix B for a statistical evaluation of the application of the Poisson hypothesis to fault-based models.The testing suite we present consists of three different tests: L-Test, N-Test, and R-Test. These tests are defined similarily to Kagan and Jackson (1995). The first two tests examine the consistency of the hypotheses with the observations while the last test compares the spatial performances of the models.
Manna, Soumen K.; Patterson, Andrew D.; Yang, Qian; Krausz, Kristopher W.; Li, Henghong; Idle, Jeffrey R.; Fornace, Albert J.; Gonzalez, Frank J.
2010-01-01
Alcohol-induced liver disease (ALD) is a leading cause of non-accident-related deaths in the United States. Although liver damage caused by ALD is reversible when discovered at the earlier stages, current risk assessment tools are relatively non-specific. Identification of an early specific signature of ALD would aid in therapeutic intervention and recovery. In this study the metabolic changes associated with alcohol-induced liver disease were examined using alcohol-fed male Ppara-null mouse as a model of ALD. Principal components analysis of the mass spectrometry-based urinary metabolic profile showed that alcohol-treated wild-type and Ppara-null mice could be distinguished from control animals without information on history of alcohol consumption. The urinary excretion of ethyl-sulfate, ethyl-β-D-glucuronide, 4-hydroxyphenylacetic acid, and 4-hydroxyphenylacetic acid sulfate was elevated and that of the 2-hydroxyphenylacetic acid, adipic acid, and pimelic acid was depleted during alcohol treatment in both wild-type and the Ppara-null mice albeit to different extents. However, indole-3-lactic acid was exclusively elevated by alcohol exposure in Ppara-null mice. The elevation of indole-3-lactic acid is mechanistically related to the molecular events associated with development of ALD in alcohol-treated Ppara-null mice. This study demonstrated the ability of metabolomics approach to identify early, noninvasive biomarkers of ALD pathogenesis in Ppara-null mouse model. PMID:20540569
pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis
NASA Astrophysics Data System (ADS)
White, J.; Brakefield, L. K.
2015-12-01
The null-space monte carlo technique is a non-linear uncertainty analyses technique that is well-suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space monte carlo is cumbersome, requiring the use of multiple commandline utilities, several sets of intermediate files and even a text editor. pyNSMC is an open-source python module that automates the workflow of null-space monte carlo uncertainty analyses. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space monte carlo by taking advantage of object oriented design facilities in python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space monte carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease-of-use of the pyNSMC workflow is compared to the existing workflow for null-space monte carlo for a synthetic groundwater model with hundreds of estimable parameters.
Abbas, Shania; Raza, Syed Tasleem; Chandra, Anu; Rizvi, Saliha; Ahmed, Faisal; Eba, Ale; Mahdi, Farzana
2015-01-01
Hypertension has a multi-factorial background based on genetic and environmental interactive factors. ACE, FABP2 and GST genes have been suggested to be involved in the development of hypertension. However, the results have been inconsistent. The present study was carried out to investigate the association of ACE (rs4646994), FABP2 (rs1799883) and GST (GSTM1 null or positive genotype and GSTT1 null or positive genotype) genes polymorphism with essential HTN cases and controls. This study includes 138 essential hypertension (HTN) patients and 116 age-, sex- and ethnicity-matched control subjects. GST (GSTM1 null or positive genotype and GSTT1 null or positive genotype) genes polymorphisms were evaluated by multiplex PCR, ACE (rs4646994) gene polymorphisms by PCR and FABP2 (rs1799883) gene polymorphisms by PCR-RFLP method. Significant differences were obtained in the frequencies of ACE DD, II genotype (p = 0.006, 0.003), GSTT1 null, GSTM1 positive genotype (p = 0.048, 0.010) and FABP2 Ala54/Ala54 genotype (p = 0.049) between essential HTN cases and controls. It is concluded that ACE (rs 4646994), FABP2 (rs1799883) and GST (GSTM1 null or positive genotype and GSTT1 null or positive genotype) genes polymorphism are associated with HTN. Further investigation with a larger sample size may be required to validate this study.
FlnA-null megakaryocytes prematurely release large and fragile platelets that circulate poorly
Jurak Begonja, Antonija; Hoffmeister, Karin M.; Hartwig, John H.
2011-01-01
Filamin A (FlnA) is a large cytoplasmic protein that crosslinks actin filaments and anchors membrane receptors and signaling intermediates. FlnAloxP PF4-Cre mice that lack FlnA in the megakaryocyte (MK) lineage have a severe macrothrombocytopenia because of accelerated platelet clearance. Macrophage ablation by injection of clodronate-encapsulated liposomes increases blood platelet counts in FlnAloxP PF4-Cre mice and reveals the desintegration of FlnA-null platelets into microvesicles, a process that occurs spontaneously during storage. FlnAloxP PF4-Cre bone marrows and spleens have a 2.5- to 5-fold increase in MK numbers, indicating increased thrombopoiesis in vivo. Analysis of platelet production in vitro reveals that FlnA-null MKs prematurely convert their cytoplasm into large CD61+ platelet-sized particles, reminiscent of the large platelets observed in vivo. FlnA stabilizes the platelet von Willebrand factor receptor, as surface expression of von Willebrand factor receptor components is normal on FlnA-null MKs but decreased on FlnA-null platelets. Further, FlnA-null platelets contain multiple GPIbα degradation products and have increased expression of the ADAM17 and MMP9 metalloproteinases. Together, the findings indicate that FlnA-null MKs prematurely release large and fragile platelets that are removed rapidly from the circulation by macrophages. PMID:21652675
31 CFR 536.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... a specially designated narcotics trafficker has or has had an interest since such date, is null and... which otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any person with whom such property was...
31 CFR 585.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... authorization hereunder and involves any property or interest in property blocked pursuant to § 585.201 is null... hereunder. (d) Transfers of property which otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any...
Null Effects and Publication Bias in Special Education Research
ERIC Educational Resources Information Center
Cook, Bryan G.; Therrien, William J.
2017-01-01
Researchers sometimes conduct a study and find that the predicted relation between variables did not exist or that the intervention did not have a positive impact on student outcomes; these are referred to as null findings because they fail to disconfirm the null hypothesis. Rather than consider such studies as failures and disregard the null…
A Short Term Real Time Study in Syntactic Change.
ERIC Educational Resources Information Center
Duarte, Maria Eugenia Lamoglia
Recent research has shown that Brazilian Portuguese is undergoing a change regarding the null subject parameter, evolving from a null subject to a non-null subject language. This paper presents the results of a short term, real time study of speakers of Brazilian Portuguese with low and mid levels of formal education. The study was based on…
Seasonal reproductive endothermy in tegu lizards.
Tattersall, Glenn J; Leite, Cleo A C; Sanders, Colin E; Cadena, Viviana; Andrade, Denis V; Abe, Augusto S; Milsom, William K
2016-01-01
With some notable exceptions, small ectothermic vertebrates are incapable of endogenously sustaining a body temperature substantially above ambient temperature. This view was challenged by our observations of nighttime body temperatures sustained well above ambient (up to 10°C) during the reproductive season in tegu lizards (~2 kg). This led us to hypothesize that tegus have an enhanced capacity to augment heat production and heat conservation. Increased metabolic rates and decreased thermal conductance are the same mechanisms involved in body temperature regulation in those vertebrates traditionally acknowledged as "true endotherms": the birds and mammals. The appreciation that a modern ectotherm the size of the earliest mammals can sustain an elevated body temperature through metabolic rates approaching that of endotherms enlightens the debate over endothermy origins, providing support for the parental care model of endothermy, but not for the assimilation capacity model of endothermy. It also indicates that, contrary to prevailing notions, ectotherms can engage in facultative endothermy, providing a physiological analog in the evolutionary transition to true endothermy.
Seasonal reproductive endothermy in tegu lizards
Tattersall, Glenn J.; Leite, Cleo A. C.; Sanders, Colin E.; Cadena, Viviana; Andrade, Denis V.; Abe, Augusto S.; Milsom, William K.
2016-01-01
With some notable exceptions, small ectothermic vertebrates are incapable of endogenously sustaining a body temperature substantially above ambient temperature. This view was challenged by our observations of nighttime body temperatures sustained well above ambient (up to 10°C) during the reproductive season in tegu lizards (~2 kg). This led us to hypothesize that tegus have an enhanced capacity to augment heat production and heat conservation. Increased metabolic rates and decreased thermal conductance are the same mechanisms involved in body temperature regulation in those vertebrates traditionally acknowledged as “true endotherms”: the birds and mammals. The appreciation that a modern ectotherm the size of the earliest mammals can sustain an elevated body temperature through metabolic rates approaching that of endotherms enlightens the debate over endothermy origins, providing support for the parental care model of endothermy, but not for the assimilation capacity model of endothermy. It also indicates that, contrary to prevailing notions, ectotherms can engage in facultative endothermy, providing a physiological analog in the evolutionary transition to true endothermy. PMID:26844295
Singh, Navneet; Chakrabarty, Subhas
2013-11-15
We recently reported on the isolation and characterization of calcium sensing receptor (CaSR) null human colon cancer cells (Singh et al., Int J Cancer 2013; 132: 1996-2005). CaSR null cells possess a myriad of molecular features that are linked to a highly malignant and drug resistant phenotype of colon cancer. The CaSR null phenotype can be maintained in defined human embryonic stem cell culture medium. We now show that the CaSR null cells can be induced to differentiate in conventional culture medium, regained the expression of CaSR with a concurrent reversal of the cellular and molecular features associated with the null phenotype. These features include cellular morphology, expression of colon cancer stem cell markers, expression of survivin and thymidylate synthase and sensitivity to fluorouracil. Other features include the expression of epithelial mesenchymal transition linked molecules and transcription factors, oncogenic miRNAs and tumor suppressive molecule and miRNA. With the exception of cancer stem cell markers, the reversal of molecular features, upon the induction of CaSR expression, is directly linked to the expression and function of CaSR because blocking CaSR induction by shRNA circumvented such reversal. We further report that methylation and demethylation of the CaSR gene promoter underlie CaSR expression. Due to the malignant nature of the CaSR null cells, inclusion of the CaSR null phenotype in disease management may improve on the mortality of this disease. Because CaSR is a robust promoter of differentiation and mediates its action through diverse mechanisms and pathways, inactivation of CaSR may serve as a new paradigm in colon carcinogenesis. Copyright © 2013 UICC.
Xu, Guogang; Vogel, Kristine S; McMahan, C Alex; Herbert, Damon C; Walter, Christi A
2010-12-01
During the first wave of spermatogenesis, and in response to ionizing radiation, elevated mutant frequencies are reduced to a low level by unidentified mechanisms. Apoptosis is occurring in the same time frame that the mutant frequency declines. We examined the role of apoptosis in regulating mutant frequency during spermatogenesis. Apoptosis and mutant frequencies were determined in spermatogenic cells obtained from Bax-null or Trp53-null mice. The results showed that spermatogenic lineage apoptosis was markedly decreased in Bax-null mice and was accompanied by a significantly increased spontaneous mutant frequency in seminiferous tubule cells compared to that of wild-type mice. Apoptosis profiles in the seminiferous tubules for Trp53-null were similar to control mice. Spontaneous mutant frequencies in pachytene spermatocytes and in round spermatids from Trp53-null mice were not significantly different from those of wild-type mice. However, epididymal spermatozoa from Trp53-null mice displayed a greater spontaneous mutant frequency compared to that from wild-type mice. A greater proportion of spontaneous transversions and a greater proportion of insertions/deletions 15 days after ionizing radiation were observed in Trp53-null mice compared to wild-type mice. Base excision repair activity in mixed germ cell nuclear extracts prepared from Trp53-null mice was significantly lower than that for wild-type controls. These data indicate that BAX-mediated apoptosis plays a significant role in regulating spontaneous mutagenesis in seminiferous tubule cells obtained from neonatal mice, whereas tumor suppressor TRP53 plays a significant role in regulating spontaneous mutagenesis between postmeiotic round spermatid and epididymal spermatozoon stages of spermiogenesis.
Desmin Cytoskeleton Linked to Muscle Mitochondrial Distribution and Respiratory Function
Milner, Derek J.; Mavroidis, Manolis; Weisleder, Noah; Capetanaki, Yassemi
2000-01-01
Ultrastructural studies have previously suggested potential association of intermediate filaments (IFs) with mitochondria. Thus, we have investigated mitochondrial distribution and function in muscle lacking the IF protein desmin. Immunostaining of skeletal muscle tissue sections, as well as histochemical staining for the mitochondrial marker enzymes cytochrome C oxidase and succinate dehydrogenase, demonstrate abnormal accumulation of subsarcolemmal clumps of mitochondria in predominantly slow twitch skeletal muscle of desmin-null mice. Ultrastructural observation of desmin-null cardiac muscle demonstrates in addition to clumping, extensive mitochondrial proliferation in a significant fraction of the myocytes, particularly after work overload. These alterations are frequently associated with swelling and degeneration of the mitochondrial matrix. Mitochondrial abnormalities can be detected very early, before other structural defects become obvious. To investigate related changes in mitochondrial function, we have analyzed ADP-stimulated respiration of isolated muscle mitochondria, and ADP-stimulated mitochondrial respiration in situ using saponin skinned muscle fibers. The in vitro maximal rates of respiration in isolated cardiac mitochondria from desmin-null and wild-type mice were similar. However, mitochondrial respiration in situ is significantly altered in desmin-null muscle. Both the maximal rate of ADP-stimulated oxygen consumption and the dissociation constant (K m) for ADP are significantly reduced in desmin-null cardiac and soleus muscle compared with controls. Respiratory parameters for desmin-null fast twitch gastrocnemius muscle were unaffected. Additionally, respiratory measurements in the presence of creatine indicate that coupling of creatine kinase and the adenine translocator is lost in desmin-null soleus muscle. This coupling is unaffected in cardiac muscle from desmin-null animals. All of these studies indicate that desmin IFs play a significant role in mitochondrial positioning and respiratory function in cardiac and skeletal muscle. PMID:10995435
The prolyl isomerase Pin1 modulates development of CD8+ cDC in mice.
Barberi, Theresa J; Dunkle, Alexis; He, You-Wen; Racioppi, Luigi; Means, Anthony R
2012-01-01
Pin1 has previously been described to regulate cells that participate in both innate and adaptive immunity. Thus far, however, no role for Pin1 has been described in modulating conventional dendritic cells, innate antigen presenting cells that potently activate naïve T cells, thereby bridging innate and adaptive immune responses. When challenged with LPS, Pin1-null mice failed to accumulate spleen conventional dendritic cells (cDC). Analysis of steady-state spleen DC populations revealed that Pin1-null mice had fewer CD8+ cDC. This defect was recapitulated by culturing Pin1-null bone marrow with the DC-instructive cytokine Flt3 Ligand. Additionally, injection of Flt3 Ligand for 9 days failed to induce robust expansion of CD8+ cDC in Pin1-null mice. Upon infection with Listeria monocytogenes, Pin1-null mice were defective in stimulating proliferation of adoptively transferred WT CD8+ T cells, suggesting that decreases in Pin1 null CD8+ cDC may affect T cell responses to infection in vivo. Finally, upon analyzing expression of proteins involved in DC development, elevated expression of PU.1 was detected in Pin1-null cells, which resulted from an increase in PU.1 protein half-life. We have identified a novel role for Pin1 as a modulator of CD8+ cDC development. Consistent with reduced numbers of CD8+ cDC in Pin1-null mice, we find that the absence of Pin1 impairs CD8+ T cell proliferation in response to infection with Listeria monocytogenes. These data suggest that, via regulation of CD8+ cDC production, Pin1 may serve as an important modulator of adaptive immunity.
Hepatic effects of a methionine-choline-deficient diet in hepatocyte RXRalpha-null mice.
Gyamfi, Maxwell Afari; Tanaka, Yuji; He, Lin; Klaassen, Curtis D; Wan, Yu-Jui Yvonne
2009-01-15
Retinoid X receptor-alpha (RXRalpha) is an obligate partner for several nuclear hormone receptors that regulate important physiological processes in the liver. In this study the impact of hepatocyte RXRalpha deficiency on methionine and choline deficient (MCD) diet-induced steatosis, oxidative stress, inflammation, and hepatic transporters gene expression were examined. The mRNA of sterol regulatory element-binding protein (SREBP)-regulated genes, important for lipid synthesis, were not altered in wild type (WT) mice, but were increased 2.0- to 5.4-fold in hepatocyte RXRalpha-null (H-RXRalpha-null) mice fed a MCD diet for 14 days. Furthermore, hepatic mRNAs and proteins essential for fatty acid beta-oxidation were not altered in WT mice, but were decreased in the MCD diet-fed H-RXRalpha-null mice, resulting in increased hepatic free fatty acid levels. Cyp2e1 enzyme activity and lipid peroxide levels were induced only in MCD-fed WT mice. In contrast, hepatic mRNA levels of pro-inflammatory factors were increased only in H-RXRalpha-null mice fed the MCD diet. Hepatic uptake transporters Oatp1a1 and Oatp1b2 mRNA levels were decreased in WT mice fed the MCD diet, whereas the efflux transporter Mrp4 was increased. However, in the H-RXRalpha-null mice, the MCD diet only moderately decreased Oatp1a1 and induced both Oatp1a4 and Mrp4 gene expression. Whereas the MCD diet increased serum bile acid levels and alkaline phosphatase activity in both WT and H-RXRalpha-null mice, serum ALT levels were induced (2.9-fold) only in the H-RXRalpha-null mice. In conclusion, these data suggest a critical role for RXRalpha in hepatic fatty acid homeostasis and protection against MCD-induced hepatocyte injury.
Altered Anterior Segment Biometric Parameters in Mice Deficient in SPARC.
Ho, Henrietta; Htoon, Hla M; Yam, Gary Hin-Fai; Toh, Li Zhen; Lwin, Nyein Chan; Chu, Stephanie; Lee, Ying Shi; Wong, Tina T; Seet, Li-Fong
2017-01-01
Secreted protein acidic and rich in cysteine (SPARC) and Hevin are structurally related matricellular proteins involved in extracellular matrix assembly. In this study, we compared the anterior chamber biometric parameters and iris collagen properties in SPARC-, Hevin- and SPARC-/Hevin-null with wild-type (WT) mice. The right eyes of 53 WT, 35 SPARC-, 56 Hevin-, and 63 SPARC-/Hevin-null mice were imaged using the RTVue-100 Fourier-domain optical coherence tomography system. The parameters measured were anterior chamber depth (ACD), trabecular-iris space area (TISA), angle opening distance (AOD), and pupil diameter. Biometric data were analyzed using analysis of covariance and adjusted for age, sex, and pupil diameter. Expression of Col1a1, Col8a1, and Col8a2 transcripts in the irises was measured by quantitative polymerase chain reaction. Collagen fibril thickness was evaluated by transmission electron microscopy. Mice that were SPARC- and SPARC-/Hevin-null had 1.28- and 1.25-fold deeper ACD, 1.45- and 1.53-fold larger TISA, as well as 1.42- and 1.51-fold wider AOD than WT, respectively. These measurements were not significantly different between SPARC- and SPARC-/Hevin-null mice. The SPARC-null iris expressed lower Col1a1, but higher Col8a1 and Col8a2 transcripts compared with WT. Collagen fibrils in the SPARC- and SPARC-/Hevin-null irises were 1.5- and 1.7-fold thinner than WT, respectively. The Hevin-null iris did not differ from WT in these collagen properties. SPARC-null mice have deeper anterior chamber as well as wider drainage angles compared with WT. Therefore, SPARC plays a key role in influencing the spatial organization of the anterior segment, potentially via modulation of collagen properties, while Hevin is not likely to be involved.
Park, Jin-Ah; Kim, Jung-Mi; Park, Seung-Moon; Kim, Dae-Hyuk
2012-04-01
The gene CpSte11 of Cryphonectria parasitica, which encodes a yeast Ste11 homologue, was cloned and characterized. Gene replacement analysis revealed a high frequency of CpSte11 null mutants. When compared with the wild-type parent strain, CpSte11 null mutants showed no difference in terms of growth rate or pigmentation. However, CpSte11 null mutants showed a marked decrease in both the number and size of stromal pustules on chestnut twigs. The virulence test showed that, in comparison with those of the wild-type and virus-infected hypovirulent strains, CpSte11 null mutants produced necrotic areas of intermediate size. Disruption of the CpSte11 gene also resulted in defects in female fertility. Down-regulation of transcripts for the mating pheromone precursor gene, Mf2/2, and mating response transcription factors, such as cpst12 and pro1, was observed in CpSte11 null mutants. The down-regulation of Mf2/2, cpst12 and pro1 was also observed in the mutant phenotype of Cpmk2, a mating response Fus3-like mitogen-activated protein kinase (MAPK) gene, but not in the mutant of Cpmk1, a high-osmolarity glycerol Hog1-like MAPK gene. These results indicate that the cloned CpSte11 gene is functionally involved in the mating response pathway and acts through downstream targets, including Cpmk2, cpst12, pro1 and Mf2/2. However, the characteristics of the CpSte11 null mutant were fully phenocopied only in the cpst12 null mutant, but not in other studied null mutants of components of the putative mating response pathway. © 2011 THE AUTHORS. MOLECULAR PLANT PATHOLOGY © 2011 BSPP AND BLACKWELL PUBLISHING LTD.
Subpulse drifting, nulling, and mode changing in PSR J1822-2256
NASA Astrophysics Data System (ADS)
Basu, Rahul; Mitra, Dipanjan
2018-05-01
We report a detailed observational study of the single pulses from the pulsar J1822-2256. The pulsar shows the presence of subpulse drifting, nulling as well as multiple emission modes. During these observations the pulsar existed primarily in two modes; mode A with prominent drift bands and mode B which was more disorderly without any clear subpulse drifting. A third mode C was also seen for a short duration with a different drifting periodicity compared to mode A. The nulls were present throughout the observations but were more frequent during the disorderly B mode. The nulling also exhibited periodicity with a clear peak in the fluctuation spectra. Before the transition from mode A to nulling the pulsar switched to a third drifting state with periodicity different from both mode A and C. The diversity seen in the single pulse behaviour from the pulsar J1822-2256 provides an unique window into the emission physics.
Dowdall, Jayme R.; Sadow, Peter M.; Hartnick, Christopher; Vinarsky, Vladimir; Mou, Hongmei; Zhao, Rui; Song, Phillip C.; Franco, Ramon A.; Rajagopal, Jayaraj
2016-01-01
Objectives/Hypothesis A precise molecular schema for classifying the different cell types of the normal human vocal fold epithelium is lacking. We hypothesize that the true vocal fold epithelium has a cellular architecture and organization similar to that of other stratified squamous epithelia including the skin, cornea, oral mucosa, and esophagus. In analogy to disorders of the skin and gastrointestinal tract, a molecular definition of the normal cell types within the human vocal fold epithelium and a description of their geometric relationships should serve as a foundation for characterizing cellular changes associated with metaplasia, dysplasia, and cancer. Study Design Qualitative study with adult human larynges. Methods Histologic sections of normal human laryngeal tissue were analyzed for morphology (hematoxylin and eosin) and immunohistochemical protein expression profile, including cytokeratins (CK13 and CK14), cornified envelope proteins (involucrin), basal cells (NGFR/p75), and proliferation markers (Ki67). Results We demonstrated that three distinct cell strata with unique marker profiles are present within the stratified squamous epithelium of the true vocal fold. We used these definitions to establish that cell proliferation is restricted to certain cell types and layers within the epithelium. These distinct cell types are reproducible across five normal adult larynges. Conclusion We have established that three layers of cells are present within the normal adult stratified squamous epithelium of the true vocal fold. Furthermore, replicating cell populations are largely restricted to the parabasal strata within the epithelium. This delineation of distinct cell populations will facilitate future studies of vocal fold regeneration and cancer. Level of Evidence N/A. PMID:25988619
Dowdall, Jayme R; Sadow, Peter M; Hartnick, Christopher; Vinarsky, Vladimir; Mou, Hongmei; Zhao, Rui; Song, Phillip C; Franco, Ramon A; Rajagopal, Jayaraj
2015-09-01
A precise molecular schema for classifying the different cell types of the normal human vocal fold epithelium is lacking. We hypothesize that the true vocal fold epithelium has a cellular architecture and organization similar to that of other stratified squamous epithelia including the skin, cornea, oral mucosa, and esophagus. In analogy to disorders of the skin and gastrointestinal tract, a molecular definition of the normal cell types within the human vocal fold epithelium and a description of their geometric relationships should serve as a foundation for characterizing cellular changes associated with metaplasia, dysplasia, and cancer. Qualitative study with adult human larynges. Histologic sections of normal human laryngeal tissue were analyzed for morphology (hematoxylin and eosin) and immunohistochemical protein expression profile, including cytokeratins (CK13 and CK14), cornified envelope proteins (involucrin), basal cells (NGFR/p75), and proliferation markers (Ki67). We demonstrated that three distinct cell strata with unique marker profiles are present within the stratified squamous epithelium of the true vocal fold. We used these definitions to establish that cell proliferation is restricted to certain cell types and layers within the epithelium. These distinct cell types are reproducible across five normal adult larynges. We have established that three layers of cells are present within the normal adult stratified squamous epithelium of the true vocal fold. Furthermore, replicating cell populations are largely restricted to the parabasal strata within the epithelium. This delineation of distinct cell populations will facilitate future studies of vocal fold regeneration and cancer. N/A. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.
New Methods of Entanglement with Spatial Modes of Light
2014-02-01
Poincare beam by state nulling. ....................................... 15 Figure 13: Poincare patterns measured by imaging polarimetry ...perform imaging polarimetry . This entails taking six single photon images, pixel by pixel, after the passage through six different polarization filters...state nulling [21,22] and by imaging polarimetry [24]. Figure 12 shows the result of state nulling measurements in diagnosing the mode of a Poincare
ERIC Educational Resources Information Center
Trafimow, David
2017-01-01
There has been much controversy over the null hypothesis significance testing procedure, with much of the criticism centered on the problem of inverse inference. Specifically, p gives the probability of the finding (or one more extreme) given the null hypothesis, whereas the null hypothesis significance testing procedure involves drawing a…
31 CFR 588.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... any property or interest in property blocked pursuant to § 588.201(a), is null and void and shall not... part. (d) Transfers of property that otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any person with...
31 CFR 594.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... any property or interest in property blocked pursuant to § 594.201(a), is null and void and shall not... property that otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any person with whom such property was...
31 CFR 543.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... any property or interest in property blocked pursuant to § 543.201(a), is null and void and shall not..., instruction, or license issued pursuant to this part. (d) Transfers of property that otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null...
31 CFR 548.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... interest in property blocked pursuant to § 548.201(a), is null and void and shall not be the basis for the..., instruction, or license issued pursuant to this part. (d) Transfers of property that otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null...
31 CFR 538.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... property or interest in property blocked pursuant to § 538.201 is null and void and shall not be the basis... be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any person with whom such property was held or maintained...
31 CFR 545.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., and that involves any property or interest in property blocked pursuant to § 545.201(a), is null and... property that otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any person with whom such property was...
31 CFR 541.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... any property or interest in property blocked pursuant to § 541.201(a), is null and void and shall not... part. (d) Transfers of property that otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any person with...
31 CFR 597.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... such date, is null and void and shall not be the basis for the assertion or recognition of any interest... assets which otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or unenforceable as to any financial institution with whom...
31 CFR 587.202 - Effect of transfers violating the provisions of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... blocked pursuant to § 587.201(a), is null and void and shall not be the basis for the assertion or... issued pursuant to this part. (d) Transfers of property that otherwise would be null and void or unenforceable by virtue of the provisions of this section shall not be deemed to be null and void or...