Sample records for formal statistical tests

  1. Formal Operations and Learning Style Predict Success in Statistics and Computer Science Courses.

    ERIC Educational Resources Information Center

    Hudak, Mary A.; Anderson, David E.

    1990-01-01

    Studies 94 undergraduate students in introductory statistics and computer science courses. Applies the Formal Operations Reasoning Test (FORT) and Kolb's Learning Style Inventory (LSI). Finds that substantial numbers of students have not achieved the formal-operational level of cognitive maturity. Emphasizes the need to examine students' learning styles and…

  2. Effectiveness of groundwater governance structures and institutions in Tanzania

    NASA Astrophysics Data System (ADS)

    Gudaga, J. L.; Kabote, S. J.; Tarimo, A. K. P. R.; Mosha, D. B.; Kashaigili, J. J.

    2018-05-01

    This paper examines the effectiveness of groundwater governance structures and institutions in Mbarali District, Mbeya Region. The paper adopts an exploratory sequential research design to collect quantitative and qualitative data. A random sample of 90 groundwater users, 50% of them women, was surveyed. Descriptive statistics, the Kruskal-Wallis H test and the Mann-Whitney U test were used to compare differences in responses between groups, while qualitative data were subjected to content analysis. The results show that the Village Councils and Community Water Supply Organizations (COWSOs) were effective in governing groundwater. The results also show a statistically significant difference in the overall extent of effectiveness of the Village Councils in governing groundwater between villages (P = 0.0001), yet there was no significant difference (P > 0.05) between male and female responses on the effectiveness of Village Councils, village water committees and COWSOs. The Mann-Whitney U test showed a statistically significant difference between male and female responses on the effectiveness of formal and informal institutions (P = 0.0001), with informal institutions rated more effective than formal ones. The Kruskal-Wallis H test also showed a statistically significant difference (P ≤ 0.05) in the extent of effectiveness of formal institutions, norms and values between the low, medium and high categories. The paper concludes that COWSOs were more effective in governing groundwater than the other governance structures. Similarly, norms and values were more effective than formal institutions. The paper recommends sensitization and awareness creation on formal institutions so that they can influence water users' behaviour to govern groundwater.
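
    The group comparisons this abstract reports are easy to reproduce in outline. Below is a minimal SciPy sketch with invented Likert-style ratings standing in for the survey responses; the group names and numbers are placeholders, not the study's data.

        # Hypothetical effectiveness ratings (1-5 Likert) for three villages;
        # the real survey data are not reproduced here.
        from scipy.stats import kruskal, mannwhitneyu

        village_a = [4, 5, 3, 4, 4, 5, 2, 4]
        village_b = [2, 3, 2, 1, 3, 2, 2, 3]
        village_c = [3, 4, 3, 3, 2, 4, 3, 3]

        # Kruskal-Wallis H test: do the villages differ in rated effectiveness?
        h_stat, p_villages = kruskal(village_a, village_b, village_c)

        # Mann-Whitney U test: do male and female respondents differ?
        male = [3, 4, 2, 4, 3, 3, 4]
        female = [4, 3, 3, 4, 4, 3, 5]
        u_stat, p_gender = mannwhitneyu(male, female, alternative="two-sided")

        print(f"Kruskal-Wallis H={h_stat:.2f}, p={p_villages:.4f}")
        print(f"Mann-Whitney U={u_stat:.1f}, p={p_gender:.4f}")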

  3. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    ERIC Educational Resources Information Center

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-01-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics,…

  4. Inference as Prediction

    ERIC Educational Resources Information Center

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  5. The Development of Introductory Statistics Students' Informal Inferential Reasoning and Its Relationship to Formal Inferential Reasoning

    ERIC Educational Resources Information Center

    Jacob, Bridgette L.

    2013-01-01

    The difficulties introductory statistics students have with formal statistical inference are well known in the field of statistics education. "Informal" statistical inference has been studied as a means to introduce inferential reasoning well before and without the formalities of formal statistical inference. This mixed methods study…

  6. A New Statistic for Evaluating Item Response Theory Models for Ordinal Data. CRESST Report 839

    ERIC Educational Resources Information Center

    Cai, Li; Monroe, Scott

    2014-01-01

    We propose a new limited-information goodness-of-fit test statistic C₂ for ordinal IRT models. The construction of the new statistic lies formally between the M₂ statistic of Maydeu-Olivares and Joe (2006), which utilizes first- and second-order marginal probabilities, and the M₂* statistic of Cai and Hansen…

  7. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes: incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way, in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  8. Immersive Theater - a Proven Way to Enhance Learning Retention

    NASA Astrophysics Data System (ADS)

    Reiff, P. H.; Zimmerman, L.; Spillane, S.; Sumners, C.

    2014-12-01

    The portable immersive theater has gone from our first demonstration at fall AGU 2003 to a product offered by multiple companies in various versions to literally millions of users per year. As part of our NASA-funded outreach program, we conducted a test of learning in a portable Discovery Dome as contrasted with learning the same materials (visuals and sound track) on a computer screen. We tested 200 middle school students (primarily underserved minorities). Paired t-tests and an independent t-test were used to compare the amount of learning that students achieved. Interest questionnaires were administered to participants in formal (public school) settings and focus groups were conducted in informal (museum camp and educational festival) settings. Overall results from the informal and formal educational settings indicated that there was a statistically significant increase in test scores after viewing We Choose Space. There was a statistically significant increase in test scores for students who viewed We Choose Space in the portable Discovery Dome (9.75) as well as with the computer (8.88). However, long-term retention of the material tested on the questionnaire indicated that for students who watched We Choose Space in the portable Discovery Dome, there was a statistically significant long-term increase in test scores (10.47), whereas, six weeks after learning on the computer, the improvements over the initial baseline (3.49) were far less and were not statistically significant. The test score improvement six weeks after learning in the dome was essentially the same as on the post-test immediately after watching the show, demonstrating virtually no loss of gained information in the six-week interval. In the formal educational setting, approximately 34% of the respondents indicated that they wanted to learn more about becoming a scientist, while 35% expressed an interest in a career in space science. In the informal setting, 26% indicated that they were interested in pursuing a career in space science.
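
    A sketch of the kinds of comparisons described (paired pre/post t-tests within each condition, an independent t-test between conditions), using SciPy on synthetic scores; all numbers below are invented, not the study's data.

        import numpy as np
        from scipy.stats import ttest_rel, ttest_ind

        rng = np.random.default_rng(0)
        # Hypothetical pre/post test scores for the dome and computer groups.
        pre_dome = rng.normal(50, 10, 100)
        post_dome = pre_dome + rng.normal(10, 5, 100)   # gain in the dome
        pre_pc = rng.normal(50, 10, 100)
        post_pc = pre_pc + rng.normal(9, 5, 100)        # gain on the computer

        # Paired t-tests: did each group improve from pre to post?
        t_dome, p_dome = ttest_rel(post_dome, pre_dome)
        t_pc, p_pc = ttest_rel(post_pc, pre_pc)

        # Independent t-test: did the two groups' gains differ?
        t_diff, p_diff = ttest_ind(post_dome - pre_dome, post_pc - pre_pc)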

  9. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (failure rates of 10^-4 or lower). The coming years will address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The goals are to: A) Combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) Quantify the impact of these methods on software reliability; C) Demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; D) Quantify and justify the reliability estimate for systems developed using various methods.

  10. Comparing perceived self-management practices of adult type 2 diabetic patients after completion of a structured ADA certified diabetes self-management education program with unstructured individualized nurse practitioner led diabetes self-management education.

    PubMed

    Wooley, Dennis S; Kinner, Tracy J

    2016-11-01

    The purpose was to compare perceived self-management practices of adult type 2 diabetic patients after completing an American Diabetes Association (ADA) certified diabetes self-management education (DSME) program with unstructured, individualized nurse practitioner-led DSME. Demographic questions and the Self-Care Inventory-Revised (SCI-R) were given to two convenience-sample patient groups: a formal DSME program group and a group within a clinical setting who received informal, unstructured individual education during patient encounters. A t-test was performed between the formal ADA-certified education sample's and the informal sample's individual SCI-R scores. A second t-test was performed between the two samples' mean SCI-R scores. The t-test determined no statistically significant difference between the formal ADA structured education and informal education samples' individual SCI-R scores. There was also no statistically significant difference between the samples' mean SCI-R scores. The study results suggest that neither DSME setting nor instructional approach is superior. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. A brief history of numbers and statistics with cytometric applications.

    PubMed

    Watson, J V

    2001-02-15

    A brief history of numbers and statistics traces the development of numbers from prehistory to completion of our current system of numeration with the introduction of the decimal fraction by Viète, Stevin, Bürgi, and Galileo at the turn of the 16th century. This was followed by the development of what we now know as probability theory by Pascal, Fermat, and Huygens in the mid-17th century, which arose in connection with questions in gambling with dice and can be regarded as the origin of statistics. The three main probability distributions on which statistics depend were introduced and/or formalized between the mid-17th and early 19th centuries: the binomial distribution by Pascal; the normal distribution by de Moivre, Gauss, and Laplace; and the Poisson distribution by Poisson. The formal discipline of statistics commenced with the works of Pearson, Yule, and Gosset at the turn of the 19th century, when the first statistical tests were introduced. Elementary descriptions of the statistical tests most likely to be used in conjunction with cytometric data are given, and it is shown how these can be applied to the analysis of difficult immunofluorescence distributions when there is overlap between the labeled and unlabeled cell populations. Copyright 2001 Wiley-Liss, Inc.

  12. Nonlinear estimation of parameters in biphasic Arrhenius plots.

    PubMed

    Puterman, M L; Hrboticky, N; Innis, S M

    1988-05-01

    This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation, and compares the analysis to several alternatives. Data are modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of the line segments, defined as the transition temperature, and of the slopes, defined as the energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples using data on the thermotropic behavior of pig brain synaptosomal acetylcholinesterase are given. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows calculation of all line parameters with estimates of precision.
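
    The segmented fit described above (a bent hyperbola in its sharp-corner limit) can be sketched with SciPy's nonlinear least squares. The model, starting values, and synthetic data below are illustrative assumptions, not the authors' code.

        import numpy as np
        from scipy.optimize import curve_fit

        def bent_line(x, x0, y0, b1, b2):
            # Two line segments with slopes b1 and b2 meeting at breakpoint x0;
            # for an Arrhenius plot x = 1/T and each slope is -Ea/R per phase.
            return np.where(x < x0, y0 + b1 * (x - x0), y0 + b2 * (x - x0))

        rng = np.random.default_rng(1)
        inv_T = np.linspace(3.1e-3, 3.6e-3, 30)                  # 1/T, 1/K
        lnk = bent_line(inv_T, 3.35e-3, -2.0, -8000.0, -3000.0)  # synthetic ln(rate)
        lnk += rng.normal(0.0, 0.05, inv_T.size)

        popt, pcov = curve_fit(bent_line, inv_T, lnk,
                               p0=[3.3e-3, -2.0, -7000.0, -2500.0])
        x0, y0, b1, b2 = popt
        se = np.sqrt(np.diag(pcov))    # standard errors of the line parameters
        T_transition = 1.0 / x0        # estimated transition temperature, K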

  13. Landau's statistical mechanics for quasi-particle models

    NASA Astrophysics Data System (ADS)

    Bannur, Vishnu M.

    2014-04-01

    Landau's formalism of statistical mechanics [following L. D. Landau and E. M. Lifshitz, Statistical Physics (Pergamon Press, Oxford, 1980)] is applied to the quasi-particle model of quark-gluon plasma. Here, one starts from the expression for pressure and develops all of thermodynamics. It is a general formalism and consistent with our earlier studies [V. M. Bannur, Phys. Lett. B647, 271 (2007)] based on Pathria's formalism [following R. K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, 1977)]. In Pathria's formalism, one starts from the expression for energy density and develops thermodynamics. Both formalisms are consistent with thermodynamics and statistical mechanics. Under certain conditions, wrongly called the thermodynamic consistency relation, we recover another formalism for quasi-particle systems, like that of M. I. Gorenstein and S. N. Yang, Phys. Rev. D52, 5206 (1995), widely studied in quark-gluon plasma.

  14. Experience and Explanation: Using Videogames to Prepare Students for Formal Instruction in Statistics

    NASA Astrophysics Data System (ADS)

    Arena, Dylan A.; Schwartz, Daniel L.

    2014-08-01

    Well-designed digital games can deliver powerful experiences that are difficult to provide through traditional instruction, while traditional instruction can deliver formal explanations that are not a natural fit for gameplay. Combined, they can accomplish more than either can alone. An experiment tested this claim using the topic of statistics, where people's everyday experiences often conflict with normative statistical theories and a videogame might provide an alternate set of experiences for students to draw upon. The research used a game called Stats Invaders!, a variant of the classic videogame Space Invaders. In Stats Invaders!, the locations of descending alien invaders follow probability distributions, and players need to infer the shape of the distributions to play well. The experiment tested whether the game developed participants' intuitions about the structure of random events and thereby prepared them for future learning from a subsequent written passage on probability distributions. Community-college students who played the game and then read the passage learned more than participants who only read the passage.

  15. A Random Variable Approach to Nuclear Targeting and Survivability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Undem, Halvor A.

    We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
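
    The second problem's result, a distance damage function that is a complementary cumulative lognormal in the range variable, follows directly from a lognormal survival function. A minimal sketch with SciPy; the shape and scale values are placeholders, not values from the report.

        import numpy as np
        from scipy.stats import lognorm

        sigma = 0.3      # lognormal shape: scatter of the target's response
        r50 = 500.0      # range (m) at which the damage probability is 50%

        r = np.linspace(0.0, 1500.0, 7)
        # Survival function sf = 1 - CDF: damage probability falls with range.
        p_damage = lognorm.sf(r, s=sigma, scale=r50)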

  16. Using a Discussion about Scientific Controversy to Teach Central Concepts in Experimental Design

    ERIC Educational Resources Information Center

    Bennett, Kimberley Ann

    2015-01-01

    Students may need explicit training in informal statistical reasoning in order to design experiments or use formal statistical tests effectively. By using scientific scandals and media misinterpretation, we can explore the need for good experimental design in an informal way. This article describes the use of a paper that reviews the measles mumps…

  17. Determining the Number of Component Clusters in the Standard Multivariate Normal Mixture Model Using Model-Selection Criteria.

    DTIC Science & Technology

    1983-06-16

    has been advocated by Gnanadesikan and Wilk (1969), and others in the literature. This suggests that, if we use the formal significance test type... American Statistical Asso., 62, 1159-1178. Gnanadesikan, R., and Wilk, M. B. (1969). Data Analytic Methods in Multivariate Statistical Analysis. In

  18. Testing the statistical compatibility of independent data sets

    NASA Astrophysics Data System (ADS)

    Maltoni, M.; Schwetz, T.

    2003-08-01

    We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ² minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistic is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit test is discussed.
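
    A minimal numeric sketch of the idea, assuming the "parameter goodness-of-fit" construction in which the statistic is the global χ² minimum minus the sum of the individual minima, referred to a χ² distribution whose degrees of freedom come from the parameter counts; all numbers below are invented.

        from scipy.stats import chi2

        chi2_global = 12.4          # minimum of the joint fit to data sets A and B
        chi2_A, chi2_B = 3.1, 4.9   # independent minima of each data set
        n_par_A, n_par_B = 2, 2     # parameters to which each set is sensitive
        n_par_joint = 2             # parameters in the joint fit

        chi2_pg = chi2_global - (chi2_A + chi2_B)
        dof = n_par_A + n_par_B - n_par_joint
        p_compat = chi2.sf(chi2_pg, dof)   # compatibility of the two data sets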

  19. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    PubMed

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is small. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
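
    The minimum p-value procedure can be sketched in a few lines: compute several candidate p-values, take the smallest, and calibrate it against its own permutation distribution. The data and the two candidate tests below are illustrative choices, not the paper's.

        import numpy as np
        from scipy.stats import ttest_ind, mannwhitneyu

        def min_p(y, g):
            # Two candidate tests of "no treatment effect"; keep the smaller p.
            p1 = ttest_ind(y[g == 1], y[g == 0]).pvalue
            p2 = mannwhitneyu(y[g == 1], y[g == 0]).pvalue
            return min(p1, p2)

        rng = np.random.default_rng(42)
        y = rng.normal(size=60)
        y[:30] += 0.6                          # treated group shifted upward
        g = np.array([1] * 30 + [0] * 30)      # treatment labels

        observed = min_p(y, g)
        # Permutation distribution of the minimum p-value under the null.
        perm = np.array([min_p(y, rng.permutation(g)) for _ in range(2000)])
        p_adjusted = np.mean(perm <= observed)  # controls the type I error rate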

  20. The Influence of 16-year-old Students' Gender, Mental Abilities, and Motivation on their Reading and Drawing Submicrorepresentations Achievements

    NASA Astrophysics Data System (ADS)

    Devetak, Iztok; Aleksij Glažar, Saša

    2010-08-01

    Submicrorepresentations (SMRs) are a powerful tool for identifying misconceptions of chemical concepts and for generating proper mental models of chemical phenomena in students' long-term memory during chemical education. The main purpose of the study was to determine which independent variables (gender, formal reasoning abilities, visualization abilities, and intrinsic motivation for learning chemistry) have the maximum influence on students' reading and drawing of SMRs. A total of 386 secondary school students (aged 16.3 years) participated in the study. The instruments used in the study were: a test of Chemical Knowledge, the Test of Logical Thinking, two tests of visualization abilities (Patterns and Rotations), and a questionnaire on Intrinsic Motivation for Learning Science. The results show moderate but statistically significant correlations between students' intrinsic motivation, formal reasoning abilities and chemical knowledge at the submicroscopic level based on reading and drawing SMRs. Visualization abilities are not statistically significantly correlated with students' success on items that comprise reading or drawing SMRs. It can also be concluded that there is a statistically significant difference between male and female students in solving problems that include reading or drawing SMRs. Based on these statistical results and a content analysis of the sample problems, several educational strategies can be implemented to help students develop adequate mental models of chemical concepts on all three levels of representation.

  1. The boundary is mixed

    NASA Astrophysics Data System (ADS)

    Bianchi, Eugenio; Haggard, Hal M.; Rovelli, Carlo

    2017-08-01

    We show that in Oeckl's boundary formalism the boundary vectors that do not have a tensor form represent, in a precise sense, statistical states. Therefore the formalism incorporates quantum statistical mechanics naturally. We formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, suggesting that local gravitational processes are naturally statistical without a sharp quantal versus probabilistic distinction.

  2. Exploring High School Students Beginning Reasoning about Significance Tests with Technology

    ERIC Educational Resources Information Center

    García, Víctor N.; Sánchez, Ernesto

    2017-01-01

    In the present study we analyze how students reason about or make inferences given a particular hypothesis-testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most students' reasoning relies on…

  3. Assessment of long-term impact of formal certified cardiopulmonary resuscitation training program among nurses

    PubMed Central

    Saramma, P. P.; Raj, L. Suja; Dash, P. K.; Sarma, P. S.

    2016-01-01

    Context: Cardiopulmonary resuscitation (CPR) and emergency cardiovascular care guidelines are periodically renewed and published by the American Heart Association. Formal training programs are conducted based on these guidelines. Despite widespread training, CPR is often poorly performed. Hospital educators spend a significant amount of time and money in training health professionals and maintaining basic life support (BLS) and advanced cardiac life support (ACLS) skills among them. However, very little data are available in the literature highlighting the long-term impact of this training. Aims: To evaluate the impact of a formal certified CPR training program on the knowledge and skill of CPR among nurses, and to identify self-reported outcomes of attempted CPR and training needs of nurses. Setting and Design: Tertiary care hospital; prospective, repeated-measures design. Subjects and Methods: A series of certified BLS and ACLS training programs were conducted during 2010 and 2011. Written and practical performance tests were done. Final testing was undertaken 3–4 years after training. The sample included all available, willing CPR-certified nurses and experience-matched CPR-noncertified nurses. Statistical Analysis Used: SPSS for Windows version 21.0. Results: The majority of the 206 nurses (93 CPR certified and 113 noncertified) were females. There was a statistically significant increase in mean knowledge level and overall performance before and after the formal certified CPR training program (P = 0.000). However, the mean knowledge scores were equivalent among the CPR certified and noncertified nurses, although the certified nurses scored a higher mean score (P = 0.140). Conclusions: A formal certified CPR training program increases CPR knowledge and skill. However, significant long-term effects could not be found. There is a need for regular and periodic recertification. PMID:27303137

  4. Introduction of formal debate into a postgraduate specialty track education programme in periodontics in Japan.

    PubMed

    Saito, A; Fujinami, K

    2011-02-01

    To evaluate the formal debate as an active learning strategy within a postgraduate specialty track education programme in periodontics. A formal debate was implemented as an active learning strategy in the programme. The participants were full-time faculty, residents and dentists attending special courses at a teaching hospital in Japan. They were grouped into two evenly matched opposing teams, judges and an audience. As preparation for the debate, the participants attended a lecture on critical thinking. At the time of the debate, each team provided a theme report with a list of references. Performances and contents of the debate were evaluated by the course instructors and audience. Pre- and post-debate testing was used to assess the participants' objective knowledge of clinical periodontology. Evaluation of the debate by the participants revealed that scores for criteria such as presentation performance, response with logic and rebuttal effectiveness were relatively low. Thirty-eight per cent of the participants demonstrated higher test scores after the debate, although there was no statistically significant difference in the mean scores between pre- and post-tests. At the end of the debate, the vast majority of participants recognised the significance and importance of the formal debate in the programme. It was suggested that incorporation of the formal debate could serve as an educational tool for the postgraduate specialty track programme. © 2011 John Wiley & Sons A/S.

  5. Statistical inference for tumor growth inhibition T/C ratio.

    PubMed

    Wu, Jianrong

    2010-09-01

    The tumor growth inhibition T/C ratio is commonly used to quantify treatment effects in drug screening tumor xenograft experiments. The T/C ratio is converted to an antitumor activity rating using an arbitrary cutoff point and often without any formal statistical inference. Here, we applied a nonparametric bootstrap method and a small sample likelihood ratio statistic to make a statistical inference of the T/C ratio, including both hypothesis testing and a confidence interval estimate. Furthermore, sample size and power are also discussed for statistical design of tumor xenograft experiments. Tumor xenograft data from an actual experiment were analyzed to illustrate the application.
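
    A sketch of the nonparametric bootstrap part, assuming the T/C ratio is the ratio of mean tumor volumes and using a simple percentile interval; the volumes are simulated placeholders, not data from the experiment.

        import numpy as np

        rng = np.random.default_rng(7)
        treated = rng.lognormal(mean=5.5, sigma=0.4, size=10)   # tumor volumes
        control = rng.lognormal(mean=6.3, sigma=0.4, size=10)

        def tc_ratio(t, c):
            return np.mean(t) / np.mean(c)

        # Resample each arm with replacement and recompute the ratio.
        boot = np.array([
            tc_ratio(rng.choice(treated, treated.size, replace=True),
                     rng.choice(control, control.size, replace=True))
            for _ in range(5000)
        ])
        lo, hi = np.percentile(boot, [2.5, 97.5])   # 95% percentile interval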

  6. An experiment on the impact of a neonicotinoid pesticide on honeybees: the value of a formal analysis of the data.

    PubMed

    Schick, Robert S; Greenwood, Jeremy J D; Buckland, Stephen T

    2017-01-01

    We assess the analysis of the data resulting from a field experiment conducted by Pilling et al. (PLoS ONE. doi: 10.1371/journal.pone.0077193, 5) on the potential effects of thiamethoxam on honeybees. The experiment had low levels of replication, so Pilling et al. concluded that formal statistical analysis would be misleading. This would be true if such an analysis merely comprised tests of statistical significance and if the investigators concluded that lack of significance meant little or no effect. However, an analysis that includes estimation of the size of any effects, with confidence limits, allows one to reach conclusions that are not misleading and that produce useful insights. For the data of Pilling et al., we use straightforward statistical analysis to show that the confidence limits are generally so wide that any effects of thiamethoxam could have been large without being statistically significant. Instead of formal analysis, Pilling et al. simply inspected the data and concluded that they provided no evidence of detrimental effects and from this that thiamethoxam poses a "low risk" to bees. Conclusions derived from the inspection of the data were not just misleading in this case but also are unacceptable in principle, for if data are inadequate for a formal analysis (or only good enough to provide estimates with wide confidence intervals), then they are bound to be inadequate as a basis for reaching any sound conclusions. Given that the data in this case are largely uninformative with respect to the treatment effect, any conclusions reached from such informal approaches can do little more than reflect the prior beliefs of those involved.

  7. The potential for increased power from combining P-values testing the same hypothesis.

    PubMed

    Ganju, Jitendra; Julie Ma, Guoguang

    2017-02-01

    The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect but we do not know which one is the most powerful. Rather than relying on a single p-value, combining p-values from prespecified multiple test statistics can be used for inference. Combining functions include Fisher's combination test and the minimum p-value. Using randomization-based tests, the increase in power can be remarkable when compared with a single test and Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply to situations when the model includes treatment by covariate interaction.
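
    The two combining functions named above are available in SciPy; a minimal sketch with placeholder p-values. In the randomization-based version the reference distribution of each combined statistic would itself come from permutations rather than the χ² approximation underlying Fisher's method here.

        from scipy.stats import combine_pvalues

        pvals = [0.04, 0.20, 0.11]                   # from prespecified tests
        stat, p_fisher = combine_pvalues(pvals, method="fisher")
        p_min = min(pvals)                           # minimum p-value statistic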

  8. Building Intuitions about Statistical Inference Based on Resampling

    ERIC Educational Resources Information Center

    Watson, Jane; Chance, Beth

    2012-01-01

    Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on…

  9. Confirmatory and Competitive Evaluation of Alternative Gene-Environment Interaction Hypotheses

    ERIC Educational Resources Information Center

    Belsky, Jay; Pluess, Michael; Widaman, Keith F.

    2013-01-01

    Background: Most gene-environment interaction (GXE) research, though based on clear, vulnerability-oriented hypotheses, is carried out using exploratory rather than hypothesis-informed statistical tests, limiting power and making formal evaluation of competing GXE propositions difficult. Method: We present and illustrate a new regression technique…

  10. Blended particle filters for large-dimensional chaotic dynamical systems

    PubMed Central

    Majda, Andrew J.; Qi, Di; Sapsis, Themistoklis P.

    2014-01-01

    A major challenge in contemporary data science is the development of statistically accurate particle filters to capture non-Gaussian features in large-dimensional chaotic dynamical systems. Blended particle filters that capture non-Gaussian features in an adaptively evolving low-dimensional subspace through particles interacting with evolving Gaussian statistics on the remaining portion of phase space are introduced here. These blended particle filters are constructed in this paper through a mathematical formalism involving conditional Gaussian mixtures combined with statistically nonlinear forecast models compatible with this structure developed recently with high skill for uncertainty quantification. Stringent test cases for filtering involving the 40-dimensional Lorenz 96 model with a 5-dimensional adaptive subspace for nonlinear blended filtering in various turbulent regimes with at least nine positive Lyapunov exponents are used here. These cases demonstrate the high skill of the blended particle filter algorithms in capturing both highly non-Gaussian dynamical features as well as crucial nonlinear statistics for accurate filtering in extreme filtering regimes with sparse infrequent high-quality observations. The formalism developed here is also useful for multiscale filtering of turbulent systems and a simple application is sketched below. PMID:24825886

  11. Programs for Children with Specific Learning Disabilities. P.L. 91-230, Title VI-G Formal Final Evaluation. (Statistical Analysis of Data).

    ERIC Educational Resources Information Center

    Murphy, Philip J.

    The paper reports the final evaluation of a program for approximately 143 learning disabled (LD) students (grades 6 to 12) from six school districts. A number of test instruments were used to evaluate student progress during the program, including the Wide Range Achievement Test (WRAT), the Durrell Analysis of Reading Difficulty, and the…

  12. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  13. On Testability of Missing Data Mechanisms in Incomplete Data Sets

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2011-01-01

    This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…

  14. An analysis on intersectional collaboration on non-communicable chronic disease prevention and control in China: a cross-sectional survey on main officials of community health service institutions.

    PubMed

    Li, Xing-Ming; Rasooly, Alon; Peng, Bo; Wang, Jian; Xiong, Shu-Yu

    2017-11-10

    Our study aimed to design a tool for evaluating intersectional collaboration on Non-communicable Chronic Disease (NCD) prevention and control, and further to understand the current status of intersectional collaboration in community health service institutions in China. We surveyed 444 main officials of community health service institutions in the Beijing, Tianjin, Hubei and Ningxia regions of China in 2014 using a questionnaire. A model of collaboration measurement, comprising the four relational dimensions of governance, shared goals and vision, formalization and internalization, was used to compare the scores of the evaluation scale for NCD management procedures across community healthcare institutions and other ones. Reliability and validity of the evaluation tool for inter-organizational collaboration on NCD prevention and control were verified. The test of the tool revealed good reliability and validity (Cronbach's alpha = 0.89, split-half reliability = 0.84, variance contribution rate of the extracted principal component = 49.70%). The results for inter-organizational collaboration across different departments and management segments showed statistically significant differences in the formalization dimension for physical examination (p = 0.01). There were statistically significant differences in the governance dimension, the formalization dimension and the total score of the collaboration scale for the health record sector (p = 0.01, 0.00, 0.00). Statistical differences were found in the formalization dimension for the exercise and nutrition health education segment (p = 0.01). There were no statistically significant differences in the formalization dimension of medication guidance for psychological consultation, medical referral service and rehabilitation guidance (all p > 0.05). A multi-department collaboration mechanism for NCD prevention and control has been rudimentarily established. Community management institutions and general hospitals are more active in participating in community NCD management, with better collaboration scores, whereas the CDC shows relatively poor collaboration in China. Xing-Ming Li and Alon Rasooly contributed equally to this paper and are listed as joint first authors.

  15. Statistical analysis of Turbine Engine Diagnostic (TED) field test data

    NASA Astrophysics Data System (ADS)

    Taylor, Malcolm S.; Monyak, John T.

    1994-11-01

    During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.

  16. An astronomer's guide to period searching

    NASA Astrophysics Data System (ADS)

    Schwarzenberg-Czerny, A.

    2003-03-01

    We concentrate on the analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on the classical statistical principles of Fisher and his successors. Except for the discussion of resolution issues, it is best for the reader to forget temporarily about Fourier transforms and to concentrate on the problem of fitting a time series with a model curve. According to their statistical content we divide the issues into several sections: (i) statistical and numerical aspects of model fitting; (ii) evaluation of fitted models as hypothesis testing; (iii) the role of orthogonal models in signal detection; (iv) conditions for equivalence of periodograms; and (v) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
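
    In the fit-a-model-curve spirit the abstract recommends, here is a toy least-squares period search over unevenly sampled data: fit a sinusoid at each trial frequency and keep the residual sum of squares. Everything below is synthetic and illustrative only.

        import numpy as np

        rng = np.random.default_rng(3)
        t = np.sort(rng.uniform(0, 40, 120))     # uneven sampling times
        y = np.sin(2 * np.pi * t / 3.7) + rng.normal(0, 0.3, t.size)

        freqs = np.linspace(0.05, 1.0, 2000)     # trial frequencies
        rss = np.empty_like(freqs)
        for i, f in enumerate(freqs):
            # Design matrix of the harmonic model at trial frequency f.
            A = np.column_stack([np.ones_like(t),
                                 np.cos(2 * np.pi * f * t),
                                 np.sin(2 * np.pi * f * t)])
            resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
            rss[i] = np.sum(resid ** 2)

        best_period = 1.0 / freqs[np.argmin(rss)]   # recovers roughly 3.7 here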

  17. Education Research: The challenge of incorporating formal research methodology training in a neurology residency.

    PubMed

    Leira, E C; Granner, M A; Torner, J C; Callison, R C; Adams, H P

    2008-05-13

    Physicians often do not have a good understanding of research methodology. Unfortunately, the mechanism to achieve this important competency in a busy neurology residency program remains unclear. We tested the value and degree of acceptance by neurology residents of a multimodal educational intervention that consisted of biweekly teaching sessions in place of an existing journal club, as a way to provide formal training in research and statistical techniques. We used a pre- and post-test design with an educational intervention in between, using neurology residents at the University of Iowa as subjects. Each test had 40 questions on research methodology. The educational intervention consisted of a biweekly, structured, topic-centered, research methodology-oriented elective seminar following a year-long predefined curriculum. An exit survey was offered to gather residents' perceptions of the course. While a majority of residents agreed that the intervention enhanced their knowledge of research methodology, only 23% attended more than 40% of the sessions. There was no difference between pre-test and post-test scores (p = 0.40). Our experience suggests that, in order to accomplish the Accreditation Council for Graduate Medical Education goals regarding increasing residents' competency in knowledge about research methodology, a major restructuring of the neurology residency curriculum with more intense formal training would be necessary.

  18. Small sample estimation of the reliability function for technical products

    NASA Astrophysics Data System (ADS)

    Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.

    2017-12-01

    It is demonstrated that, in the absence of large statistical samples obtained as a result of testing complex technical products for failure, statistical estimation of the reliability function of the initial elements can be made by the moments method. A formal description of the moments method is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for implementing the moments method using only the moments at which the failures of the initial elements occur.

  19. Statistical Irreversible Thermodynamics in the Framework of Zubarev's Nonequilibrium Statistical Operator Method

    NASA Astrophysics Data System (ADS)

    Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.

    2018-01-01

    We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.

  20. Using Informal Inferential Reasoning to Develop Formal Concepts: Analyzing an Activity

    ERIC Educational Resources Information Center

    Weinberg, Aaron; Wiesner, Emilie; Pfaff, Thomas J.

    2010-01-01

    Inferential reasoning is a central component of statistics. Researchers have suggested that students should develop an informal understanding of the ideas that underlie inference before learning the concepts formally. This paper presents a hands-on activity that is designed to help students in an introductory statistics course draw informal…

  1. What Influences Mental Illness? Discrepancies Between Medical Education and Conception.

    PubMed

    Einstein, Evan Hy; Klepacz, Lidia

    2017-01-01

    This preliminary study examined the differences between what was taught during formal medical education and medical students' and psychiatry residents' own conceptions of the causes and determinants of mental illness. The authors surveyed 74 medical students and 11 residents via convenience sampling. The survey contained 18 statements, each rated twice for truthfulness: once in terms of a participant's formal education and once in terms of his or her own conception. Descriptive statistics and a Wilcoxon signed-rank test determined differences between education and conception. Results showed that students were less likely to perceive a neurotransmitter imbalance as causing mental illness, as opposed to what was emphasized during their formal medical education. Students and residents also understood the importance of factors such as systemic racism and socioeconomic status in the development of mental illness, factors that did not receive heavy emphasis during medical education. Furthermore, students and residents believed not only that mental illnesses have nonuniform pathologies, but also that the Diagnostic and Statistical Manual of Mental Disorders has the propensity to sometimes arbitrarily categorize individuals, with potentially negative consequences. If these notions are part of students' and residents' conceptions, as well as documented in the literature, then it seems appropriate for medical education to be further developed to emphasize these ideas.

  2. Robust inference for group sequential trials.

    PubMed

    Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei

    2017-03-01

    For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is a loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time-to-event trials, although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of the results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Learning and understanding the Kruskal-Wallis one-way analysis-of-variance-by-ranks test for differences among three or more independent groups.

    PubMed

    Chan, Y; Walmsley, R P

    1997-12-01

    When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
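
    The choice this update walks through, between the nonparametric H test and the more powerful parametric F test on the same three independent groups, in a minimal SciPy sketch with hypothetical ratio-level outcome data:

        from scipy.stats import kruskal, f_oneway

        treatment_1 = [27, 30, 25, 29, 31, 26]
        treatment_2 = [33, 35, 30, 36, 32, 34]
        treatment_3 = [28, 27, 29, 26, 30, 28]

        h_stat, p_h = kruskal(treatment_1, treatment_2, treatment_3)
        f_stat, p_f = f_oneway(treatment_1, treatment_2, treatment_3)
        # If the F test's assumptions (normality, equal variances) are doubtful,
        # the H test on ranks is the safer choice, at some cost in power.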

  4. Extending local canonical correlation analysis to handle general linear contrasts for FMRI data.

    PubMed

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.

  5. Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data

    PubMed Central

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic. PMID:22461786

  6. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    PubMed

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  7. Assessment of long-term impact of formal certified cardiopulmonary resuscitation training program among nurses.

    PubMed

    Saramma, P P; Raj, L Suja; Dash, P K; Sarma, P S

    2016-04-01

    Cardiopulmonary resuscitation (CPR) and emergency cardiovascular care guidelines are periodically renewed and published by the American Heart Association. Formal training programs are conducted based on these guidelines. Despite widespread training, CPR is often poorly performed. Hospital educators spend a significant amount of time and money in training health professionals and maintaining basic life support (BLS) and advanced cardiac life support (ACLS) skills among them. However, very little data are available in the literature highlighting the long-term impact of this training. To evaluate the impact of a formal certified CPR training program on the knowledge and skill of CPR among nurses, and to identify self-reported outcomes of attempted CPR and training needs of nurses. Tertiary care hospital; prospective, repeated-measures design. A series of certified BLS and ACLS training programs were conducted during 2010 and 2011. Written and practical performance tests were done. Final testing was undertaken 3-4 years after training. The sample included all available, willing CPR-certified nurses and experience-matched CPR-noncertified nurses. SPSS for Windows version 21.0. The majority of the 206 nurses (93 CPR certified and 113 noncertified) were females. There was a statistically significant increase in mean knowledge level and overall performance before and after the formal certified CPR training program (P = 0.000). However, the mean knowledge scores were equivalent among the CPR certified and noncertified nurses, although the certified nurses scored a higher mean score (P = 0.140). A formal certified CPR training program increases CPR knowledge and skill. However, significant long-term effects could not be found. There is a need for regular and periodic recertification.

  8. The Statistical Mechanics of Solar Wind Hydroxylation at the Moon, Within Lunar Magnetic Anomalies, and at Phobos

    NASA Technical Reports Server (NTRS)

    Farrell, W. M.; Hurley, D. M.; Esposito, V. J.; Mclain, J. L.; Zimmerman, M. I.

    2017-01-01

    We present a new formalism to describe the outgassing of hydrogen initially implanted by solar wind protons into exposed soils on airless bodies. The formalism applies a statistical mechanics approach similar to that applied recently to molecular adsorption onto activated surfaces. The key element enabling this formalism is the recognition that the interatomic potential between the implanted H and regolith-residing oxides is not of singular value but possesses a distribution of trapped energy values at a given temperature, F(U,T). All subsequent derivations of the outward diffusion and H retention rely on the specific properties of this distribution. We find that solar wind hydrogen can be retained if there are sites in the implantation layer with activation energy values exceeding 0.5 eV. We examine in particular the dependence of H retention using characteristic energy values found previously for irradiated silica and mature lunar samples. We also apply the formalism to two cases that differ from typical solar wind implantation at the Moon. First, we test a case of implantation in magnetic anomaly regions, where solar wind ions of significantly lower energy are expected to be incident on the surface. In magnetic anomalies, H retention is found to be reduced due to the reduced ion flux and shallower depth of implantation. Second, we apply the model to Phobos, where the surface temperature range is not as extreme as at the Moon. We find that H atom retention in this second case is higher than in the lunar case due to the reduced thermal extremes, which reduce outgassing.
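    The core idea, first-order Arrhenius release integrated over a distribution of trap energies, can be sketched in a few lines. The attempt frequency, the Gaussian form and parameters of F(U), and the temperatures below are all assumptions for illustration, not values from the paper.

```python
import numpy as np

kB = 8.617e-5     # Boltzmann constant, eV/K
nu0 = 1e13        # assumed attempt frequency, s^-1

# Assumed Gaussian stand-in for the trap-energy distribution F(U);
# the paper's F(U, T) is derived, not assumed.
U = np.linspace(0.1, 1.5, 500)
F = np.exp(-0.5 * ((U - 0.7) / 0.15) ** 2)
F /= np.trapz(F, U)

def retained_fraction(T, t):
    """Fraction of implanted H still trapped after time t (s) at temperature T (K),
    assuming first-order Arrhenius release from each energy site."""
    rate = nu0 * np.exp(-U / (kB * T))           # release rate per site, s^-1
    return np.trapz(F * np.exp(-rate * t), U)    # survival probability over F(U)

# Colder surfaces retain H in progressively shallower (lower-U) sites.
for T in (150.0, 250.0, 350.0):
    print(T, retained_fraction(T, t=3.5e5))      # ~4 days of exposure
```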

  9. Formal and Informal Learning and First-Year Psychology Students’ Development of Scientific Thinking: A Two-Wave Panel Study

    PubMed Central

    Soyyılmaz, Demet; Griffin, Laura M.; Martín, Miguel H.; Kucharský, Šimon; Peycheva, Ekaterina D.; Vaupotič, Nina; Edelsbrunner, Peter A.

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students’ development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students’ need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students’ learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students’ scientific thinking. PMID:28239363

  10. Formal and Informal Learning and First-Year Psychology Students' Development of Scientific Thinking: A Two-Wave Panel Study.

    PubMed

    Soyyılmaz, Demet; Griffin, Laura M; Martín, Miguel H; Kucharský, Šimon; Peycheva, Ekaterina D; Vaupotič, Nina; Edelsbrunner, Peter A

    2017-01-01

    Scientific thinking is a predicate for scientific inquiry, and thus important to develop early in psychology students as potential future researchers. The present research is aimed at fathoming the contributions of formal and informal learning experiences to psychology students' development of scientific thinking during their 1st-year of study. We hypothesize that informal experiences are relevant beyond formal experiences. First-year psychology student cohorts from various European countries will be assessed at the beginning and again at the end of the second semester. Assessments of scientific thinking will include scientific reasoning skills, the understanding of basic statistics concepts, and epistemic cognition. Formal learning experiences will include engagement in academic activities which are guided by university authorities. Informal learning experiences will include non-compulsory, self-guided learning experiences. Formal and informal experiences will be assessed with a newly developed survey. As dispositional predictors, students' need for cognition and self-efficacy in psychological science will be assessed. In a structural equation model, students' learning experiences and personal dispositions will be examined as predictors of their development of scientific thinking. Commonalities and differences in predictive weights across universities will be tested. The project is aimed at contributing information for designing university environments to optimize the development of students' scientific thinking.

  11. Video as Supplementary Material. The Impact of an Alternative Development of Study Materials in Solving Formal Economic Problems. Working Papers in Distance Education No. 4.

    ERIC Educational Resources Information Center

    Laaser, W.; And Others

    This study investigated the efficiency of video as an additional teaching aid for a statistics course offered by the Fernuniversitat (Open University, West Germany). A total of 65 distance students and internal students from the Universities of Bochum and Dortmund were divided into five groups to test the effects of five alternative treatments:…

  12. Statistics of Smoothed Cosmic Fields in Perturbation Theory. I. Formulation and Useful Formulae in Second-Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Matsubara, Takahiko

    2003-02-01

    We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for applying perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula for the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including the three-dimensional density field, the three-dimensional velocity field, the two-dimensional projected density field, and so forth. The results are detailed for the second-order theory of the formalism, and the effect of bias is discussed. The statistics of smoothed cosmic fields as functions of the threshold rescaled by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than those plotted against the direct threshold. There is still a slight meatball shift against the rescaled threshold, which is characterized by asymmetry in the depths of the troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.
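    For reference, the linear-theory benchmark that these weakly nonlinear corrections perturb is the standard genus curve of a Gaussian random field (a textbook result quoted here, not a result of this paper):

```latex
% Genus per unit volume of a Gaussian random field at threshold \nu = \delta/\sigma,
% where <k^2> is the variance-weighted mean squared wavenumber of the smoothed field.
G(\nu) = \frac{1}{(2\pi)^2}
         \left( \frac{\langle k^2 \rangle}{3} \right)^{3/2}
         \left( 1 - \nu^2 \right) e^{-\nu^2/2}
```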

  13. Granular statistical mechanics - Building on the legacy of Sir Sam Edwards

    NASA Astrophysics Data System (ADS)

    Blumenfeld, Raphael

    When Sir Sam Edwards laid down the foundations for the statistical mechanics of jammed granular materials, he opened a new field in soft condensed matter and many followed. In this presentation we briefly review the Edwards formalism and some of its less discussed consequences. We point out that the formalism is useful for other classes of systems: cellular and porous materials. A shortcoming of the original formalism is then discussed and a modification to overcome it is proposed. Finally, a derivation of an equation of state with the new formalism is presented; the equation of state is analogous to the PVT relation for thermal gases, relating the volume, the boundary stress and measures of the structural and stress fluctuations.
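    As a reminder of the foundation being discussed (the standard Edwards construction, not the modification proposed in the talk), volume plays the role of energy and compactivity the role of temperature:

```latex
% Edwards ensemble: \Omega(V) counts the jammed configurations of volume V;
% the compactivity X is the temperature-like variable conjugate to volume.
S(V) = \lambda \ln \Omega(V), \qquad
X = \frac{\partial V}{\partial S}
```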

  14. Social and Spill-Over Benefits as Motivating Factors to Investment in Formal Education in Africa: A Reflection around Ghanaian, Kenyan and Rwandan Contexts

    ERIC Educational Resources Information Center

    Ampofo, S. Y.; Bizimana, B.; Ndayambaje, I.; Karongo, V.; Lawrence, K. Lyn; Orodho, J. A.

    2015-01-01

    This study examined the social and spill-over benefits as motivating factors for investment in formal education in selected countries in Africa. The paper had three objectives, namely: i) to profile the key statistics of formal schooling; ii) to examine formal education; and iii) to link national goals of education with expectations in Ghana, Kenya and…

  15. The influence of education on performance of adults on the Clock Drawing Test.

    PubMed

    de Noronha, Ísis Franci Cavalcanti; Barreto, Simone Dos Santos; Ortiz, Karin Zazo

    2018-01-01

    The Clock Drawing Test (CDT) is an important instrument for screening individuals suspected of having cognitive impairment. To determine the influence of education on the performance of healthy adults on the CDT. A total of 121 drawings by healthy adults without neurological complaints or impairments were analyzed. Participants were stratified by educational level into 4 subgroups: 27 illiterate adults, 34 individuals with 1-4 years of formal education, 30 with 5-11 years, and 30 adults with >11 years' formal education. Scores on the CDT were analyzed based on a scale of 1-10 points according to the criteria of Sunderland et al. (1989). The Kruskal-Wallis test was applied to compare the different education groups. Tukey's multiple comparisons test was used when a significant factor was found. Although scores were higher with greater education, statistically significant differences on the CDT were found only between the illiterate and other educated groups. The CDT proved especially difficult for illiterate individuals, who had lower scores. These results suggest that this screening test is suitable for assessing mainly visuoconstructional praxis and providing an overall impression of cognitive function among individuals, independently of years of education.
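    For readers unfamiliar with the test, the comparison across education strata can be reproduced in outline as follows; the scores below are simulated stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

# Simulated CDT scores (1-10 scale) for the four education strata in the paper.
rng = np.random.default_rng(0)
illiterate = rng.integers(3, 8, size=27)
years_1_4 = rng.integers(5, 10, size=34)
years_5_11 = rng.integers(6, 11, size=30)
years_over_11 = rng.integers(6, 11, size=30)

# Omnibus nonparametric comparison of the four groups.
H, p = stats.kruskal(illiterate, years_1_4, years_5_11, years_over_11)
print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.4f}")
```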

  16. The influence of education on performance of adults on the Clock Drawing Test

    PubMed Central

    de Noronha, Ísis Franci Cavalcanti; Barreto, Simone dos Santos; Ortiz, Karin Zazo

    2018-01-01

    The Clock Drawing Test (CDT) is an important instrument for screening individuals suspected of having cognitive impairment. Objective: To determine the influence of education on the performance of healthy adults on the CDT. Methods: A total of 121 drawings by healthy adults without neurological complaints or impairments were analyzed. Participants were stratified by educational level into 4 subgroups: 27 illiterate adults, 34 individuals with 1-4 years of formal education, 30 with 5-11 years, and 30 adults with >11 years' formal education. Scores on the CDT were analyzed based on a scale of 1-10 points according to the criteria of Sunderland et al. (1989). The Kruskal-Wallis test was applied to compare the different education groups. Tukey's multiple comparisons test was used when a significant factor was found. Results: Although scores were higher with greater education, statistically significant differences on the CDT were found only between the illiterate and other educated groups. Conclusion: The CDT proved especially difficult for illiterate individuals, who had lower scores. These results suggest that this screening test is suitable for assessing mainly visuoconstructional praxis and providing an overall impression of cognitive function among individuals, independently of years of education. PMID:29682235

  17. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin-cells are often the test format of choice for laboratories engaged in battery research and development, as they provide a convenient platform for rapid testing of new materials on a small scale. However, obtaining reliable, reproducible data with the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  18. The inner mass power spectrum of galaxies using strong gravitational lensing: beyond linear approximation

    NASA Astrophysics Data System (ADS)

    Chatterjee, Saikat; Koopmans, Léon V. E.

    2018-02-01

    In the last decade, the detection of individual massive dark matter sub-haloes has been possible using the potential correction formalism in strong gravitational lens imaging. Here, we propose a statistical formalism to relate strong gravitational lens surface brightness anomalies to the lens potential fluctuations arising from the dark matter distribution in the lens galaxy. We consider these fluctuations as a Gaussian random field in addition to the unperturbed smooth lens model. This is very similar to the weak lensing formalism, and we show that in this way we can measure the power spectrum of these perturbations to the potential. We test the method by applying it to simulated mock lenses of different geometries and by performing an MCMC analysis of the theoretical power spectra. This method can measure density fluctuations in early-type galaxies on scales of 1-10 kpc at typical rms levels of a per cent, using a single lens system observed with the Hubble Space Telescope at typical signal-to-noise ratios obtained in a single orbit.

  19. Statistical evidence for common ancestry: Application to primates.

    PubMed

    Baum, David A; Ané, Cécile; Larget, Bret; Solís-Lemus, Claudia; Ho, Lam Si Tung; Boone, Peggy; Drummond, Chloe P; Bontrager, Martin; Hunter, Steven J; Saucier, William

    2016-06-01

    Since Darwin, biologists have come to recognize that the theory of descent from common ancestry (CA) is very well supported by diverse lines of evidence. However, while the qualitative evidence is overwhelming, we also need formal methods for quantifying the evidential support for CA over the alternative hypothesis of separate ancestry (SA). In this article, we explore a diversity of statistical methods using data from the primates. We focus on two alternatives to CA, species SA (the separate origin of each named species) and family SA (the separate origin of each family). We implemented statistical tests based on morphological, molecular, and biogeographic data and developed two new methods: one that tests for phylogenetic autocorrelation while correcting for variation due to confounding ecological traits and a method for examining whether fossil taxa have fewer derived differences than living taxa. We overwhelmingly rejected both species and family SA with infinitesimal P values. We compare these results with those from two companion papers, which also found tremendously strong support for the CA of all primates, and discuss future directions and general philosophical issues that pertain to statistical testing of historical hypotheses such as CA. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.

  20. Great Computational Intelligence in the Formal Sciences via Analogical Reasoning

    DTIC Science & Technology

    2017-05-08

    AFRL-AFOSR-VA-TR-2017-0099: final performance report, covering 15 Oct 2011 to 31 Dec 2016, Rensselaer Polytechnic Institute (Selmer Bringsjord). Snippet: computational harnessing of traditional mathematical statistics (as covered, e.g., in Hogg, Craig & McKean 2005) is used to power statistical learning techniques…

  1. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics and model selection to be performed. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples. Copyright © 2010 John Wiley & Sons, Ltd.
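    The bivariate random-effects structure can be sketched in a modern probabilistic-programming language. The snippet below is a PyMC translation of the general idea (the paper itself uses WinBUGS and R), with made-up study counts and simpler priors than the paper's extended model.

```python
import numpy as np
import pymc as pm

# Made-up per-study diagnostic counts: true positives among diseased,
# true negatives among healthy.
tp = np.array([20, 15, 40]); n_dis = np.array([25, 20, 50])
tn = np.array([90, 70, 180]); n_hea = np.array([100, 80, 200])

with pm.Model():
    mu = pm.Normal("mu", 0.0, 2.0, shape=2)    # mean logit (sensitivity, specificity)
    chol, _, _ = pm.LKJCholeskyCov("chol", n=2, eta=2.0,
                                   sd_dist=pm.Exponential.dist(1.0),
                                   compute_corr=True)
    re = pm.MvNormal("re", mu=mu, chol=chol, shape=(len(tp), 2))  # per-study effects
    pm.Binomial("tp_lik", n=n_dis, p=pm.math.invlogit(re[:, 0]), observed=tp)
    pm.Binomial("tn_lik", n=n_hea, p=pm.math.invlogit(re[:, 1]), observed=tn)
    idata = pm.sample(1000, tune=1000)         # posterior for meta-analytic summaries
```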

  2. Formal testing and utilization of streaming media to improve flight crew safety knowledge.

    PubMed

    Bellazzini, Marc A; Rankin, Peter M; Quisling, Jason; Gangnon, Ronald; Kohrs, Mike

    2008-01-01

    Increased concerns over the safety of air medical transport have prompted development of novel ways to increase safety. The objective of our study was to determine whether an Internet streaming media safety video increased crew safety knowledge. Twenty-three of 40 crew members took an online safety pre-test, watched a safety video specific to our program, and completed immediate post-testing and long-term post-testing 6 months later. Mean pre-test, post-test, and 6-month follow-up test scores were 84.9%, 92.3%, and 88.4%, respectively. There was a statistically significant difference in all scores (p

  3. SPSS and SAS procedures for estimating indirect effects in simple mediation models.

    PubMed

    Preacher, Kristopher J; Hayes, Andrew F

    2004-11-01

    Researchers often conduct mediation analysis in order to indirectly assess the effect of a proposed cause on some outcome through a proposed mediator. The utility of mediation analysis stems from its ability to go beyond the merely descriptive to a more functional understanding of the relationships among variables. A necessary component of mediation is a statistically and practically significant indirect effect. Although mediation hypotheses are frequently explored in psychological research, formal significance tests of indirect effects are rarely conducted. After a brief overview of mediation, we argue the importance of directly testing the significance of indirect effects and provide SPSS and SAS macros that facilitate estimation of the indirect effect with a normal theory approach and a bootstrap approach to obtaining confidence intervals, as well as the traditional approach advocated by Baron and Kenny (1986). We hope that this discussion and the macros will enhance the frequency of formal mediation tests in the psychology literature. Electronic copies of these macros may be downloaded from the Psychonomic Society's Web archive at www.psychonomic.org/archive/.
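    A compact illustration of the bootstrap logic behind such macros follows; it mirrors the idea, not the macros' actual code, and the variable names are generic.

```python
import numpy as np

def bootstrap_indirect(x, m, y, n_boot=5000, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b in simple mediation."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)                    # resample cases with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]                   # slope of M ~ X
        design = np.column_stack([np.ones(n), xb, mb])
        b = np.linalg.lstsq(design, yb, rcond=None)[0][2]  # slope of Y ~ M given X
        est[i] = a * b
    lo, hi = np.percentile(est, [2.5, 97.5])
    return est.mean(), (lo, hi)                        # CI excluding 0 suggests mediation
```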

  4. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

    In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.

  5. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) To identify systematically 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers, such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis; and generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
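    As one concrete instance of the 'complex data structure' methods the review highlights, a multilevel model of operation time with surgeon-specific intercepts might look like this; the data are simulated, not the case series described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated multi-operator series: log operation time vs. case number per surgeon.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "surgeon": np.repeat([f"s{i}" for i in range(10)], 50),
    "case": np.tile(np.arange(1, 51), 10),
})
df["log_time"] = 5 - 0.01 * df["case"] + rng.normal(0, 0.2, len(df))

# Multilevel model: a common learning slope with a random intercept per surgeon,
# separating surgeon-specific levels from the shared learning effect.
model = smf.mixedlm("log_time ~ case", df, groups=df["surgeon"])
print(model.fit().summary())
```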

  6. The compartment bag test (CBT) for enumerating fecal indicator bacteria: Basis for design and interpretation of results.

    PubMed

    Gronewold, Andrew D; Sobsey, Mark D; McMahan, Lanakila

    2017-06-01

    For the past several years, the compartment bag test (CBT) has been employed in water quality monitoring and public health protection around the world. To date, however, the statistical basis for the design and recommended procedures for enumerating fecal indicator bacteria (FIB) concentrations from CBT results have not been formally documented. Here, we provide that documentation following protocols for communicating the evolution of similar water quality testing procedures. We begin with an overview of the statistical theory behind the CBT, followed by a description of how that theory was applied to determine an optimal CBT design. We then provide recommendations for interpreting CBT results, including procedures for estimating quantiles of the FIB concentration probability distribution, and the confidence of compliance with recognized water quality guidelines. We synthesize these values in custom user-oriented 'look-up' tables similar to those developed for other FIB water quality testing methods. Modified versions of our tables are currently distributed commercially as part of the CBT testing kit. Published by Elsevier B.V.
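    The statistical core of such a test, maximum-likelihood estimation of concentration from positive/negative compartments under a Poisson assumption, can be sketched as follows. The compartment volumes are assumed for illustration and may not match the commercial kit.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Assumed compartment volumes (mL) summing to a 100 mL sample.
volumes = np.array([1.0, 3.0, 10.0, 30.0, 56.0])

def mpn(positive):
    """Maximum-likelihood FIB concentration (per 100 mL) from a boolean vector
    of compartment outcomes, assuming Poisson-distributed bacteria."""
    def neg_log_lik(logc):
        c = np.exp(logc)
        p_pos = 1.0 - np.exp(-c * volumes / 100.0)     # P(compartment turns positive)
        p = np.where(positive, p_pos, 1.0 - p_pos)
        return -np.sum(np.log(np.clip(p, 1e-300, 1.0)))
    res = minimize_scalar(neg_log_lik, bounds=(-10, 10), method="bounded")
    return np.exp(res.x)

print(mpn(np.array([True, True, True, False, False])))
```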

  7. Longitudinal Effects on Early Adolescent Language: A Twin Study

    PubMed Central

    DeThorne, Laura Segebart; Smith, Jamie Mahurin; Betancourt, Mariana Aparicio; Petrill, Stephen A.

    2016-01-01

    Purpose We evaluated genetic and environmental contributions to individual differences in language skills during early adolescence, measured by both language sampling and standardized tests, and examined the extent to which these genetic and environmental effects are stable across time. Method We used structural equation modeling on latent factors to estimate additive genetic, shared environmental, and nonshared environmental effects on variance in standardized language skills (i.e., Formal Language) and productive language-sample measures (i.e., Productive Language) in a sample of 527 twins across 3 time points (mean ages 10–12 years). Results Individual differences in the Formal Language factor were influenced primarily by genetic factors at each age, whereas individual differences in the Productive Language factor were primarily due to nonshared environmental influences. For the Formal Language factor, the stability of genetic effects was high across all 3 time points. For the Productive Language factor, nonshared environmental effects showed low but statistically significant stability across adjacent time points. Conclusions The etiology of language outcomes may differ substantially depending on assessment context. In addition, the potential mechanisms for nonshared environmental influences on language development warrant further investigation. PMID:27732720
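    For orientation, the classical Falconer decomposition conveys the logic behind twin-based variance estimates, although the study fits structural equation models on latent factors rather than these formulas:

```latex
% Approximate variance components from MZ and DZ twin correlations:
a^2 \approx 2\,(r_{MZ} - r_{DZ}), \qquad
c^2 \approx 2\,r_{DZ} - r_{MZ}, \qquad
e^2 \approx 1 - r_{MZ}
```

    Here a², c² and e² denote the additive genetic, shared environmental and nonshared environmental proportions of variance, respectively.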

  8. Experimental design and statistical methods for improved hit detection in high-throughput screening.

    PubMed

    Malo, Nathalie; Hanley, James A; Carlile, Graeme; Liu, Jing; Pelletier, Jerry; Thomas, David; Nadon, Robert

    2010-09-01

    Identification of active compounds in high-throughput screening (HTS) contexts can be substantially improved by applying classical experimental design and statistical inference principles to all phases of HTS studies. The authors present both experimental and simulated data to illustrate how true-positive rates can be maximized without increasing false-positive rates by the following analytical process. First, the use of robust data preprocessing methods reduces unwanted variation by removing row, column, and plate biases. Second, replicate measurements allow estimation of the magnitude of the remaining random error and the use of formal statistical models to benchmark putative hits relative to what is expected by chance. Receiver Operating Characteristic (ROC) analyses revealed superior power for data preprocessed by a trimmed-mean polish method combined with the RVM t-test, particularly for small- to moderate-sized biological hits.
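    The row- and column-bias removal step can be illustrated with Tukey's median polish; the paper's preferred preprocessing uses a trimmed-mean variant, so treat this as a sketch of the idea rather than the authors' method.

```python
import numpy as np

def median_polish(plate, n_iter=10):
    """Remove row and column biases from a plate of raw HTS signals by
    iteratively sweeping out row and column medians."""
    resid = plate.astype(float)
    overall = 0.0
    for _ in range(n_iter):
        row_med = np.median(resid, axis=1, keepdims=True)
        resid -= row_med
        col_med = np.median(resid, axis=0, keepdims=True)
        resid -= col_med
        overall += float(np.median(row_med) + np.median(col_med))
    return resid + overall            # bias-corrected signals, grand level retained

plate = np.random.default_rng(2).normal(100, 10, size=(16, 24))  # 384-well plate
plate[:, 0] += 30                     # simulate an edge-column artifact
cleaned = median_polish(plate)
```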

  9. Formal functional test designs with a test representation language

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1993-01-01

    The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
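    A toy rendering of the category-partition idea, enumerating choice combinations and then pruning with constraints, might look like this; the categories and the constraint are invented for illustration.

```python
from itertools import product

# Illustrative categories and choices for a command under test;
# the real method also attaches formal constraints to prune invalid frames.
categories = {
    "input_size": ["empty", "single", "many"],
    "permissions": ["read_only", "read_write"],
    "disk_state": ["space_available", "disk_full"],
}

# Every combination of choices is a candidate test frame.
frames = [dict(zip(categories, combo)) for combo in product(*categories.values())]

# A simple constraint: a full disk only matters when the test can write.
frames = [f for f in frames
          if not (f["disk_state"] == "disk_full" and f["permissions"] == "read_only")]
print(len(frames), "test frames")
```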

  10. Master equation theory applied to the redistribution of polarized radiation in the weak radiation field limit. V. The two-term atom

    NASA Astrophysics Data System (ADS)

    Bommier, Véronique

    2017-11-01

    Context. In previous papers of this series, we presented a formalism able to account for both the statistical equilibrium of a multilevel atom and coherent and incoherent scattering (partial redistribution). Aims: This paper provides theoretical expressions of the redistribution function for the two-term atom. This redistribution function includes both coherent (RII) and incoherent (RIII) scattering contributions with their branching ratios. Methods: The expressions were derived by applying the formalism outlined above. The statistical equilibrium equation for the atomic density matrix is first formally solved in the case of the two-term atom with unpolarized and infinitely sharp lower levels. The redistribution function is then derived by substituting this solution into the expression for the emissivity. Results: Expressions are provided for both the magnetic and non-magnetic cases. Atomic fine structure is taken into account, and expressions are provided separately for zero and non-zero hyperfine structure. Conclusions: Redistribution functions are widely used in radiative transfer codes. In our formulation, collisional transitions between Zeeman sublevels within an atomic level (the depolarizing collisions effect) are taken into account when possible (i.e., in the non-magnetic case). However, the need for a formal solution of the statistical equilibrium as a preliminary step prevents us from taking into account collisional transfers between the levels of the upper term. Accounting for these collisional transfers could be done via a numerical solution of the statistical equilibrium equation system.

  11. Influence of Culture on Secondary School Students' Understanding of Statistics: A Fijian Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2014-01-01

    Although we use statistical notions daily in making decisions, research in statistics education has focused mostly on formal statistics. Further, everyday culture may influence informal ideas of statistics. Yet, there appears to be minimal literature that deals with the educational implications of the role of culture. This paper will discuss the…

  12. Not Just a Sum? Identifying Different Types of Interplay between Constituents in Combined Interventions

    PubMed Central

    Van Deun, Katrijn; Thorrez, Lieven; van den Berg, Robert A.; Smilde, Age K.; Van Mechelen, Iven

    2015-01-01

    Motivation Experiments in which the effect of combined manipulations is compared with the effects of their pure constituents have received a great deal of attention. Examples include the study of combination therapies and the comparison of double and single knockout model organisms. Often the effect of the combined manipulation is not a mere addition of the effects of its constituents, with quite different forms of interplay between the constituents being possible. Yet, a well-formalized taxonomy of possible forms of interplay is lacking, let alone a statistical methodology to test for their presence in empirical data. Results Starting from a taxonomy of a broad range of forms of interplay between constituents of a combined manipulation, we propose a sound statistical hypothesis testing framework to test for the presence of each particular form of interplay. We illustrate the framework with analyses of public gene expression data on the combined treatment of dendritic cells with curdlan and GM-CSF and show that these lead to valuable insights into the mode of action of the constituent treatments and their combination. Availability and Implementation R code implementing the statistical testing procedure for microarray gene expression data is available as supplementary material. The data are available from the Gene Expression Omnibus with accession number GSE32986. PMID:25965065

  13. Not Just a Sum? Identifying Different Types of Interplay between Constituents in Combined Interventions.

    PubMed

    Van Deun, Katrijn; Thorrez, Lieven; van den Berg, Robert A; Smilde, Age K; Van Mechelen, Iven

    2015-01-01

    Experiments in which the effect of combined manipulations is compared with the effects of their pure constituents have received a great deal of attention. Examples include the study of combination therapies and the comparison of double and single knockout model organisms. Often the effect of the combined manipulation is not a mere addition of the effects of its constituents, with quite different forms of interplay between the constituents being possible. Yet, a well-formalized taxonomy of possible forms of interplay is lacking, let alone a statistical methodology to test for their presence in empirical data. Starting from a taxonomy of a broad range of forms of interplay between constituents of a combined manipulation, we propose a sound statistical hypothesis testing framework to test for the presence of each particular form of interplay. We illustrate the framework with analyses of public gene expression data on the combined treatment of dendritic cells with curdlan and GM-CSF and show that these lead to valuable insights into the mode of action of the constituent treatments and their combination. R code implementing the statistical testing procedure for microarray gene expression data is available as supplementary material. The data are available from the Gene Expression Omnibus with accession number GSE32986.

  14. Preferences for and Barriers to Formal and Informal Athletic Training Continuing Education Activities

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2011-01-01

    Context: Our previous research determined the frequency of participation in and perceived effect of formal and informal continuing education (CE) activities. However, actual preferences for and barriers to CE must be characterized. Objective: To determine the types of formal and informal CE activities preferred by athletic trainers (ATs) and the barriers to their participation in these activities. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographically stratified random sample of 1000 ATs, 427 ATs (42.7%) completed the survey. Main Outcome Measure(s): As part of a larger study, the Survey of Formal and Informal Athletic Training Continuing Education Activities (FIATCEA) was developed and administered electronically. The FIATCEA consists of demographic characteristics and Likert scale items (1 = strongly disagree, 5 = strongly agree) about preferred CE activities and barriers to these activities. Internal consistency of survey items, as determined by Cronbach α, was 0.638 for preferred CE activities and 0.860 for barriers to these activities. Descriptive statistics were computed for all items. Differences between respondent demographic characteristics and preferred CE activities and barriers to these activities were determined via analysis of variance and dependent t tests. The α level was set at .05. Results: Hands-on clinical workshops and professional networking were the preferred formal and informal CE activities, respectively. The most frequently reported barriers to formal CE were the cost of attending and travel distance, whereas the most frequently reported barriers to informal CE were personal and job-specific factors. Differences were noted between the cost of CE and travel distance to CE and all other barriers to CE participation (F(1,411) = 233.54, P < .001). Conclusions: Overall, ATs preferred formal CE activities. The same barriers (eg, cost, travel distance) to formal CE appeared to be universal to all ATs. Informal CE was highly valued by ATs because it could be individualized. PMID:22488195

  15. Formal and Informal Continuing Education Activities and Athletic Training Professional Practice

    PubMed Central

    Armstrong, Kirk J.; Weidner, Thomas G.

    2010-01-01

    Abstract Context: Continuing education (CE) is intended to promote professional growth and, ultimately, to enhance professional practice. Objective: To determine certified athletic trainers' participation in formal (ie, approved for CE credit) and informal (ie, not approved for CE credit) CE activities and the perceived effect these activities have on professional practice with regard to improving knowledge, clinical skills and abilities, attitudes toward patient care, and patient care itself. Design: Cross-sectional study. Setting: Athletic training practice settings. Patients or Other Participants: Of a geographic, stratified random sample of 1000 athletic trainers, 427 (42.7%) completed the survey. Main Outcome Measure(s): The Survey of Formal and Informal Athletic Training Continuing Education Activities was developed and administered electronically. The survey consisted of demographic characteristics and Likert-scale items regarding CE participation and perceived effect of CE on professional practice. Internal consistency of survey items was determined using the Cronbach α (α  =  0.945). Descriptive statistics were computed for all items. An analysis of variance and dependent t tests were calculated to determine differences among respondents' demographic characteristics and their participation in, and perceived effect of, CE activities. The α level was set at .05. Results: Respondents completed more informal CE activities than formal CE activities. Participation in informal CE activities included reading athletic training journals (75.4%), whereas formal CE activities included attending a Board of Certification–approved workshop, seminar, or professional conference not conducted by the National Athletic Trainers' Association or affiliates or committees (75.6%). Informal CE activities were perceived to improve clinical skills or abilities and attitudes toward patient care. Formal CE activities were perceived to enhance knowledge. Conclusions: More respondents completed informal CE activities than formal CE activities. Both formal and informal CE activities were perceived to enhance athletic training professional practice. Informal CE activities should be explored and considered for CE credit. PMID:20446842

  16. Hunting high and low: disentangling primordial and late-time non-Gaussianity with cosmic densities in spheres

    NASA Astrophysics Data System (ADS)

    Uhlemann, C.; Pajer, E.; Pichon, C.; Nishimichi, T.; Codis, S.; Bernardeau, F.

    2018-03-01

    Non-Gaussianities of dynamical origin are disentangled from primordial ones using the formalism of large deviation statistics with spherical collapse dynamics. This is achieved by relying on accurate analytical predictions for the one-point probability distribution function and the two-point clustering of spherically averaged cosmic densities (sphere bias). Sphere bias extends the idea of halo bias to intermediate density environments and voids as underdense regions. In the presence of primordial non-Gaussianity, sphere bias displays a strong scale dependence relevant for both high- and low-density regions, which is predicted analytically. The statistics of densities in spheres are built to model primordial non-Gaussianity via an initial skewness with a scale dependence that depends on the bispectrum of the underlying model. The analytical formulas with the measured non-linear dark matter variance as input are successfully tested against numerical simulations. For local non-Gaussianity with a range from fNL = -100 to +100, they are found to agree within 2 per cent or better for densities ρ ∈ [0.5, 3] in spheres of radius 15 Mpc h-1 down to z = 0.35. The validity of the large deviation statistics formalism is thereby established for all observationally relevant local-type departures from perfectly Gaussian initial conditions. The corresponding estimators for the amplitude of the non-linear variance σ8 and primordial skewness fNL are validated using a fiducial joint maximum likelihood experiment. The influence of observational effects and the prospects for a future detection of primordial non-Gaussianity from joint one- and two-point densities-in-spheres statistics are discussed.

  17. University of California Conference on Statistical Mechanics (4th) Held March 26-28, 1990

    DTIC Science & Technology

    1990-03-28

    and S. Lago, Chem. Phys., Z, 5750 (1983) Shear Viscosity Calculation via Equilibrium Molecular Dynamics: Einsteinian vs. Green-Kubo Formalism by Adel A...through the application of the Green-Kubo approach. Although the theoretical equivalence between both formalisms was demonstrated by Helfand [3], their...like equations and of different expressions based on the Green-Kubo formalism. In contrast to Hoheisel and Vogelsang's conclusions [2], we find that

  18. Statistical mechanics of few-particle systems: exact results for two useful models

    NASA Astrophysics Data System (ADS)

    Miranda, Enrique N.

    2017-11-01

    The statistical mechanics of small clusters (n ≈ 10-50 elements) of harmonic oscillators and two-level systems is studied exactly, following the microcanonical, canonical and grand canonical formalisms. For clusters with several hundred particles, the results from the three formalisms coincide with those found in the thermodynamic limit. However, for clusters formed by a few tens of elements, the three ensembles yield different results. For a cluster with a few tens of harmonic oscillators, when the heat capacity per oscillator is evaluated within the canonical formalism, it reaches a limit value equal to k_B, as in the thermodynamic case, while within the microcanonical formalism the limit value is k_B(1-1/n). This difference could be measured experimentally. For a cluster with a few tens of two-level systems, the heat capacity evaluated within the canonical and microcanonical ensembles also presents differences that could be detected experimentally. Both the microcanonical and grand canonical formalisms show that the entropy is non-additive for systems this small, while the canonical ensemble reaches the opposite conclusion. These results suggest that the microcanonical ensemble is the most appropriate for dealing with systems with tens of particles.
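    The quoted microcanonical correction can be seen in a few lines for the simplest classical case (a sketch consistent with, but much cruder than, the paper's exact treatment):

```latex
% Classical cluster of n harmonic oscillators.
% Canonical ensemble: equipartition gives E = n k_B T, hence C/n = k_B.
% Microcanonical ensemble: \Omega(E) \propto E^{n-1} gives
\frac{1}{T} = \frac{\partial S}{\partial E} = (n-1)\,\frac{k_B}{E}
\quad\Rightarrow\quad E = (n-1)\,k_B T
\quad\Rightarrow\quad \frac{C}{n} = k_B \left( 1 - \frac{1}{n} \right)
```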

  19. Causal criteria and counterfactuals; nothing more (or less) than scientific common sense.

    PubMed

    Phillips, Carl V; Goodman, Karen J

    2006-05-26

    Two persistent myths in epidemiology are that we can use a list of "causal criteria" to provide an algorithmic approach to inferring causation and that a modern "counterfactual model" can assist in the same endeavor. We argue that these are neither criteria nor a model, but that lists of causal considerations and formalizations of the counterfactual definition of causation are nevertheless useful tools for promoting scientific thinking. They set us on the path to the common sense of scientific inquiry, including testing hypotheses (really putting them to a test, not just calculating simplistic statistics), responding to the Duhem-Quine problem, and avoiding many common errors. Austin Bradford Hill's famous considerations are thus both over-interpreted by those who would use them as criteria and under-appreciated by those who dismiss them as flawed. Similarly, formalizations of counterfactuals are under-appreciated as lessons in basic scientific thinking. The need for lessons in scientific common sense is great in epidemiology, which is taught largely as an engineering discipline and practiced largely as technical tasks, making attention to core principles of scientific inquiry woefully rare.

  20. Analyzing Seasonal Variations in Suicide With Fourier Poisson Time-Series Regression: A Registry-Based Study From Norway, 1969-2007.

    PubMed

    Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo

    2015-08-01

    Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one, and there was a reduction in seasonality during the period. The model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were both superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
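    In outline, such a model is a Poisson regression on Fourier harmonics whose amplitudes may change with time. The snippet below fits a constant-seasonality model and a linearly changing-seasonality model to simulated monthly counts and compares AIC, echoing the paper's strategy; the registry data themselves are not public here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated monthly counts with a seasonal amplitude that shrinks over time.
rng = np.random.default_rng(3)
t = np.arange(468)                                    # months, 1969-2007
lam = np.exp(3.0 + (0.15 - 0.0002 * t) * np.cos(2 * np.pi * t / 12))
df = pd.DataFrame({"count": rng.poisson(lam), "t": t})
df["c12"] = np.cos(2 * np.pi * t / 12)                # first Fourier harmonic
df["s12"] = np.sin(2 * np.pi * t / 12)

constant = smf.glm("count ~ t + c12 + s12", df,
                   family=sm.families.Poisson()).fit()
declining = smf.glm("count ~ t + c12 + s12 + c12:t + s12:t", df,
                    family=sm.families.Poisson()).fit()

# A lower AIC for the interaction model indicates seasonality changing over time.
print(constant.aic, declining.aic)
```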

  1. Velocity bias in the distribution of dark matter halos

    NASA Astrophysics Data System (ADS)

    Baldauf, Tobias; Desjacques, Vincent; Seljak, Uroš

    2015-12-01

    The standard formalism for the coevolution of halos and dark matter predicts that any initial halo velocity bias rapidly decays to zero. We argue that, when the purpose is to compute statistics like power spectra etc., the coupling in the momentum conservation equation for the biased tracers must be modified. Our new formulation predicts the constancy in time of any statistical halo velocity bias present in the initial conditions, in agreement with peak theory. We test this prediction by studying the evolution of a conserved halo population in N -body simulations. We establish that the initial simulated halo density and velocity statistics show distinct features of the peak model and, thus, deviate from the simple local Lagrangian bias. We demonstrate, for the first time, that the time evolution of their velocity is in tension with the rapid decay expected in the standard approach.

  2. Lossy chaotic electromagnetic reverberation chambers: Universal statistical behavior of the vectorial field

    NASA Astrophysics Data System (ADS)

    Gros, J.-B.; Kuhl, U.; Legrand, O.; Mortessagne, F.

    2016-03-01

    The effective Hamiltonian formalism is extended to vectorial electromagnetic waves in order to describe statistical properties of the field in reverberation chambers. The latter are commonly used in electromagnetic compatibility tests. As a first step, the distribution of wave intensities in chaotic systems with varying opening in the weak coupling limit for scalar quantum waves is derived by means of random matrix theory. In this limit the only parameters are the modal overlap and the number of open channels. Using the extended effective Hamiltonian, we describe the intensity statistics of the vectorial electromagnetic eigenmodes of lossy reverberation chambers. Finally, the typical quantity of interest in such chambers, namely, the distribution of the electromagnetic response, is discussed. By determining the distribution of the phase rigidity, describing the coupling to the environment, using random matrix numerical data, we find good agreement between the theoretical prediction and numerical calculations of the response.

  3. Statistics of Macroturbulence from Flow Equations

    NASA Astrophysics Data System (ADS)

    Marston, Brad; Iadecola, Thomas; Qi, Wanming

    2012-02-01

    Probability distribution functions of stochastically-driven and frictionally-damped fluids are governed by a linear framework that resembles quantum many-body theory. Besides the Fokker-Planck approach, there is a closely related Hopf functional method [Ookie Ma and J. B. Marston, J. Stat. Phys. Th. Exp. P10007 (2005)]; in both formalisms, zero modes of linear operators describe the stationary non-equilibrium statistics. To access the statistics, we generalize the flow equation approach [F. Wegner, Ann. Phys. 3, 77 (1994)] (also known as the method of continuous unitary transformations [S. D. Glazek and K. G. Wilson, Phys. Rev. D 48, 5863 (1993); Phys. Rev. D 49, 4214 (1994)]) to find the zero mode. We test the approach using a prototypical model of geophysical and astrophysical flows on a rotating sphere that spontaneously organizes into a coherent jet. Good agreement is found with low-order equal-time statistics accumulated by direct numerical simulation, the traditional method. Different choices for the generators of the continuous transformations, and for closure approximations of the operator algebra, are discussed.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graesser, Jordan B; Cheriyadat, Anil M; Vatsavai, Raju

    The high rate of global urbanization has resulted in a rapid increase in informal settlements, which can be defined as unplanned, unauthorized, and/or unstructured housing. Techniques for efficiently mapping these settlement boundaries can benefit various decision making bodies. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other types of structures (e.g., industrial, commercial, and formal residential). These spatial characteristics are often captured in high spatial resolution satellite imagery. We analyzed the role of spatial, structural, and contextual features (e.g., GLCM, Histogram of Oriented Gradients, Line Support Regions, Lacunarity) for urban neighborhood mapping, and computed several low-level image features at multiple scales to characterize local neighborhoods. The decision parameters to classify formal-, informal-, and non-settlement classes were learned under Decision Trees and a supervised classification framework. Experiments were conducted on high-resolution satellite imagery from the CitySphere collection, and four different cities (i.e., Caracas, Kabul, Kandahar, and La Paz) with varying spatial characteristics were represented. Overall accuracy ranged from 85% in La Paz, Bolivia, to 92% in Kandahar, Afghanistan. While the disparities between formal and informal neighborhoods varied greatly, many of the image statistics tested proved robust.
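    A stripped-down version of this pipeline, texture features per image tile feeding a decision tree, could look like the following; the tile data and labels are randomly generated placeholders, and GLCM is only one of the feature families the study combines.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.tree import DecisionTreeClassifier

def glcm_features(tile):
    """Gray-level co-occurrence texture features for one image tile."""
    glcm = graycomatrix(tile, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")]

# Placeholder labeled tiles: 0 = non-settlement, 1 = formal, 2 = informal.
rng = np.random.default_rng(4)
tiles = rng.integers(0, 256, size=(60, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 3, size=60)

X = np.array([glcm_features(t) for t in tiles])
clf = DecisionTreeClassifier(max_depth=5).fit(X, labels)  # supervised classifier
```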

  5. E-assessment of prior learning: a pilot study of interactive assessment of staff with no formal education who are working in Swedish elderly care

    PubMed Central

    2014-01-01

    Background The current paper presents a pilot study of interactive assessment using information and communication technology (ICT) to evaluate the knowledge, skills and abilities of staff with no formal education who are working in Swedish elderly care. Methods Theoretical and practical assessment methods were developed and used with simulated patients and computer-based tests to identify strengths and areas for personal development among staff with no formal education. Results Of the 157 staff with no formal education, 87 began the practical and/or theoretical assessments, and 63 completed both assessments. Several of the staff passed the practical assessments, except the morning hygiene assessment, where several failed. Other areas for staff development, i.e. where several failed (>50%), were the theoretical assessment of the learning objectives: Health, Oral care, Ergonomics, hygiene, esthetic, environmental, Rehabilitation, Assistive technology, Basic healthcare and Laws and organization. None of the staff passed all assessments. Number of years working in elderly care and staff age were not statistically significantly related to the total score of grades on the various learning objectives. Conclusion The interactive assessments were useful in assessing staff members’ practical and theoretical knowledge, skills, and abilities and in identifying areas in need of development. It is important that personnel who lack formal qualifications be clearly identified and given a chance to develop their competence through training, both theoretical and practical. The interactive e-assessment approach analyzed in the present pilot study could serve as a starting point. PMID:24742168

  6. Investigation of pore size and energy distributions by statistical physics formalism applied to agriculture products

    NASA Astrophysics Data System (ADS)

    Aouaini, Fatma; Knani, Salah; Yahia, Manel Ben; Bahloul, Neila; Ben Lamine, Abdelmottaleb; Kechaou, Nabil

    2015-12-01

    In this paper, we present a new investigation that allows determining the pore size distribution (PSD) in a porous medium. The PSD is obtained from the desorption isotherms of four varieties of olive leaves, by means of a statistical physics formalism and Kelvin's law. The results are compared with those obtained by scanning electron microscopy. The effect of temperature on the pore distribution function has been studied, and the influence of each parameter on the PSD is interpreted. A similar function, the adsorption energy distribution (AED), is deduced from the PSD.
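
    For orientation, Kelvin's law maps a relative pressure (water activity) to the radius of the pores that empty at that pressure; differentiating the measured desorption isotherm along this mapping yields the PSD. A hedged numerical sketch with standard constants for water (the activity grid is illustrative):

        import numpy as np

        # Kelvin's law:  r = -2 * gamma * V_m / (R * T * ln a),  a = p/p0.
        gamma = 0.072        # surface tension of water, N/m
        V_m = 1.8e-5         # molar volume of liquid water, m^3/mol
        R = 8.314            # gas constant, J/(mol K)

        def kelvin_radius(activity, T=298.15):
            """Radius (m) of pores emptying at a given water activity."""
            return -2.0 * gamma * V_m / (R * T * np.log(activity))

        a = np.linspace(0.1, 0.95, 9)
        for ai, r in zip(a, kelvin_radius(a)):
            print(f"a = {ai:.2f}  ->  pore radius = {r * 1e9:.2f} nm")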

  7. An analysis of the cognitive deficit of schizophrenia based on the Piaget developmental theory.

    PubMed

    Torres, Alejandro; Olivares, Jose M; Rodriguez, Angel; Vaamonde, Antonio; Berrios, German E

    2007-01-01

    The objective of the study was to evaluate from the perspective of the Piaget developmental model the cognitive functioning of a sample of patients diagnosed with schizophrenia. Fifty patients with schizophrenia (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition) and 40 healthy matched controls were evaluated by means of the Longeot Logical Thought Evaluation Scale. Only 6% of the subjects with schizophrenia reached the "formal period," and 70% remained at the "concrete operations" stage. The corresponding figures for the control sample were 25% and 15%, respectively. These differences were statistically significant. The samples were specifically differentiable on the permutation, probabilities, and pendulum tests of the scale. The Longeot Logical Thought Evaluation Scale can discriminate between subjects with schizophrenia and healthy controls.

  8. A formal framework of scenario creation and analysis of extreme hydrological events

    NASA Astrophysics Data System (ADS)

    Lohmann, D.

    2007-12-01

    We present a formal framework for hydrological risk analysis. Different measures of risk are introduced, such as average annual loss and occurrence exceedance probability; these are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation while also having correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
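
    The two risk measures can be computed directly from a simulated event-loss table. A hedged sketch with a synthetic loss table standing in for the 100,000-year scenario set (the Poisson event counts and lognormal losses are illustrative assumptions, not the RMS model):

        import numpy as np

        # Average annual loss (AAL) and occurrence exceedance probability
        # (OEP) from a synthetic 100,000-year event-loss table.
        rng = np.random.default_rng(1)
        n_years = 100_000
        events_per_year = rng.poisson(0.2, size=n_years)   # flood counts
        annual_max = np.zeros(n_years)
        aal = 0.0
        for year, k in enumerate(events_per_year):
            if k:
                losses = rng.lognormal(mean=12.0, sigma=1.5, size=k)
                aal += losses.sum()
                annual_max[year] = losses.max()
        aal /= n_years

        threshold = 5e6
        oep = np.mean(annual_max > threshold)  # P(largest yearly event > threshold)
        print(f"AAL = {aal:,.0f}   OEP(5M) = {oep:.4%}")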

  9. Strategies Used by Students to Compare Two Data Sets

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2012-01-01

    One of the common tasks of inferential statistics is to compare two data sets. Long before formal statistical procedures, however, students can be encouraged to make comparisons between data sets and therefore build up intuitive statistical reasoning. Such tasks also give meaning to the data collection students may do. This study describes the…

  10. Combining statistical inference and decisions in ecology

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.

    2016-01-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
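
    The role of the loss function is easy to demonstrate for Bayesian point estimation: the optimal action minimizes posterior expected loss, so squared-error loss selects the posterior mean and absolute-error loss the posterior median. A minimal sketch with synthetic posterior draws standing in for any real analysis:

        import numpy as np

        # Bayesian point estimation as a decision problem: minimize the
        # posterior expected loss over candidate estimates.
        rng = np.random.default_rng(2)
        posterior = rng.gamma(shape=3.0, scale=2.0, size=50_000)

        grid = np.linspace(posterior.min(), posterior.max(), 400)
        sq_loss = [np.mean((posterior - a) ** 2) for a in grid]
        abs_loss = [np.mean(np.abs(posterior - a)) for a in grid]

        print("argmin squared loss :", grid[np.argmin(sq_loss)],
              " posterior mean  :", posterior.mean())
        print("argmin absolute loss:", grid[np.argmin(abs_loss)],
              " posterior median:", np.median(posterior))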

  11. For a statistical interpretation of Helmholtz' thermal displacement

    NASA Astrophysics Data System (ADS)

    Podio-Guidugli, Paolo

    2016-11-01

    Starting from the classic papers by Einstein and Langevin on Brownian motion, two consistent statistical interpretations are given for the thermal displacement, a scalar field formally introduced by Helmholtz, whose time derivative is by definition the absolute temperature.

  12. What constitutes a good hand-off in the emergency department: a patient's perspective.

    PubMed

    Downey, La Vonne; Zun, Leslie; Burke, Trena

    2013-01-01

    The aim is to determine, from the patient's perspective, what constitutes a good hand-off procedure in the emergency department (ED). The secondary purpose is to evaluate what impact a formalized hand-off had on patient knowledge, throughput and customer service. This study used a randomized controlled clinical trial involving two unique hand-off approaches and a convenience sample. The study alternated between the current hand-off process, which documented the process but not specific elements (referred to as the informal process), and one using the I PASS the BATON process (considered the formal process). Consenting patients completed a 12-question validated questionnaire on how the process was perceived by patients and on their understanding of why they waited in the ED. Statistical analysis using SPSS calculated descriptive frequencies and t-tests. In total 107 patients were enrolled: 50 in the informal and 57 in the formal group. Most patients had positive answers to the customer survey. There were significant differences between the formal and informal groups: recalling the oncoming and outgoing physician coming to the patient's bed (p = 0.000), with more formal-group patients recalling this than informal-group patients; the oncoming physician introducing him/herself (p = 0.01), with more from the formal group answering yes; and the physician discussing tests and implications with formal-group patients (p = 0.02). This study was done at an urban inner-city ED, a fact that may have skewed its results. A comparison with suburban and rural EDs would make the results stronger. The study also reflected a very high level of customer satisfaction within the ED. This lack of variance may have meant that the correlation between customer service and hand-offs was missed or underrepresented. There was no codified observation of either those using the I PASS the BATON script or those using informal procedures, so no comparison of the level and types of information given between the two groups was done. There could also have been bias from attending physicians who had internalized the I PASS the BATON procedures and used them even when assigned to the informal group. A hand-off from one physician to the next in the emergency department is best done using a formalized process. I PASS the BATON is a useful tool for hand-offs in the ED, in part because it involves the patient in the process. The formal hand-off increased communication between patient and doctor, as its use increased the patient's opportunity to ask and respond to questions. The researchers evaluated an ED physician-specific hand-off process and illustrate the value and impact of involving patients in the hand-off process.

  13. On the statistical distribution in a deformed solid

    NASA Astrophysics Data System (ADS)

    Gorobei, N. N.; Luk'yanenko, A. S.

    2017-09-01

    A modification of the Gibbs distribution in a thermally insulated, mechanically deformed solid is proposed, in which the linear dimensions (shape parameters) are excluded from statistical averaging and included among the macroscopic parameters of state alongside the temperature. Formally, this modification reduces to corresponding additional conditions when calculating the statistical sum. The shape parameters and the temperature themselves are found from the conditions of mechanical and thermal equilibrium of a body, and their change is determined using the first law of thermodynamics. Known thermodynamic phenomena are analyzed for a simple model of a solid, an ensemble of anharmonic oscillators, within the proposed formalism to first order in the anharmonicity constant. The distribution modification is considered separately for the classical and quantum temperature regions.

  14. Gauging Skills of Hospital Security Personnel: a Statistically-driven, Questionnaire-based Approach.

    PubMed

    Rinkoo, Arvind Vashishta; Mishra, Shubhra; Rahesuddin; Nabi, Tauqeer; Chandra, Vidha; Chandra, Hem

    2013-01-01

    This study aims to gauge the technical and soft skills of the hospital security personnel so as to enable prioritization of their training needs. A cross-sectional questionnaire-based study was conducted in December 2011. Two separate predesigned and pretested questionnaires were used for gauging soft skills and technical skills of the security personnel. Extensive statistical analysis, including Multivariate Analysis (Pillai-Bartlett trace along with Multi-factorial ANOVA) and Post-hoc Tests (Bonferroni Test), was applied. The 143 participants performed better on the soft skills front, with an average score of 6.43 and standard deviation of 1.40. The average technical skills score was 5.09 with a standard deviation of 1.44. The study revealed a need for formal hands-on training with greater emphasis on technical skills. Multivariate analysis of the available data further helped in identifying 20 security personnel who should be prioritized for soft skills training and a group of 36 security personnel who should receive maximum attention during technical skills training. This statistically driven approach can be used as a prototype by healthcare delivery institutions worldwide, after situation-specific customizations, to identify the training needs of any category of healthcare staff.

  15. Gauging Skills of Hospital Security Personnel: a Statistically-driven, Questionnaire-based Approach

    PubMed Central

    Rinkoo, Arvind Vashishta; Mishra, Shubhra; Rahesuddin; Nabi, Tauqeer; Chandra, Vidha; Chandra, Hem

    2013-01-01

    Objectives This study aims to gauge the technical and soft skills of the hospital security personnel so as to enable prioritization of their training needs. Methodology A cross-sectional questionnaire-based study was conducted in December 2011. Two separate predesigned and pretested questionnaires were used for gauging soft skills and technical skills of the security personnel. Extensive statistical analysis, including Multivariate Analysis (Pillai-Bartlett trace along with Multi-factorial ANOVA) and Post-hoc Tests (Bonferroni Test), was applied. Results The 143 participants performed better on the soft skills front, with an average score of 6.43 and standard deviation of 1.40. The average technical skills score was 5.09 with a standard deviation of 1.44. The study revealed a need for formal hands-on training with greater emphasis on technical skills. Multivariate analysis of the available data further helped in identifying 20 security personnel who should be prioritized for soft skills training and a group of 36 security personnel who should receive maximum attention during technical skills training. Conclusion This statistically driven approach can be used as a prototype by healthcare delivery institutions worldwide, after situation-specific customizations, to identify the training needs of any category of healthcare staff. PMID:23559904
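
    For readers who want to reproduce the style of analysis, the sketch below runs a MANOVA and reports Pillai's trace via statsmodels; the grouping factor and the synthetic scores (drawn to match the reported means and standard deviations) are assumptions, not the study's data:

        import numpy as np
        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        # Two skill scores compared across (hypothetical) personnel groups.
        rng = np.random.default_rng(3)
        n = 143
        df = pd.DataFrame({
            "group": rng.choice(["ward", "gate", "control_room"], size=n),
            "soft": rng.normal(6.43, 1.40, size=n),
            "technical": rng.normal(5.09, 1.44, size=n),
        })

        fit = MANOVA.from_formula("soft + technical ~ group", data=df)
        print(fit.mv_test())   # includes Pillai's trace, Wilks' lambda, etc.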

  16. Stata companion.

    PubMed

    Brennan, Jennifer Sousa

    2010-01-01

    This chapter is an introductory reference guide highlighting some of the most common statistical topics, broken down into both command-line syntax and graphical interface point-and-click commands. This chapter serves to supplement more formal statistics lessons and expedite using Stata to compute basic analyses.

  17. Estimating breeding proportions and testing hypotheses about costs of reproduction with capture-recapture data

    USGS Publications Warehouse

    Nichols, James D.; Hines, James E.; Pollock, Kenneth H.; Hinz, Robert L.; Link, William A.

    1994-01-01

    The proportion of animals in a population that breeds is an important determinant of population growth rate. Usual estimates of this quantity from field sampling data assume that the probability of appearing in the capture or count statistic is the same for animals that do and do not breed. A similar assumption is required by most existing methods used to test ecologically interesting hypotheses about reproductive costs using field sampling data. However, in many field sampling situations breeding and nonbreeding animals are likely to exhibit different probabilities of being seen or caught. In this paper, we propose the use of multistate capture-recapture models for these estimation and testing problems. This methodology permits a formal test of the hypothesis of equal capture/sighting probabilities for breeding and nonbreeding individuals. Two estimators of breeding proportion (and associated standard errors) are presented, one for the case of equal capture probabilities and one for the case of unequal capture probabilities. The multistate modeling framework also yields formal tests of hypotheses about reproductive costs to future reproduction or survival or both fitness components. The general methodology is illustrated using capture-recapture data on female meadow voles, Microtus pennsylvanicus. Resulting estimates of the proportion of reproductively active females showed strong seasonal variation, as expected, with low breeding proportions in midwinter. We found no evidence of reproductive costs extracted in subsequent survival or reproduction. We believe that this methodological framework has wide application to problems in animal ecology concerning breeding proportions and phenotypic reproductive costs.

  18. Exact Solution of the Two-Level System and the Einstein Solid in the Microcanonical Formalism

    ERIC Educational Resources Information Center

    Bertoldi, Dalia S.; Bringa, Eduardo M.; Miranda, E. N.

    2011-01-01

    The two-level system and the Einstein model of a crystalline solid are taught in every course of statistical mechanics and they are solved in the microcanonical formalism because the number of accessible microstates can be easily evaluated. However, their solutions are usually presented using the Stirling approximation to deal with factorials. In…

  19. A Comparative Study of Pre-Service Education for Preschool Teachers in China and the United States

    ERIC Educational Resources Information Center

    Gong, Xin; Wang, Pengcheng

    2017-01-01

    This study provides a comparative analysis of the pre-service education system for preschool educators in China and the United States. Based on collected data and materials (literature, policy documents, and statistical data), we compare two areas of pre-service training: (1) the formal system; (2) the informal system. In the formal system, most…

  20. The frequency of dyscalculia among primary school children.

    PubMed

    Jovanović, Gordana; Jovanović, Zoran; Banković-Gajić, Jelena; Nikolić, Anđelka; Svetozarević, Srđana; Ignjatović-Ristić, Dragana

    2013-06-01

    Formal education, daily living activities and jobs require knowledge of counting and simple mathematical operations and the skills to apply them. Problems with mathematics start in primary school and persist into adulthood. This is known as dyscalculia, and its prevalence in the school population ranges from 3 to 6.5%. The study included 1424 third-grade students (aged 9-10) of all primary schools in the City of Kragujevac, Serbia. Tests in mathematics were given in order to determine their mathematical achievement. 1078 students (538 boys and 540 girls) completed all five tests. The frequency of dyscalculia in the sample was 9.9%. The difference between boys and girls according to the total score on the test was statistically significant (p<0.005). The difference between students according to their school achievement (excellent, very good, good, sufficient and insufficient) was statistically significant for all tests (p<0.0005). The influence of place of residence/school was significant for all tests (p<0.0005). Independent prognostic variables associated with dyscalculia are marks in mathematics and Serbian language. The frequency of dyscalculia of 9.9% in the sample is higher than in other similar studies. Further research should identify possible causes of this frequency of dyscalculia in order to improve students' mathematical abilities.

  1. Comparing Two Tests of Formal Reasoning in a College Chemistry Context

    ERIC Educational Resources Information Center

    Jiang, Bo; Xu, Xiaoying; Garcia, Alicia; Lewis, Jennifer E.

    2010-01-01

    The Test of Logical Thinking (TOLT) and the Group Assessment of Logical Thinking (GALT) are two of the instruments most widely used by science educators and researchers to measure students' formal reasoning abilities. Based on Piaget's cognitive development theory, formal thinking ability has been shown to be essential for student achievement in…

  2. Adolescent Egocentrism and Formal Operations: Tests of a Theoretical Assumption.

    ERIC Educational Resources Information Center

    Lapsley, David K.; And Others

    1986-01-01

    Describes two studies of the theoretical relation between adolescent egocentrism and formal operations. Study 1 used the Adolescent Egocentrism Scale (AES) and Lunzer's battery of formal reasoning tasks to assess 183 adolescents. Study 2 administered the AES, the Imaginary Audience Scale (IAS), and the Test of Logical Thinking to 138 adolescents.…

  3. A model of the human observer and decision maker

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1981-01-01

    The decision process is described in terms of classical sequential decision theory by considering the hypothesis that an abnormal condition has occurred by means of a generalized likelihood ratio test. For this, a sufficient statistic is provided by the innovation sequence, which is the result of the perception and information processing submodel of the human observer. On the basis of only two model parameters, the model predicts the decision speed/accuracy trade-off and various attentional characteristics. A preliminary test of the model for single-variable failure detection tasks resulted in a very good fit to the experimental data. In a formal validation program, a variety of multivariable failure detection tasks was investigated and the predictive capability of the model was demonstrated.
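
    The detection step can be illustrated with a generalized likelihood ratio (GLR) test for a mean jump in a white innovation sequence; the window, threshold, and simulated failure below are illustrative assumptions, not the paper's observer model:

        import numpy as np

        # Under normal conditions the innovations are zero-mean white noise;
        # a failure shifts their mean. Declare a failure when the GLR
        # statistic for an unknown constant bias exceeds a threshold.
        rng = np.random.default_rng(4)
        sigma, t_fail = 1.0, 300
        nu = rng.normal(0.0, sigma, size=500)
        nu[t_fail:] += 0.8                     # simulated abnormal condition

        window, threshold = 40, 10.0
        for t in range(window, len(nu)):
            seg = nu[t - window:t]
            glr = seg.sum() ** 2 / (2 * window * sigma ** 2)
            if glr > threshold:
                print(f"failure declared at t = {t} (true onset {t_fail})")
                break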

  4. Variation in Lithic Technological Strategies among the Neanderthals of Gibraltar

    PubMed Central

    Shipton, Ceri; Clarkson, Christopher; Bernal, Marco Antonio; Boivin, Nicole; Finlayson, Clive; Finlayson, Geraldine; Fa, Darren; Pacheco, Francisco Giles; Petraglia, Michael

    2013-01-01

    The evidence for Neanderthal lithic technology is reviewed and summarized for four caves on The Rock of Gibraltar: Vanguard, Beefsteak, Ibex and Gorham’s. Some of the observed patterns in technology are statistically tested including raw material selection, platform preparation, and the use of formal and expedient technological schemas. The main parameters of technological variation are examined through detailed analysis of the Gibraltar cores and comparison with samples from the classic Mousterian sites of Le Moustier and Tabun C. The Gibraltar Mousterian, including the youngest assemblage from Layer IV of Gorham’s Cave, spans the typical Middle Palaeolithic range of variation from radial Levallois to unidirectional and multi-platform flaking schemas, with characteristic emphasis on the former. A diachronic pattern of change in the Gorham’s Cave sequence is documented, with the younger assemblages utilising more localized raw material and less formal flaking procedures. We attribute this change to a reduction in residential mobility as the climate deteriorated during Marine Isotope Stage 3 and the Neanderthal population contracted into a refugium. PMID:23762312

  5. Turkish university students' knowledge of biotechnology and attitudes toward biotechnological applications.

    PubMed

    Öztürk-Akar, Ebru

    2017-03-04

    This study questions the presumed relation between formal schooling and scientific literacy about biotechnologies. Comparing science and nonscience majors' knowledge of and attitudes toward biotechnological applications, conclusions are drawn as to whether formal learning improves students' understanding of and attitudes toward biotechnology applications. The sample consists of 403 undergraduate and graduate students: 198 nonscience and 205 science majors. The Biotechnology Knowledge Questionnaire and the Biotechnology Attitude Questionnaire were administered. Descriptive statistics (means and percentages), t tests, and correlations were used to examine the participants' knowledge of biotechnology and attitudes toward biotechnological applications, and differences as regards their majors. Although the science majors had higher knowledge and attitude scores than the nonscience majors, it is not possible to say that they have sufficient knowledge of biotechnologies. Besides, the participants' attitudes toward biotechnological applications were not considerably related to their knowledge of biotechnology. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(2):115-125, 2017.

  6. Discrete Mathematical Approaches to Graph-Based Traffic Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Cowley, Wendy E.; Hogan, Emilie A.

    2014-04-01

    Modern cyber defense and analytics require general, formal models of cyber systems. Multi-scale network models are prime candidates for such formalisms, using discrete mathematical methods based in hierarchically-structured directed multigraphs which also include rich sets of labels. An exemplar of an application of such an approach is traffic analysis, that is, observing and analyzing connections between clients, servers, hosts, and actors within IP networks, over time, to identify characteristic or suspicious patterns. Towards that end, NetFlow (or more generically, IPFLOW) data are available from routers and servers which summarize coherent groups of IP packets flowing through the network. In this paper, we consider traffic analysis of Netflow using both basic graph statistics and two new mathematical measures involving labeled degree distributions and time interval overlap measures. We do all of this over the VAST test data set of 96M synthetic Netflow graph edges, against which we can identify characteristic patterns of simulated ground-truth network attacks.
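
    In the spirit of the labeled-multigraph formalism, a minimal sketch using networkx: hosts as nodes, flows as labeled directed multi-edges, and a degree distribution restricted to one label (the flow tuples are hypothetical):

        import networkx as nx

        # Hypothetical flow records: (source host, destination host, port).
        flows = [("10.0.0.1", "10.0.0.2", 80),
                 ("10.0.0.1", "10.0.0.2", 443),
                 ("10.0.0.3", "10.0.0.2", 22),
                 ("10.0.0.2", "10.0.0.4", 53)]

        G = nx.MultiDiGraph()
        for src, dst, port in flows:
            G.add_edge(src, dst, port=port)

        # Labeled degree distribution: out-degree per node for one label.
        web = {n: sum(1 for _, _, d in G.out_edges(n, data=True)
                      if d["port"] == 80)
               for n in G}
        print("out-degrees:", dict(G.out_degree()))
        print("port-80 out-degrees:", web)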

  7. Bayesian inference of physiologically meaningful parameters from body sway measurements.

    PubMed

    Tietäväinen, A; Gutmann, M U; Keski-Vakkuri, E; Corander, J; Hæggström, E

    2017-06-19

    The control of the human body sway by the central nervous system, muscles, and conscious brain is of interest since body sway carries information about the physiological status of a person. Several models have been proposed to describe body sway in an upright standing position; however, due to the statistical intractability of the more realistic models, no formal parameter inference has previously been conducted and the expressive power of such models for real human subjects remains unknown. Using the latest advances in Bayesian statistical inference for intractable models, we fitted a nonlinear control model to posturographic measurements, and we showed that it can accurately predict the sway characteristics of both simulated and real subjects. Our method provides a full statistical characterization of the uncertainty related to all model parameters as quantified by posterior probability density functions, which is useful for comparisons across subjects and test settings. The ability to infer intractable control models from sensor data opens new possibilities for monitoring and predicting body status in health applications.
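
    Inference for intractable models is often likelihood-free. As a hedged stand-in for the paper's method, a minimal ABC rejection sketch that recovers the damping of a noisy sway-like process from a summary statistic (the model, prior, and tolerance are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate(damping, n=1000, dt=0.01, sigma=0.5):
            """Simulate a damped, noise-driven sway-like process."""
            x, out = 0.0, np.empty(n)
            for i in range(n):
                x += -damping * x * dt + sigma * np.sqrt(dt) * rng.normal()
                out[i] = x
            return out

        observed_summary = simulate(damping=1.5).std()  # pretend this is data

        draws = rng.uniform(0.1, 5.0, size=1000)        # prior over damping
        accepted = [d for d in draws
                    if abs(simulate(d).std() - observed_summary) < 0.02]
        print(f"{len(accepted)} accepted; "
              f"posterior mean damping ~ {np.mean(accepted):.2f}")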

  8. Extending Working Life: Which Competencies are Crucial in Near-Retirement Age?

    PubMed

    Wiktorowicz, Justyna

    2018-01-01

    Population ageing is nowadays one of the most important economic and social phenomena. Due to the low activity rate of older people, one of the most important challenges is to undertake actions supporting active ageing, which are intended to extend working life and, along with it, to improve the competencies of older people. The aim of this paper is to evaluate the relevance of different competencies for extending working life, limiting the analysis to Poland. The paper also assesses the competencies of mature Polish people (aged 50+ but still of working age). In the statistical analysis, I used logistic regression, as well as descriptive statistics and appropriate statistical tests. The results show that among the actions aimed at extending working life, the most important are those related to lifelong learning, targeted at improving the competencies of the older generation. The competencies (both soft and hard) of people aged 50+ are more important than their formal education.
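
    A hedged sketch of the modelling step: logistic regression of a working-status indicator on competency scores and age. The synthetic data, variable names, and coefficients are assumptions, not the study's estimates:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(9)
        n = 500
        X = np.column_stack([
            rng.normal(3.0, 1.0, n),     # soft-competency score
            rng.normal(3.0, 1.0, n),     # hard-competency score
            rng.uniform(50, 64, n),      # age
        ])
        # Generate an extended-working-life indicator from assumed effects.
        logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.1 * (X[:, 2] - 57)
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-(logit - 3.9)))

        model = LogisticRegression().fit(X, y)
        print("coefficients (soft, hard, age):", model.coef_.round(2))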

  9. 25 CFR 36.42 - Standard XV-Counseling services.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... objective assessment of student academic performance. Required formal tests shall be administered annually... standards, schools may use the state mandated academic achievement tests and accompanying requirements. These formal tests and their subtest contents, as well as the test-related procedures, shall include...

  10. 25 CFR 36.42 - Standard XV-Counseling services.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... objective assessment of student academic performance. Required formal tests shall be administered annually... standards, schools may use the state mandated academic achievement tests and accompanying requirements. These formal tests and their subtest contents, as well as the test-related procedures, shall include...

  11. 25 CFR 36.42 - Standard XV-Counseling services.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... objective assessment of student academic performance. Required formal tests shall be administered annually... standards, schools may use the state mandated academic achievement tests and accompanying requirements. These formal tests and their subtest contents, as well as the test-related procedures, shall include...

  12. 25 CFR 36.42 - Standard XV-Counseling services.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... objective assessment of student academic performance. Required formal tests shall be administered annually... standards, schools may use the state mandated academic achievement tests and accompanying requirements. These formal tests and their subtest contents, as well as the test-related procedures, shall include...

  13. 25 CFR 36.42 - Standard XV-Counseling services.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... objective assessment of student academic performance. Required formal tests shall be administered annually... standards, schools may use the state mandated academic achievement tests and accompanying requirements. These formal tests and their subtest contents, as well as the test-related procedures, shall include...

  14. Data-Driven Learning of Q-Matrix

    PubMed Central

    Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang

    2013-01-01

    The recent surge of interests in cognitive assessment has led to developments of novel statistical models for diagnostic classification. Central to many such models is the well-known Q-matrix, which specifies the item–attribute relationships. This article proposes a data-driven approach to identification of the Q-matrix and estimation of related model parameters. A key ingredient is a flexible T-matrix that relates the Q-matrix to response patterns. The flexibility of the T-matrix allows the construction of a natural criterion function as well as a computationally amenable algorithm. Simulations results are presented to demonstrate usefulness and applicability of the proposed method. Extension to handling of the Q-matrix with partial information is presented. The proposed method also provides a platform on which important statistical issues, such as hypothesis testing and model selection, may be formally addressed. PMID:23926363

  15. LOGICAL REASONING ABILITY AND STUDENT PERFORMANCE IN GENERAL CHEMISTRY.

    PubMed

    Bird, Lillian

    2010-03-01

    Logical reasoning skills of students enrolled in General Chemistry at the University of Puerto Rico in Río Piedras were measured using the Group Assessment of Logical Thinking (GALT) test. The results were used to determine the students' cognitive level (concrete, transitional, formal) as well as their level of performance by logical reasoning mode (mass/volume conservation, proportional reasoning, correlational reasoning, experimental variable control, probabilistic reasoning and combinatorial reasoning). This information was used to identify particular deficiencies and gender effects, and to determine which logical reasoning modes were the best predictors of student performance in the general chemistry course. Statistical tests to analyze the relation between (a) operational level and final grade in both semesters of the course; (b) GALT test results and performance in the ACS General Chemistry Examination; and (c) operational level and student approach (algorithmic or conceptual) towards a test question that may be answered correctly using either strategy, were also performed.

  16. LOGICAL REASONING ABILITY AND STUDENT PERFORMANCE IN GENERAL CHEMISTRY

    PubMed Central

    Bird, Lillian

    2010-01-01

    Logical reasoning skills of students enrolled in General Chemistry at the University of Puerto Rico in Río Piedras were measured using the Group Assessment of Logical Thinking (GALT) test. The results were used to determine the students’ cognitive level (concrete, transitional, formal) as well as their level of performance by logical reasoning mode (mass/volume conservation, proportional reasoning, correlational reasoning, experimental variable control, probabilistic reasoning and combinatorial reasoning). This information was used to identify particular deficiencies and gender effects, and to determine which logical reasoning modes were the best predictors of student performance in the general chemistry course. Statistical tests to analyze the relation between (a) operational level and final grade in both semesters of the course; (b) GALT test results and performance in the ACS General Chemistry Examination; and (c) operational level and student approach (algorithmic or conceptual) towards a test question that may be answered correctly using either strategy, were also performed. PMID:21373364

  17. Aspects of First Year Statistics Students' Reasoning When Performing Intuitive Analysis of Variance: Effects of Within- and Between-Group Variability

    ERIC Educational Resources Information Center

    Trumpower, David L.

    2015-01-01

    Making inferences about population differences based on samples of data, that is, performing intuitive analysis of variance (IANOVA), is common in everyday life. However, the intuitive reasoning of individuals when making such inferences (even following statistics instruction), often differs from the normative logic of formal statistics. The…

  18. Evaluation of the performance of statistical tests used in making cleanup decisions at Superfund sites. Part 1: Choosing an appropriate statistical test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berman, D.W.; Allen, B.C.; Van Landingham, C.B.

    1998-12-31

    The decision rules commonly employed to determine the need for cleanup are evaluated both to identify conditions under which they lead to erroneous conclusions and to quantify the rate that such errors occur. Their performance is also compared with that of other applicable decision rules. The authors based the evaluation of decision rules on simulations. Results are presented as power curves. These curves demonstrate that the degree of statistical control achieved is independent of the form of the null hypothesis. The loss of statistical control that occurs when a decision rule is applied to a data set that does not satisfy the rule's validity criteria is also clearly demonstrated. Some of the rules evaluated do not offer the formal statistical control that is an inherent design feature of other rules. Nevertheless, results indicate that such informal decision rules may provide superior overall control of error rates, when their application is restricted to data exhibiting particular characteristics. The results reported here are limited to decision rules applied to uncensored and lognormally distributed data. To optimize decision rules, it is necessary to evaluate their behavior when applied to data exhibiting a range of characteristics that bracket those common to field data. The performance of decision rules applied to data sets exhibiting a broader range of characteristics is reported in the second paper of this study.
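
    The simulation idea is straightforward to sketch: draw site data from a lognormal distribution, apply a decision rule, and record the rejection rate as the true contamination level varies. The naive one-sample t-test below (whose normality assumption the lognormal data violate) is an illustrative choice, not one of the rules evaluated in the paper:

        import numpy as np
        from scipy import stats

        # Empirical power curve of a t-test against a cleanup threshold.
        rng = np.random.default_rng(6)
        threshold, n, reps = 1.0, 20, 2000

        for true_median in [0.5, 0.8, 1.0, 1.25, 2.0]:
            rejections = 0
            for _ in range(reps):
                data = rng.lognormal(mean=np.log(true_median),
                                     sigma=0.75, size=n)
                # H0: site level <= threshold, tested on the raw data
                t, p = stats.ttest_1samp(data, threshold,
                                         alternative="greater")
                rejections += p < 0.05
            print(f"true median {true_median:4.2f}: "
                  f"rejection rate {rejections / reps:.3f}")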

  19. Probing gravity theory and cosmic acceleration using (in)consistency tests between cosmological data sets

    NASA Astrophysics Data System (ADS)

    Ishak-Boushaki, Mustapha B.

    2018-06-01

    Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on (in)consistency tests that can provide insights about the underlying gravity theory and cosmic acceleration using cosmological data sets. We use new statistical measures that can detect discordances between data sets when present. We use an algorithmic procedure based on these new measures that is able to identify in some cases whether an inconsistency is due to problems related to systematic effects in the data or to the underlying model. Some recent published tensions between data sets are also examined using our formalism, including the Hubble constant measurements, Planck and Large-Scale-Structure. (Work supported in part by NSF under Grant No. AST-1517768).

  20. Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.

    PubMed

    Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani

    2007-03-01

    Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that if understood could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer if an interesting collection of genes is 'enriched' for a particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine if conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
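
    The conventional single-class computation the paper starts from is a hypergeometric tail test. A minimal sketch with illustrative counts (the paper's contribution, deriving the joint null when class annotations overlap, is not shown here):

        from scipy import stats

        # N genes total, K annotated to one GO class, n interesting genes,
        # k of which carry the annotation; enrichment p-value = P(X >= k).
        N, K, n, k = 20_000, 300, 150, 9
        p = stats.hypergeom.sf(k - 1, N, K, n)
        print(f"enrichment p-value = {p:.3g}")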

  1. An investigation of successful and unsuccessful students' problem solving in stoichiometry

    NASA Astrophysics Data System (ADS)

    Gulacar, Ozcan

    In this study, I investigated how successful and unsuccessful students solve stoichiometry problems. I focus on three research questions: (1) To what extent do the difficulties in solving stoichiometry problems stem from poor understanding of pieces (domain-specific knowledge) versus students' inability to link those pieces together (conceptual knowledge)? (2) What are the differences between successful and unsuccessful students in knowledge, ability, and practice? (3) Is there a connection between students' (a) cognitive development levels, (b) formal (proportional) reasoning abilities, (c) working memory capacities, (d) conceptual understanding of particle nature of matter, (e) understanding of the mole concept, and their problem-solving achievement in stoichiometry? In this study, nine successful students and eight unsuccessful students participated. Both successful and unsuccessful students were selected among the students taking a general chemistry course at a mid-western university. The students taking this class were all science, non-chemistry majors. Characteristics of successful and unsuccessful students were determined through tests, audio- and videotape analyses, and subjects' written work. The Berlin Particle Concept Inventory, the Mole Concept Achievement Test, the Test of Logical Thinking, the Digits Backward Test, and the Longeot Test were used to measure students' conceptual understanding of the particle nature of matter and the mole concept, formal (proportional) reasoning ability, working memory capacity, and cognitive development, respectively. Think-aloud problem-solving protocols were also used to better explore the differences between successful and unsuccessful students' knowledge structures and behaviors during problem solving. Although successful students did not show significantly better performance on doing pieces (domain-specific knowledge) and solving exercises than their unsuccessful counterparts did, they appeared to be more successful in linking the pieces (conceptual knowledge) and solving complex problems than the unsuccessful students did. Successful students also appeared to differ in how they approach problems and what strategies they use, and to make fewer algorithmic mistakes, when compared to unsuccessful students. Successful students, however, did not seem to be statistically significantly different from the unsuccessful students in terms of quantitatively tested cognitive abilities, except for formal (proportional) reasoning ability and understanding of the mole concept.

  2. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  3. A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation

    NASA Astrophysics Data System (ADS)

    Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui

    Temporality and uncertainty are important features of many real world systems. Solving problems in such systems requires the use of formal mechanism such as logic systems, statistical methods or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently using a linguistic truth valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.

  4. Combining statistical inference and decisions in ecology.

    PubMed

    Williams, Perry J; Hooten, Mevin B

    2016-09-01

    Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.

  5. Information Technology and Literacy Assessment.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    2002-01-01

    Compares technology predictions from around 1989 with the technology of 2002. Discusses the place of computer-based assessment today, computer-scored testing, computer-administered formal assessment, Internet-based formal assessment, computerized adaptive tests, placement tests, informal assessment, electronic portfolios, information management,…

  6. Fundamental Statistical Descriptions of Plasma Turbulence in Magnetic Fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John A. Krommes

    2001-02-16

    A pedagogical review of the historical development and current status (as of early 2000) of systematic statistical theories of plasma turbulence is undertaken. Emphasis is on conceptual foundations and methodology, not practical applications. Particular attention is paid to equations and formalism appropriate to strongly magnetized, fully ionized plasmas. Extensive reference to the literature on neutral-fluid turbulence is made, but the unique properties and problems of plasmas are emphasized throughout. Discussions are given of quasilinear theory, weak-turbulence theory, resonance-broadening theory, and the clump algorithm. Those are developed independently, then shown to be special cases of the direct-interaction approximation (DIA), which provides a central focus for the article. Various methods of renormalized perturbation theory are described, then unified with the aid of the generating-functional formalism of Martin, Siggia, and Rose. A general expression for the renormalized dielectric function is deduced and discussed in detail. Modern approaches such as decimation and PDF methods are described. Derivations of DIA-based Markovian closures are discussed. The eddy-damped quasinormal Markovian closure is shown to be nonrealizable in the presence of waves, and a new realizable Markovian closure is presented. The test-field model and a realizable modification thereof are also summarized. Numerical solutions of various closures for some plasma-physics paradigms are reviewed. The variational approach to bounds on transport is developed. Miscellaneous topics include Onsager symmetries for turbulence, the interpretation of entropy balances for both kinetic and fluid descriptions, self-organized criticality, statistical interactions between disparate scales, and the roles of both mean and random shear. Appendices are provided on Fourier transform conventions, dimensional and scaling analysis, the derivations of nonlinear gyrokinetic and gyrofluid equations, stochasticity criteria for quasilinear theory, formal aspects of resonance-broadening theory, Novikov's theorem, the treatment of weak inhomogeneity, the derivation of the Vlasov weak-turbulence wave kinetic equation from a fully renormalized description, some features of a code for solving the direct-interaction approximation and related Markovian closures, the details of the solution of the EDQNM closure for a solvable three-wave model, and the notation used in the article.

  7. A sub-ensemble theory of ideal quantum measurement processes

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.; Balian, Roger; Nieuwenhuizen, Theo M.

    2017-01-01

    In order to elucidate the properties currently attributed to ideal measurements, one must explain how the concept of an individual event with a well-defined outcome may emerge from quantum theory which deals with statistical ensembles, and how different runs issued from the same initial state may end up with different final states. This so-called "measurement problem" is tackled with two guidelines. On the one hand, the dynamics of the macroscopic apparatus A coupled to the tested system S is described mathematically within a standard quantum formalism, where "q-probabilities" remain devoid of interpretation. On the other hand, interpretative principles, aimed to be minimal, are introduced to account for the expected features of ideal measurements. Most of the five principles stated here, which relate the quantum formalism to physical reality, are straightforward and refer to macroscopic variables. The process can be identified with a relaxation of S + A to thermodynamic equilibrium, not only for a large ensemble E of runs but even for its sub-ensembles. The different mechanisms of quantum statistical dynamics that ensure these types of relaxation are exhibited, and the required properties of the Hamiltonian of S + A are indicated. The additional theoretical information provided by the study of sub-ensembles removes Schrödinger's quantum ambiguity of the final density operator for E which hinders its direct interpretation, and brings out a commutative behaviour of the pointer observable at the final time. The latter property supports the introduction of a last interpretative principle, needed to switch from the statistical ensembles and sub-ensembles described by quantum theory to individual experimental events. It amounts to identifying some formal "q-probabilities" with ordinary frequencies, but only those which refer to the final indications of the pointer. The desired properties of ideal measurements, in particular the uniqueness of the result for each individual run of the ensemble and von Neumann's reduction, are thereby recovered with economic interpretations. The status of Born's rule involving both A and S is re-evaluated, and contextuality of quantum measurements is made obvious.

  8. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Öztürk, Hande; Noyan, I. Cevdet

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears here as a special case limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  9. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE PAGES

    Öztürk, Hande; Noyan, I. Cevdet

    2017-08-24

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742–753] appears here as a special case limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.
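
    The grain-counting statistics can be previewed with a simple Bernoulli sampling model: if each of N crystallites diffracts into a given peak with small probability p, the number of diffracting grains is binomial with mean Np and variance Np(1-p). A hedged sketch (N and p are illustrative, not values from the paper):

        import numpy as np

        # Monte Carlo check of the binomial mean and variance of the
        # number of grains contributing to one Bragg peak.
        rng = np.random.default_rng(10)
        N, p, reps = 10_000, 2e-3, 20_000
        counts = rng.binomial(N, p, size=reps)
        print("mean:", counts.mean(), " expected:", N * p)
        print("var :", counts.var(), " expected:", N * p * (1 - p))
        print("relative std of peak intensity ~", counts.std() / counts.mean())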

  10. Educational interventions to improve knowledge and skills of interns towards prevention and control of hospital-associated infections.

    PubMed

    Dogra, Sandeep; Mahajan, Ruchita; Jad, Beena; Mahajan, Bella

    2015-08-01

    We believe that there is a significant educational deficit amongst interns regarding up-to-date formal knowledge and skills on healthcare-associated infections (HAIs), which might compromise patient safety. This urgently requires curriculum innovations to ensure their formal training in HAI prevention and control. The objective was to educate interns to improve their knowledge of and skills in HAI prevention and control. This pilot study was conducted in interns using a multimodal approach consisting of a combination of videos, PowerPoint presentations, and hands-on demonstration to provide applied and practical teaching on prevention and control of HAIs. Pre- and post-test assessment of knowledge, attitude, and skills was carried out by multiple choice questions, a 5-point Likert scale, and an Objective Structured Practical Examination, respectively; the paired t-test was used for statistical analysis. A statistically significant improvement in interns' overall scores between pre- and post-test was seen, suggesting that educational programs have a positive effect. Interns felt they benefited from interventions focused on HAI prevention and control and hoped that such sessions would be integrated into the regular undergraduate curriculum. A majority of the students felt that their learning style assessment matched well with their own perception of learning preference. Assessment drives learning; hence, strengthening the contribution of health-care workers to HAI prevention programs should include measures that enhance knowledge, improve skills and develop appropriate attitudes, resulting in safety and quality of patient care.
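
    The pre/post comparison reduces to a paired t-test on each intern's scores. A minimal sketch with synthetic score vectors standing in for the assessment totals:

        import numpy as np
        from scipy import stats

        # Paired t-test on pre- and post-intervention scores.
        rng = np.random.default_rng(8)
        pre = rng.normal(55, 10, size=40)
        post = pre + rng.normal(12, 6, size=40)   # assumed improvement

        t, p = stats.ttest_rel(post, pre)
        print(f"paired t = {t:.2f}, p = {p:.2e}, "
              f"mean gain = {(post - pre).mean():.1f}")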

  11. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    PubMed

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data in several contexts, such as fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters must be estimated from small sample sizes, which induces errors in the estimated HOS parameters and hinders real-time, precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to small sample size effects and more accurate in sEMG PDF shape screening applications.
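
    The contrast between raw sample moments and a density-based (functional) estimate is easy to sketch: compute skewness from a kernel density estimate on a grid versus directly from a small sample. This illustrates the idea only; it is not the CSM-based statistics of the paper:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        small = rng.lognormal(0.0, 0.5, size=30)   # small sEMG-like sample

        # Moment-based skewness of a kernel density estimate on a grid.
        kde = stats.gaussian_kde(small)
        grid = np.linspace(small.min() - 1, small.max() + 1, 1000)
        dg = grid[1] - grid[0]
        pdf = kde(grid)
        pdf /= pdf.sum() * dg

        m = np.sum(grid * pdf) * dg
        var = np.sum((grid - m) ** 2 * pdf) * dg
        skew_kde = np.sum((grid - m) ** 3 * pdf) * dg / var ** 1.5

        print("sample skewness:", stats.skew(small))
        print("KDE skewness   :", skew_kde)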

  12. Emerging Concepts of Data Integration in Pathogen Phylodynamics.

    PubMed

    Baele, Guy; Suchard, Marc A; Rambaut, Andrew; Lemey, Philippe

    2017-01-01

    Phylodynamics has become an increasingly popular statistical framework to extract evolutionary and epidemiological information from pathogen genomes. By harnessing such information, epidemiologists aim to shed light on the spatio-temporal patterns of spread and to test hypotheses about the underlying interaction of evolutionary and ecological dynamics in pathogen populations. Although the field has witnessed a rich development of statistical inference tools with increasing levels of sophistication, these tools initially focused on sequences as their sole primary data source. Integrating various sources of information, however, promises to deliver more precise insights in infectious diseases and to increase opportunities for statistical hypothesis testing. Here, we review how the emerging concept of data integration is stimulating new advances in Bayesian evolutionary inference methodology which formalize a marriage of statistical thinking and evolutionary biology. These approaches include connecting sequence to trait evolution, such as for host, phenotypic and geographic sampling information, but also the incorporation of covariates of evolutionary and epidemic processes in the reconstruction procedures. We highlight how a full Bayesian approach to covariate modeling and testing can generate further insights into sequence evolution, trait evolution, and population dynamics in pathogen populations. Specific examples demonstrate how such approaches can be used to test the impact of host on rabies and HIV evolutionary rates, to identify the drivers of influenza dispersal as well as the determinants of rabies cross-species transmissions, and to quantify the evolutionary dynamics of influenza antigenicity. Finally, we briefly discuss how data integration is now also permeating through the inference of transmission dynamics, leading to novel insights into tree-generative processes and detailed reconstructions of transmission trees. [Bayesian inference; birth–death models; coalescent models; continuous trait evolution; covariates; data integration; discrete trait evolution; pathogen phylodynamics.]

  13. Emerging Concepts of Data Integration in Pathogen Phylodynamics

    PubMed Central

    Baele, Guy; Suchard, Marc A.; Rambaut, Andrew; Lemey, Philippe

    2017-01-01

    Phylodynamics has become an increasingly popular statistical framework to extract evolutionary and epidemiological information from pathogen genomes. By harnessing such information, epidemiologists aim to shed light on the spatio-temporal patterns of spread and to test hypotheses about the underlying interaction of evolutionary and ecological dynamics in pathogen populations. Although the field has witnessed a rich development of statistical inference tools with increasing levels of sophistication, these tools initially focused on sequences as their sole primary data source. Integrating various sources of information, however, promises to deliver more precise insights into infectious diseases and to increase opportunities for statistical hypothesis testing. Here, we review how the emerging concept of data integration is stimulating new advances in Bayesian evolutionary inference methodology which formalize a marriage of statistical thinking and evolutionary biology. These approaches include connecting sequence to trait evolution, such as for host, phenotypic and geographic sampling information, but also the incorporation of covariates of evolutionary and epidemic processes in the reconstruction procedures. We highlight how a full Bayesian approach to covariate modeling and testing can generate further insights into sequence evolution, trait evolution, and population dynamics in pathogen populations. Specific examples demonstrate how such approaches can be used to test the impact of host on rabies and HIV evolutionary rates, to identify the drivers of influenza dispersal as well as the determinants of rabies cross-species transmissions, and to quantify the evolutionary dynamics of influenza antigenicity. Finally, we briefly discuss how data integration is now also permeating through the inference of transmission dynamics, leading to novel insights into tree-generative processes and detailed reconstructions of transmission trees. [Bayesian inference; birth–death models; coalescent models; continuous trait evolution; covariates; data integration; discrete trait evolution; pathogen phylodynamics.] PMID:28173504

  14. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort either to manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the costs of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and is therefore not the answer. Ensuring that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, and a prototype tool to support it is described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  15. Testing New Physics with the Cosmic Microwave Background

    NASA Astrophysics Data System (ADS)

    Gluscevic, Vera

    2013-01-01

    In my thesis work, I have developed and applied tests of new fundamental physics that utilize high-precision CMB polarization measurements. I especially focused on a wide class of dark energy models that propose existence of new scalar fields to explain accelerated expansion of the Universe. Such fields naturally exhibit a weak interaction with photons, giving rise to "cosmic birefringence"---a rotation of the polarization plane of light traveling cosmological distances, which alters the statistics of the CMB fluctuations in the sky by inducing a characteristic B-mode polarization. A birefringent rotation of the CMB would be smoking-gun evidence that dark energy is a dynamical component rather than a cosmological constant, while its absence gives clues about the allowed regions of the parameter space for new models. I developed a full-sky formalism to search for cosmic birefringence by cross-correlating CMB temperature and polarization maps, after allowing for the rotation angle to vary across the sky. With my collaborators, I also proposed a cross-correlation of the rotation-angle estimator with the CMB temperature as a novel statistical probe which can boost signal-to-noise in the case of marginal detection and help disentangle the underlying physical models. I then investigated the degeneracy between the rotation signal and the signals from other exotic scenarios that induce a similar B-mode polarization signature, such as chiral primordial gravitational waves, and demonstrated that these effects are completely separable. Finally, I applied this formalism to WMAP-7 data and derived the first CMB constraint on the power spectrum of the birefringent-rotation angle and presented forecasts for future experiments. To demonstrate the value of this analysis method beyond the search for direction-dependent cosmic birefringence, I have also used it to probe patchy screening from the epoch of cosmic reionization with WMAP-7 data.

  16. Dry eye disease: prevalence, distribution and determinants in a hospital-based population.

    PubMed

    Onwubiko, Stella N; Eze, Boniface I; Udeh, Nnemma N; Arinze, Obinna C; Onwasigwe, Ernest N; Umeh, Rich E

    2014-06-01

    To determine the prevalence, distribution and risk factors for dry eye disease (DED) in a tertiary ophthalmic outpatient population. The study was a cross-sectional descriptive hospital-based survey conducted at the Eye Clinic of the University of Nigeria Teaching Hospital (UNTH), Enugu, between September and December 2011. The participants comprised adult ophthalmic outpatients aged 18 years or older. Participants' sociodemographic data were obtained. Dry eye disease was assessed subjectively with the Ocular Surface Disease Index (OSDI) questionnaire, and objectively with Schirmer's test and Tear-film Break-up Time (TBUT). An OSDI score of ≥50 with a TBUT of <10 s or a Schirmer's test reading of <10 mm was considered diagnostic of DED. Descriptive and analytical statistics were performed. In all comparisons, p<0.05 was considered statistically significant. The participants (n=402) comprised 193 males and 209 females aged 50.1 ± 19.06 (SD) years (range 18-94 years). The majority of participants were married (74.1%), possessed formal education (86.0%) and were civil servants (33.6%). The prevalence of DED was 19.2%. Dry eye disease was significantly associated with age >40 years (OR 1.88, 95% CI 1.06-3.35, p=0.0004) and non-possession of formal education (OR 0.40, 95% CI 0.21-0.74, p=0.001), but not gender (OR 1.48, 95% CI 0.89-2.46, p=0.158). The prevalence of DED among ophthalmic outpatients at UNTH, Enugu, is comparatively high. Older age and illiteracy are predictors of DED. There is a need for a high index of diagnostic suspicion to prevent sight-threatening complications of DED. Copyright © 2013 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  17. Formative Use of Intuitive Analysis of Variance

    ERIC Educational Resources Information Center

    Trumpower, David L.

    2013-01-01

    Students' informal inferential reasoning (IIR) is often inconsistent with the normative logic underlying formal statistical methods such as Analysis of Variance (ANOVA), even after instruction. In two experiments reported here, students' IIR was assessed using an intuitive ANOVA task at the beginning and end of a statistics course. In both…

  18. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  19. Potential-of-mean-force description of ionic interactions and structural hydration in biomolecular systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummer, G.; Garcia, A.E.; Soumpasis, D.M.

    1994-10-01

    To understand the functioning of living organisms on a molecular level, it is crucial to dissect the intricate interplay of the immense number of biological molecules. Most of the biochemical processes in cells occur in a liquid environment formed mainly by water and ions. This solvent environment plays an important role in biological systems. The potential-of-mean-force (PMF) formalism attempts to describe quantitatively the interactions of the solvent with biological macromolecules on the basis of an approximate statistical-mechanical representation. At its current status of development, it deals with ionic effects on the biomolecular structure and with the structural hydration of biomolecules. The underlying idea of the PMF formalism is to identify the dominant sources of interactions and incorporate these interactions into the theoretical formalism using PMFs (or particle correlation functions) extracted from bulk-liquid systems. In the following, the authors shall briefly outline the statistical-mechanical foundation of the PMF formalism and introduce the PMF expansion formalism, which is intimately linked to superposition approximations for higher-order particle correlation functions. The authors shall then sketch applications, which describe the effects of the ionic environment on nucleic-acid structure. Finally, the authors shall present the more recent extension of the PMF idea to describe quantitatively the structural hydration of biomolecules. Results for the interface of ice and water and for the hydration of deoxyribonucleic acid (DNA) will be discussed.

  20. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214
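
    The abstract does not spell out which selection procedure is meant, so the sketch below is only a hedged illustration of one practice that simulation studies of this kind commonly criticize: screening candidate predictors one at a time and then fitting a model to the survivors. With pure-noise predictors, some always pass the screen and the final model's apparent fit is inflated; all data here are synthetic.

      # Hedged illustration (the paper's exact procedure is not stated in the
      # abstract): univariate screening of pure-noise predictors at alpha = .05
      # still "selects" some of them and inflates the apparent R^2.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n, p = 100, 50
      X = rng.standard_normal((n, p))        # predictors unrelated to y
      y = rng.standard_normal(n)

      selected = [j for j in range(p)
                  if stats.pearsonr(X[:, j], y)[1] < 0.05]
      print(f"{len(selected)} of {p} noise predictors pass screening")

      if selected:
          Xs = np.column_stack([np.ones(n), X[:, selected]])
          beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
          resid = y - Xs @ beta
          print(f"apparent R^2 from pure noise: {1 - resid.var() / y.var():.2f}")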

  1. Robot-assisted laparoscopic skills development: formal versus informal training.

    PubMed

    Benson, Aaron D; Kramer, Brandan A; Boehler, Margaret; Schwind, Cathy J; Schwartz, Bradley F

    2010-08-01

    The learning curve for robotic surgery is not completely defined, and ideal training components have not yet been identified. We attempted to determine whether skill development would be accelerated with formal, organized instruction in robotic surgical techniques versus informal practice alone. Forty-three medical students naive to robotic surgery were randomized into two groups and tested on three tasks using the robotic platform. Between the testing sessions, the students were given equally timed practice sessions. The formal training group participated in an organized, formal training session with instruction from an attending robotic surgeon, whereas the informal training group participated in an equally timed unstructured practice session with the robot. The results were compared based on technical score and time to completion of each task. There was no difference between groups in prepractice testing for any task. In postpractice testing, there was no difference between groups for the ring transfer tasks. However, for the suture placement and knot-tying task, the technical score of the formal training group was significantly better than that of the informal training group (p < 0.001), yet time to completion was not different. Although formal training may not be necessary for basic skills, formal instruction for more advanced skills, such as suture placement and knot tying, is important in developing skills needed for effective robotic surgery. These findings may be important in formulating potential skills labs or training courses for robotic surgery.

  2. Open and Distance Learning and Information and Communication Technologies--Implications for Formal and Non-Formal Education: A Kenyan Case

    ERIC Educational Resources Information Center

    Situma, David Barasa

    2015-01-01

    The female population in Kenya was reported at 50.05% in 2011, according to a World Bank report published in 2012. Despite this slightly higher percentage over males, women in Kenya are not well represented in education and training compared to their male counterparts (Kenya National Bureau of Statistics, 2012). The need to empower girls and women…

  3. On the analysis of studies of choice

    PubMed Central

    Mullins, Eamonn; Agunwamba, Christian C.; Donohoe, Anthony J.

    1982-01-01

    In a review of 103 sets of data from 23 different studies of choice, Baum (1979) concluded that whereas undermatching was most commonly observed for responses, the time measure generally conformed to the matching relation. A reexamination of the evidence presented by Baum concludes that undermatching is the most commonly observed finding for both measures. Use of the coefficient of determination by both Baum (1979) and de Villiers (1977) for assessing when matching occurs is criticized on statistical grounds. An alternative to the loss-in-predictability criterion used by Baum (1979) is proposed. This alternative statistic has a simple operational meaning and is related to the usual F-ratio test. It can therefore be used as a formal test of the hypothesis that matching occurs. Baum (1979) also suggests that slope values of between .90 and 1.11 can be considered good approximations to matching. It is argued that the establishment of a fixed interval as a criterion for determining when matching occurs is inappropriate. A confidence interval based on the data from any given experiment is suggested as a more useful method of assessment. PMID:16812271
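
    The suggested per-experiment assessment is straightforward to carry out. The sketch below (with made-up data points) fits the generalized matching relation log(B1/B2) = a·log(R1/R2) + log b and reports a 95% confidence interval for the slope a; matching (a = 1) is tenable exactly when the interval covers 1.

      # Per-experiment confidence interval for the matching-law slope, in the
      # spirit of the abstract's suggestion; the ratio data are fabricated.
      import numpy as np
      from scipy import stats

      log_ratio_reinf = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])      # log(R1/R2)
      log_ratio_resp = np.array([-0.85, -0.40, 0.05, 0.38, 0.92])  # log(B1/B2)

      fit = stats.linregress(log_ratio_reinf, log_ratio_resp)
      t_crit = stats.t.ppf(0.975, df=len(log_ratio_reinf) - 2)
      low = fit.slope - t_crit * fit.stderr
      high = fit.slope + t_crit * fit.stderr
      print(f"slope = {fit.slope:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
      # Undermatching corresponds to a slope reliably below 1.0.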

  4. What Sensing Tells Us: Towards a Formal Theory of Testing for Dynamical Systems

    NASA Technical Reports Server (NTRS)

    McIlraith, Sheila; Scherl, Richard

    2005-01-01

    Just as actions can have indirect effects on the state of the world, so too can sensing actions have indirect effects on an agent's state of knowledge. In this paper, we investigate "what sensing actions tell us", i.e., what an agent comes to know indirectly from the outcome of a sensing action, given knowledge of its actions and state constraints that hold in the world. To this end, we propose a formalization of the notion of testing within a dialect of the situation calculus that includes knowledge and sensing actions. Realizing this formalization requires addressing the ramification problem for sensing actions. We formalize simple tests as sensing actions. Complex tests are expressed in the logic programming language Golog. We examine what it means to perform a test, and how the outcome of a test affects an agent's state of knowledge. Finally, we propose automated reasoning techniques for test generation and complex-test verification, under certain restrictions. The work presented in this paper is relevant to a number of application domains including diagnostic problem solving, natural language understanding, plan recognition, and active vision.

  5. Linkage analysis of chromosome 22q12-13 in a United Kingdom/Icelandic sample of 23 multiplex schizophrenia families

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalsi, G.; Read, T.; Butler, R.

    A possible linkage to a genetic subtype of schizophrenia and related disorders has been reported on the long arm of chromosome 22 at q12-13. However, formal statistical tests in a combined sample could not reject homogeneity and prove that there was a linked subgroup of families. We have studied 23 schizophrenia pedigrees to test whether some multiplex schizophrenia families may be linked to the microsatellite markers D22S274 and D22S283, which span the 22q12-13 region. Two-point followed by multipoint lod and non-parametric linkage analyses under the assumption of heterogeneity provided no evidence for linkage over the relevant region. 16 refs., 4 tabs.

  6. Assessing Student Preparation through Placement Tests

    NASA Astrophysics Data System (ADS)

    McFate, Craig; Olmsted, John, III

    1999-04-01

    The chemistry department at California State University, Fullerton, uses a placement test of its own design to assess student readiness to enroll in General Chemistry. This test contains items designed to test cognitive skills more than factual knowledge. We have analyzed the ability of this test to predict student success (defined as passing the first-semester course with a C or better) using data for 845 students from four consecutive semesters. In common with other placement tests, we find a weak but statistically significant correlation between test performance and course grades. More meaningfully, there is a strong correlation (R2 = 0.82) between test score and course success, sufficient to use for counseling purposes. An item analysis was conducted to determine what types of questions provide the best predictability. Six questions from the full set of 25 were identified as strong predictors, on the basis of discrimination indices and coefficients of determination that were more than one standard deviation above the mean values for test items. These questions had little in common except for requiring multistep mathematical operations and formal reasoning.
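
    The item analysis described here rests on two standard quantities. The sketch below (synthetic responses, not the Fullerton data, and not necessarily the authors' exact procedure) computes a discrimination index from upper/lower 27% scoring groups and a corrected point-biserial correlation for each item; items scoring high on both would be the "strong predictor" candidates.

      # Standard item-analysis sketch: discrimination index and corrected
      # point-biserial correlation per test item, on fake 0/1 response data.
      import numpy as np

      rng = np.random.default_rng(2)
      responses = rng.integers(0, 2, size=(845, 25))   # students x items
      total = responses.sum(axis=1)

      order = np.argsort(total)
      k = int(0.27 * len(total))                       # upper/lower 27% groups
      lower, upper = order[:k], order[-k:]

      for item in range(3):                            # show a few items
          col = responses[:, item]
          d = col[upper].mean() - col[lower].mean()    # discrimination index
          r_pb = np.corrcoef(col, total - col)[0, 1]   # corrected point-biserial
          print(f"item {item}: D = {d:.2f}, r_pb = {r_pb:.2f}")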

  7. Attitudes towards suicide in urban and rural China: a population based, cross-sectional study.

    PubMed

    Zou, Yaming; Leung, Ricky; Lin, Shao; Yang, Mingan; Lu, Tao; Li, Xianyun; Gu, Jing; Hao, Chun; Dong, Guanghui; Hao, Yuantao

    2016-05-26

    Suicide intervention programs have been guided by findings that attitudes towards suicide and suicidal behavior may be causally linked. These findings also make it imperative to identify the factors that influence attitudes towards suicide. However, there has been little research on attitudes towards suicide among the general population, especially in low-income and middle-income countries. This population-based, cross-sectional study investigated the factors associated with attitudes towards suicide among a representative sample of urban and rural adult residents in China. A multi-stage, stratified random sampling approach was implemented to select participants. Data were collected by a survey using the Scale of Public Attitudes about Suicide (SPAS). The survey also collected socio-demographic factors and the suicidal history of participants. Statistical tests were conducted to identify associated factors that account for variations in attitudes towards suicide. Residents in China generally hold a neutral attitude towards suicide. Attitudes towards suicide among Chinese residents were associated with age, duration of formal education, marital status, job and suicidal ideation. Different attitudinal subscales seemed not to share the same risk factors. However, gender, ethnicity, religious belief, housing style and economic status might not influence residents' attitudes towards suicide. Attitudes towards suicide among Chinese urban and rural residents generally showed no statistical difference, with one notable exception: opinions on whether or not suicides and suicide attempts are different phenomena. Age, duration of formal education, marital status, job and suicidal ideation seem to have an impact on attitudes towards suicide among residents. Urban and rural residents have similar attitudes towards suicide, with the only statistically significant difference being their opinions on whether or not suicides and suicide attempts are different phenomena.

  8. Universal calculational recipe for solvent-mediated potential: based on a combination of integral equation theory and density functional theory

    NASA Astrophysics Data System (ADS)

    Zhou, Shiqi

    2004-07-01

    A universal formalism, which enables calculation of the solvent-mediated potential (SMP) between two equal or non-equal solute particles of any shape immersed in a solvent reservoir consisting of atomic particles and/or polymer chains or their mixture, is proposed by importing a density functional theory externally into OZ equation systems. Provided the size asymmetry of the solvent bath components is moderate, the present formalism can calculate the SMP in any complex fluid at the present stage of development of statistical mechanics, and therefore avoids all of the limitations of previous approaches to the SMP. Preliminary calculations indicate the reliability of the present formalism.

  9. Statistical mechanics of the Huxley-Simmons model

    NASA Astrophysics Data System (ADS)

    Caruel, M.; Truskinovsky, L.

    2016-06-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from the muscle power stroke and hair-cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.
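
    As a minimal worked equation for the paramagnetic analogy (notation mine, under the simplifying assumption of N independent two-state elements with conformation-dependent energies E_0(x) and E_1(x) at fixed extension x):

      % Partition function of N independent two-state elements and the mean
      % occupation of the post-power-stroke state; an illustrative sketch only.
      Z_N(x,\beta) = \left[ e^{-\beta E_0(x)} + e^{-\beta E_1(x)} \right]^{N},
      \qquad
      \langle n_1 \rangle = \frac{e^{-\beta E_1(x)}}{e^{-\beta E_0(x)} + e^{-\beta E_1(x)}},

    with free energy per element f(x,\beta) = -(\beta N)^{-1} \ln Z_N. Coupling the elements through a common elastic load is what produces the mean-field, Ising-like collective behavior studied in the paper.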

  10. The development of principled connections and kind representations.

    PubMed

    Haward, Paul; Wagner, Laura; Carey, Susan; Prasada, Sandeep

    2018-07-01

    Kind representations draw an important distinction between properties that are understood as existing in instances of a kind by virtue of their being the kind of thing they are and properties that are not understood in this manner. For example, the property of barking for the kind dog is understood as being had by dogs by virtue of the fact that they are dogs. These properties are said to have a principled connection to the kind. In contrast, the property of wearing a collar is not understood as existing in instances by virtue of their being dogs, despite the fact that a large percentage of dogs wear collars. Such properties are said to have a statistical connection to the kind. Two experiments tested two signatures of principled connections in 4- to 7-year-olds and adults: (i) that principled connections license normative expectations (e.g., we judge there to be something wrong with a dog that does not bark), and (ii) that principled connections license formal explanations which explain the existence of a property by reference to the kind (e.g., it barks because it is a dog). Experiment 1 showed that both children and adults have normative expectations for properties that have a principled connection to a kind, but not for those that have a mere statistical connection to a kind. Experiment 2 showed that both children and adults are more likely to provide a formal explanation when explaining the existence of properties with a principled connection to a kind than properties with statistical connections to their kinds. Both experiments showed no effect of age (over ages 4, 7, and adulthood) on the extent to which participants differentiated principled and statistical connections. We discuss the implications of the results for theories of conceptual representation and for the structure of explanation. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    PubMed

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
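
    The computationally expensive step the package distributes is conceptually simple. The sketch below shows the empirical p-value idea on a single machine in plain Python (this is not BlueSNP code, and the genotype/phenotype data are fabricated): permute the phenotype labels and count how often the permuted association statistic meets or beats the observed one.

      # Empirical p-value by phenotype permutation; a single-machine sketch of
      # the computation BlueSNP scales out with Hadoop/MapReduce.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      genotype = rng.integers(0, 3, size=500)          # fake 0/1/2 allele counts
      phenotype = 0.1 * genotype + rng.standard_normal(500)

      obs = abs(stats.pearsonr(genotype, phenotype)[0])
      n_perm = 10_000
      hits = sum(
          abs(stats.pearsonr(genotype, rng.permutation(phenotype))[0]) >= obs
          for _ in range(n_perm))
      print("empirical p-value:", (hits + 1) / (n_perm + 1))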

  12. Crystallography of rare galactic honeycomb structure near supernova 1987a

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1994-01-01

    Near supernova 1987a, the rare honeycomb structure of 20-30 galactic bubbles measures 30 x 90 light years. Its remarkable regularity in bubble size suggests a single-event origin which may correlate with the nearby supernova. To test the honeycomb's regularity in shape and size, the formalism of statistical crystallography is developed here for bubble sidedness. The standard size-shape relations (Lewis's law, Desch's law, and the Aboav-Weaire law) govern area, perimeter and nearest-neighbor shapes. Taken together, they predict a highly non-equilibrium structure for the galactic honeycomb which evolves as a bimodal shape distribution without dominant bubble perimeter energy.
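
    For reference, the three size-shape relations named above, in their common empirical forms (n is the number of sides of a bubble; lambda, a and mu_2 are fitted constants; the notation is mine, not quoted from the paper):

      % Lewis's law: mean area grows linearly with sidedness
      \bar{A}_n = \bar{A}\,[\,1 + \lambda\,(n - 6)\,]
      % Desch's law: mean perimeter grows linearly with sidedness
      \bar{P}_n = c_0 + c_1\, n
      % Aboav--Weaire law: mean sidedness m_n of the neighbors of an n-sided
      % cell, with mu_2 the variance of the sidedness distribution
      m_n = 6 - a + \frac{6a + \mu_2}{n}

    Deviations from these laws, and the moments of the sidedness distribution, are what allow a cellular pattern to be classified as near- or far-from-equilibrium.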

  13. On the statistical significance of excess events: Remarks of caution and the need for a standard method of calculation

    NASA Technical Reports Server (NTRS)

    Staubert, R.

    1985-01-01

    Methods for calculating the statistical significance of excess events and the interpretation of the formally derived values are discussed. It is argued that a simple formula for a conservative estimate should generally be used in order to provide a common understanding of quoted values.
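
    The abstract does not reproduce the recommended formula, so the following is only a generic illustration of why the choice matters: two simple significance estimates for the same excess differ noticeably, and the more conservative one divides by the larger count.

      # Illustrative only; the paper's recommended formula is not quoted in
      # the abstract. Two simple significance estimates for an excess of counts.
      import math

      N_on, N_bg = 130, 100                       # observed counts, expected background
      excess = N_on - N_bg

      s_optimistic = excess / math.sqrt(N_bg)     # background fluctuations only
      s_conservative = excess / math.sqrt(N_on)   # divides by the larger count
      print(f"{s_optimistic:.2f} sigma vs {s_conservative:.2f} sigma")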

  14. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  15. The inner formal structure of the H-T-P drawings: an exploratory study.

    PubMed

    Vass, Z

    1998-08-01

    The study describes some interrelated patterns of traits of the House-Tree-Person (H-T-P) drawings with the instruments of hierarchical cluster analysis. First, according to the literature 17 formal or structural aspects of the projective drawings were collected, after which a detailed manual for coding was compiled. Second, the interrater reliability and the consistency of this manual was tested. Third, the hierarchical cluster structure of the reliable and consistent formal aspects was analysed. Results are: (a) a psychometrically tested coding manual of the investigated formal-structural aspects, each of them illustrated with drawings that showed the highest interrater agreement; and (b) the hierarchic cluster structure of the formal aspects of the H-T-P drawings of "normal" adults.

  16. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
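
    The claimed null behavior is easy to probe by simulation. The sketch below (my own illustration, not the paper's experiments) computes the standard Rayleigh-type power statistic z = |sum_j exp(2*pi*i*k*x_j)|^2 / N for uniform data, confirms the nominal unit mean and variance of the Exp(1) law, and compares the empirical tail with the exponential tail that inference relies on.

      # Monte Carlo check of the nominal Exp(1) distribution of the power
      # statistic under a uniform null; an illustration, not the paper's code.
      import numpy as np

      rng = np.random.default_rng(4)
      N, trials, k = 100, 20_000, 3.0
      z = np.empty(trials)
      for t in range(trials):
          x = rng.uniform(0.0, 1.0, N)
          amp = np.exp(2j * np.pi * k * x).sum()
          z[t] = abs(amp) ** 2 / N                   # Rayleigh-type power

      print("mean:", z.mean(), "variance:", z.var())  # both near 1 for Exp(1)
      print("empirical P(z > 5):", (z > 5).mean(), " exponential tail:", np.exp(-5))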

  17. Assessment of change in conservation attitudes through zoo education

    NASA Astrophysics Data System (ADS)

    Randall, Teresa

    2011-12-01

    This study was conducted at the Oklahoma City Zoo in fall 2010; subjects were students ages 14-18 who either participated in a formal conservation education class led by zoo educators or in a field trip in which they were engaged in free-choice learning. The two research questions were: 1) does a trip to the zoo affect conservation attitudes? and 2) does the learning experience, free-choice or formal, affect conservation attitudes? A criterion group design was used, and the instrument used to measure conservation attitudes was Tool 4 from the Visitor Evaluation Toolbox produced by the Association of Zoos and Aquariums MIRP study (Falk, J., Bronnenkant, K., Vernon, C., & Heimlich, J., 2009). Group one (N=110) engaged in a free-choice (field trip only) experience and group two (N=367) engaged in a formal conservation education class. The survey was administered retrospectively to both groups upon completion of their learning experience at the zoo. Statistical analysis was conducted using SPSS 17.0. A paired-sample t-test showed the overall mean within both groups increased in a positive direction, from 67.965 (retrospective) to 72.345 (present). With alpha set at .05, the two-tailed probability was <0.001, confirming that the change in conservation attitudes was significant. An independent-sample t-test of the change in scores between the groups produced p values of 0.792 and 0.773, revealing that the difference between groups was not significant. The findings illustrate that a trip to the zoo positively and significantly affected conservation attitudes among teens and that the type of learning experience did not significantly affect the change in conservation attitude scores.

  18. Hunting Solomonoff's Swans: Exploring the Boundary Between Physics and Statistics in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2014-12-01

    Statistical models consistently out-perform conceptual models in the short term; however, to account for a nonstationary future (or an unobserved past), scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that, given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is about what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue of Solomonoff's idea (Solomonoff's theorem was digital) that allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any physics approximation(s) and available observations. Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.

  19. Performances on the CogState and Standard Neuropsychological Batteries Among HIV Patients Without Dementia

    PubMed Central

    Overton, Edgar Turner; Kauwe, John S.K.; Paul, Rob; Tashima, Karen; Tate, David F.; Patel, Pragna; Carpenter, Chuck; Patty, David; Brooks, John T.; Clifford, David B

    2013-01-01

    HIV-associated neurocognitive disorders (HAND) remain prevalent but challenging to diagnose particularly among non-demented individuals. To determine whether a brief computerized battery correlates with formal neurocognitive testing, we identified 46 HIV-infected persons who had undergone both formal neurocognitive testing and a brief computerized battery. Simple detection tests correlated best with formal neuropsychological testing. By multivariable regression model, 53% of the variance in the composite Global Deficit Score was accounted for by elements from the brief computerized tool (p<0.01). These data confirm previous correlation data with the computerized battery, yet illustrate remaining challenges for neurocognitive screening. PMID:21877204

  20. Testing for voter rigging in small polling stations

    PubMed Central

    Jimenez, Raúl; Hidalgo, Manuel; Klimek, Peter

    2017-01-01

    Nowadays, a large number of countries combine formal democratic institutions with authoritarian practices. Although in these countries the ruling elites may receive considerable voter support, they often use several manipulation tools to control election outcomes. A common practice of these regimes is the coercion and mobilization of large numbers of voters. This electoral irregularity is known as voter rigging, distinguishing it from vote rigging, which involves ballot stuffing or stealing. We develop a statistical test to quantify the extent to which the results of a particular election display traces of voter rigging. Our key hypothesis is that small polling stations are more susceptible to voter rigging because it is easier to identify opposing individuals, there are fewer eyewitnesses, and interested parties might reasonably expect fewer visits from election observers. We devise a general statistical method for testing whether voting behavior in small polling stations is significantly different from the behavior in their neighbor stations in a way that is consistent with the widespread occurrence of voter rigging. On the basis of a comparative analysis, the method enables third parties to conclude that an explanation other than simple variability is needed to explain geographic heterogeneities in vote preferences. We analyze 21 elections in 10 countries and find significant statistical anomalies compatible with voter rigging in Russia from 2007 to 2011, in Venezuela from 2006 to 2013, and in Uganda in 2011. Particularly disturbing is the case of Venezuela, where the smallest polling stations were decisive to the outcome of the 2013 presidential elections. PMID:28695193

  1. Testing for voter rigging in small polling stations.

    PubMed

    Jimenez, Raúl; Hidalgo, Manuel; Klimek, Peter

    2017-06-01

    Nowadays, a large number of countries combine formal democratic institutions with authoritarian practices. Although in these countries the ruling elites may receive considerable voter support, they often use several manipulation tools to control election outcomes. A common practice of these regimes is the coercion and mobilization of large numbers of voters. This electoral irregularity is known as voter rigging, distinguishing it from vote rigging, which involves ballot stuffing or stealing. We develop a statistical test to quantify the extent to which the results of a particular election display traces of voter rigging. Our key hypothesis is that small polling stations are more susceptible to voter rigging because it is easier to identify opposing individuals, there are fewer eyewitnesses, and interested parties might reasonably expect fewer visits from election observers. We devise a general statistical method for testing whether voting behavior in small polling stations is significantly different from the behavior in their neighbor stations in a way that is consistent with the widespread occurrence of voter rigging. On the basis of a comparative analysis, the method enables third parties to conclude that an explanation other than simple variability is needed to explain geographic heterogeneities in vote preferences. We analyze 21 elections in 10 countries and find significant statistical anomalies compatible with voter rigging in Russia from 2007 to 2011, in Venezuela from 2006 to 2013, and in Uganda in 2011. Particularly disturbing is the case of Venezuela, where the smallest polling stations were decisive to the outcome of the 2013 presidential elections.
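
    The key comparative idea can be sketched in a few lines (this is a hedged stand-in, not the authors' published statistic, and the vote shares are fabricated): pair each small polling station with a neighboring station and ask, via a sign-flip permutation test, whether the winner's share is systematically higher in the small ones than simple variability would allow.

      # Sign-flip permutation test on paired small-station/neighbor differences;
      # a hedged sketch of the comparative analysis, using synthetic data.
      import numpy as np

      rng = np.random.default_rng(5)
      small = rng.normal(0.62, 0.08, 200)        # winner's share, small stations
      neighbor = rng.normal(0.55, 0.08, 200)     # winner's share, paired neighbors
      diff = small - neighbor

      obs = diff.mean()
      n_perm = 10_000
      flips = rng.choice([-1.0, 1.0], size=(n_perm, diff.size))
      null = (flips * diff).mean(axis=1)         # null: differences symmetric about 0
      p = (np.sum(null >= obs) + 1) / (n_perm + 1)
      print(f"mean excess share in small stations: {obs:.3f}, one-sided p = {p:.4f}")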

  2. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near-miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety-critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might be used to support mishap analysis.

  3. Bedside Ultrasound in the Emergency Department to Detect Hydronephrosis for the Evaluation of Suspected Ureteric Colic.

    PubMed

    Shrestha, R; Shakya, R M; Khan A, A

    2016-01-01

    Background Renal colic is a common emergency department presentation. Hydronephrosis is an indirect sign of urinary obstruction, which may be due to an obstructing ureteric calculus, and can be detected easily by bedside ultrasound with minimal training. Objective To compare the accuracy of detection of hydronephrosis performed by the emergency physician with that of the radiologist in suspected renal colic cases. Method This was a prospective observational study performed over a period of 6 months. Patients >8 years old with a provisional diagnosis of renal colic who underwent both bedside ultrasound and formal ultrasound were included. The presence of hydronephrosis on both ultrasounds, and the size and location of any ureteric stone on formal ultrasound, were recorded. The accuracy of the emergency physician's detection of hydronephrosis was determined using the scan reported by the radiologists as the "gold standard", as computed tomography was unavailable. Statistical analysis was executed using SPSS 17.0. Result Among the 111 included patients, 56.7% had a ureteric stone detected on formal ultrasound. The overall sensitivity, specificity, positive predictive value and negative predictive value of bedside ultrasound performed by the emergency physician for detection of hydronephrosis, against formal ultrasound performed by the radiologist, were 90.8%, 78.3%, 85.5% and 85.7%, respectively. Both bedside and formal ultrasound detected hydronephrosis more often in patients with larger stones, and the difference was statistically significant (p < 0.001). Conclusion Bedside ultrasound can potentially be used as an important tool in detecting clinically significant hydronephrosis in the emergency department to evaluate suspected ureteric colic. Focused training in ultrasound could greatly improve the emergency management of these patients.
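
    All four reported accuracy measures follow directly from the 2x2 table of bedside findings against the radiologist's gold standard. The counts below are not taken from the paper; they were chosen only so that the function reproduces the reported percentages on a 111-patient table.

      # 2x2 diagnostic accuracy; the counts are illustrative values consistent
      # with the reported 90.8%/78.3%/85.5%/85.7% on n = 111, not study data.
      def diagnostic_accuracy(tp, fp, fn, tn):
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "PPV": tp / (tp + fp),
              "NPV": tn / (tn + fn),
          }

      print(diagnostic_accuracy(tp=59, fp=10, fn=6, tn=36))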

  4. Impact of cleaning and other interventions on the reduction of hospital-acquired Clostridium difficile infections in two hospitals in England assessed using a breakpoint model.

    PubMed

    Hughes, G J; Nickerson, E; Enoch, D A; Ahluwalia, J; Wilkinson, C; Ayers, R; Brown, N M

    2013-07-01

    Clostridium difficile infection remains a major challenge for hospitals. Although targeted infection control initiatives have been shown to be effective in reducing the incidence of hospital-acquired C. difficile infection, there is little evidence available to assess the effectiveness of specific interventions. To use statistical modelling to detect substantial reductions in the incidence of C. difficile from time series data from two hospitals in England, and relate these time points to infection control interventions. A statistical breakpoints model was fitted to likely hospital-acquired C. difficile infection incidence data from a teaching hospital (2002-2009) and a district general hospital (2005-2009) in England. Models with increasing complexity (i.e. increasing the number of breakpoints) were tested for an improved fit to the data. Partitions estimated from breakpoint models were tested for individual stability using statistical process control charts. Major infection control interventions from both hospitals during this time were grouped according to their primary target (antibiotics, cleaning, isolation, other) and mapped to the model-suggested breakpoints. For both hospitals, breakpoints coincided with enhancements to cleaning protocols. Statistical models enabled formal assessment of the impact of different interventions, and showed that enhancements to deep cleaning programmes are the interventions that have most likely led to substantial reductions in hospital-acquired C. difficile infections at the two hospitals studied. Copyright © 2013 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
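
    A single-breakpoint version of such a model is easy to sketch (the study fitted models with several breakpoints and tested whether added breakpoints improved the fit; the monthly counts below are simulated): scan the candidate breakpoints and keep the split that maximizes the likelihood of two constant-rate Poisson segments.

      # One-breakpoint Poisson model by likelihood scan; a minimal sketch of
      # the breakpoint idea, with simulated incidence counts.
      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(6)
      counts = np.concatenate([rng.poisson(9, 40), rng.poisson(4, 40)])

      def seg_loglik(segment):
          lam = max(segment.mean(), 1e-9)          # MLE rate for the segment
          return poisson.logpmf(segment, lam).sum()

      best = max(range(5, len(counts) - 5),
                 key=lambda b: seg_loglik(counts[:b]) + seg_loglik(counts[b:]))
      print("estimated breakpoint at month", best)

    In practice the resulting segments can then be checked for internal stability with control charts, as the authors describe, and the breakpoints mapped against the dates of infection control interventions.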

  5. Multivariate assessment of event-related potentials with the t-CWT method.

    PubMed

    Bostanov, Vladimir

    2015-11-05

    Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
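
    A rough, self-contained caricature of the pipeline (not the published t-CWT implementation, whose MATLAB/GNU Octave code the article releases) is: wavelet-transform each single-trial ERP, t-test every (scale, time) coefficient between conditions, and keep the coefficients where |t| is extreme as features.

      # Caricature of the t-CWT feature-extraction idea with a hand-rolled
      # Mexican-hat CWT; synthetic trials, not the article's example data.
      import numpy as np
      from scipy import signal, stats

      def ricker(points, a):
          t = np.arange(points) - (points - 1) / 2.0
          return (2 / (np.sqrt(3 * a) * np.pi ** 0.25)
                  * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2))

      def cwt(x, widths, points=101):
          return np.array([signal.fftconvolve(x, ricker(points, w), mode="same")
                           for w in widths])

      rng = np.random.default_rng(7)
      n_trials, n_samples = 40, 256
      cond_a = rng.standard_normal((n_trials, n_samples))
      cond_b = rng.standard_normal((n_trials, n_samples))
      cond_b[:, 100:140] += 0.8                        # injected ERP difference

      widths = [4, 8, 16, 32]
      cwt_a = np.array([cwt(x, widths) for x in cond_a])
      cwt_b = np.array([cwt(x, widths) for x in cond_b])
      t_map, _ = stats.ttest_ind(cwt_a, cwt_b, axis=0)   # t per (scale, time)
      s, t = np.unravel_index(np.abs(t_map).argmax(), t_map.shape)
      print(f"strongest difference at scale index {s}, sample {t}")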

  6. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  7. A round-robin gamma stereotactic radiosurgery dosimetry interinstitution comparison of calibration protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drzymala, R. E., E-mail: drzymala@wustl.edu; Alvarez, P. E.; Bednarz, G.

    2015-11-15

    Purpose: Absorbed dose calibration for gamma stereotactic radiosurgery is challenging due to the unique geometric conditions, dosimetry characteristics, and nonstandard field size of these devices. Members of the American Association of Physicists in Medicine (AAPM) Task Group 178 on Gamma Stereotactic Radiosurgery Dosimetry and Quality Assurance have participated in a round-robin exchange of calibrated measurement instrumentation and phantoms exploring two approved and two proposed calibration protocols or formalisms on ten gamma radiosurgery units. The objectives of this study were to benchmark and compare new formalisms to existing calibration methods, while maintaining traceability to U.S. primary dosimetry calibration laboratory standards. Methods: Nine institutions made measurements using ten gamma stereotactic radiosurgery units in three different 160 mm diameter spherical phantoms [acrylonitrile butadiene styrene (ABS) plastic, Solid Water, and liquid water] and in air using a positioning jig. Two calibrated miniature ionization chambers and one calibrated electrometer were circulated for all measurements. Reference dose-rates at the phantom center were determined using the well-established AAPM TG-21 or TG-51 dose calibration protocols and using two proposed dose calibration protocols/formalisms: an in-air protocol and a formalism proposed by the International Atomic Energy Agency (IAEA) working group for small and nonstandard radiation fields. Each institution's results were normalized to the dose-rate determined at that institution using the TG-21 protocol in the ABS phantom. Results: Percentages of dose-rates within 1.5% of the reference dose-rate (TG-21 + ABS phantom) for the eight chamber-protocol-phantom combinations were the following: 88% for TG-21, 70% for TG-51, 93% for the new IAEA nonstandard-field formalism, and 65% for the new in-air protocol. Averages and standard deviations for dose-rates over all measurements relative to the TG-21 + ABS dose-rate were 0.999 ± 0.009 (TG-21), 0.991 ± 0.013 (TG-51), 1.000 ± 0.009 (IAEA), and 1.009 ± 0.012 (in-air). There were no statistically significant differences (i.e., p > 0.05) between the two ionization chambers for the TG-21 protocol applied to all dosimetry phantoms. The mean results using the TG-51 protocol were notably lower than those for the other dosimetry protocols, with a standard deviation 2–3 times larger. The in-air protocol was not statistically different from TG-21 for the A16 chamber in the liquid water or ABS phantoms (p = 0.300 and p = 0.135) but was statistically different from TG-21 for the PTW chamber in all phantoms (p = 0.006 for Solid Water, 0.014 for liquid water, and 0.020 for ABS). Results of the IAEA formalism were statistically different from TG-21 results only for the combination of the A16 chamber with the liquid water phantom (p = 0.017). In the latter case, dose-rates measured with the two protocols differed by only 0.4%. For other phantom-ionization-chamber combinations, the new IAEA formalism was not statistically different from TG-21. Conclusions: Although further investigation is needed to validate the new protocols for other ionization chambers, these results can serve as a reference to quantitatively compare different calibration protocols and ionization chambers if a particular method is chosen by a professional society to serve as a standardized calibration protocol.

  8. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey

    PubMed Central

    Vasconcelos, Hemerson Bruno da Silva; Woods, David John

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. Methods: A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Results: Pharmacists had 1–4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). Conclusion: These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools. PMID:29272292

  9. Knowledge, skills and attitudes of hospital pharmacists in the use of information technology and electronic tools to support clinical practice: A Brazilian survey.

    PubMed

    Néri, Eugenie Desirèe Rabelo; Meira, Assuero Silva; Vasconcelos, Hemerson Bruno da Silva; Woods, David John; Fonteles, Marta Maria de França

    2017-01-01

    This study aimed to identify the knowledge, skills and attitudes of Brazilian hospital pharmacists in the use of information technology and electronic tools to support clinical practice. A questionnaire was sent by email to clinical pharmacists working in public and private hospitals in Brazil. The instrument was validated using the method of Polit and Beck to determine the content validity index. Data (n = 348) were analyzed using descriptive statistics, Pearson's Chi-square test and Gamma correlation tests. Pharmacists had 1-4 electronic devices for personal use, mainly smartphones (84.8%; n = 295) and laptops (81.6%; n = 284). At work, pharmacists had access to a computer (89.4%; n = 311), mostly connected to the internet (83.9%; n = 292). They felt competent (very capable/capable) searching for a web page/web site on a specific subject (100%; n = 348), downloading files (99.7%; n = 347), using spreadsheets (90.2%; n = 314), searching using MeSH terms in PubMed (97.4%; n = 339) and general searching for articles in bibliographic databases (such as Medline/PubMed: 93.4%; n = 325). Pharmacists did not feel competent in using statistical analysis software (somewhat capable/incapable: 78.4%; n = 273). Most pharmacists reported that they had not received formal education to perform most of these actions except searching using MeSH terms. Access to bibliographic databases was available in Brazilian hospitals; however, most pharmacists (78.7%; n = 274) reported daily use of a non-specific search engine such as Google. This result may reflect the lack of formal knowledge and training in the use of bibliographic databases and difficulty with the English language. The need to expand knowledge about information search tools was recognized by most pharmacists in clinical practice in Brazil, especially those with less time dedicated exclusively to clinical activity (Chi-square, p = 0.006). These results will assist in defining minimal competencies for the training of pharmacists in the field of information technology to support clinical practice. Knowledge and skill gaps are evident in the use of bibliographic databases, spreadsheets and statistical tools.

  10. Geospatial and Remote Sensing-based Indicators of Settlement Type: Differentiating Informal and Formal Settlements in Guatemala City

    NASA Astrophysics Data System (ADS)

    Owen, Karen K.

    This research addresses the need for reliable, repeatable, quantitative measures to differentiate informal (slum) from formal (planned) settlements using commercial very high resolution imagery and elevation data. Measuring the physical, spatial and spectral qualities of informal settlements is an important precursor for evaluating success toward improving the lives of 100 million slum dwellers worldwide, as pledged by the United Nations Millennium Development Goal Target 7D. A variety of measures were tested based on surface material spectral properties, texture, built-up structure, road network accessibility, and geomorphology from twelve communities in Guatemala City to reveal statistically significant differences between informal and formal settlements that could be applied to other parts of the world without the need for costly or dangerous field surveys. When information from satellite imagery is constrained to roads and residential boundaries, a more precise understanding of human habitation is produced. A classification and regression tree (CART) approach and linear discriminant function analysis enabled a dimensionality reduction from the original 23 variables to the 6 that are sufficient to classify a settlement as informal or formal. The results demonstrate that the entropy texture of roads, the degree of asphalt road surface, the vegetation patch compactness and patch size, the percent of bare soil land cover, the geomorphic profile convexity of the terrain, and the road density distinguish informal from formal settlements with 87–92% accuracy when results are cross-validated. The variables with the highest contribution to model outcome that are common to both approaches are entropy texture of roads, vegetation patch size, and vegetation compactness, suggesting that road texture, surface materials and vegetation provide the characteristics necessary to distinguish the level of informality of a settlement. The results will assist urban planners and settlement analysts who must process vast amounts of imagery worldwide, enabling them to report annually on slum conditions. An added benefit is the ability to use the measures in data-poor regions of the world without field surveys.
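
    A minimal sketch of the CART step described above, assuming a table of the 23 candidate measures per settlement and an informal/formal label; scikit-learn's DecisionTreeClassifier stands in for the authors' CART implementation, and the random data and feature indices are placeholders.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((120, 23))    # 23 candidate measures per settlement sample
        y = rng.integers(0, 2, 120)  # 0 = formal, 1 = informal (illustrative labels)

        tree = DecisionTreeClassifier(max_depth=4, random_state=0)
        print(cross_val_score(tree, X, y, cv=5).mean())  # cross-validated accuracy

        # Rough analogue of the 23-to-6 variable reduction: keep the six most
        # important features of the fitted tree.
        tree.fit(X, y)
        print(np.argsort(tree.feature_importances_)[-6:])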

  11. How Framing Statistical Statements Affects Subjective Veracity: Validation and Application of a Multinomial Model for Judgments of Truth

    ERIC Educational Resources Information Center

    Hilbig, Benjamin E.

    2012-01-01

    Extending the well-established negativity bias in human cognition to truth judgments, it was recently shown that negatively framed statistical statements are more likely to be considered true than formally equivalent statements framed positively. However, the underlying processes responsible for this effect are insufficiently understood.…

  12. Statistics of primordial density perturbations from discrete seed masses

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.; Bertschinger, Edmund

    1991-01-01

    The statistics of density perturbations for general distributions of seed masses with arbitrary matter accretion is examined. Formal expressions for the power spectrum, the N-point correlation functions, and the density distribution function are derived. These results are applied to the case of uncorrelated seed masses, and power spectra are derived for accretion of both hot and cold dark matter plus baryons. The reduced moments (cumulants) of the density distribution are computed and used to obtain a series expansion for the density distribution function. Analytic results are obtained for the density distribution function in the case of a distribution of seed masses with a spherical top-hat accretion pattern. More generally, the formalism makes it possible to give a complete characterization of the statistical properties of any random field generated from a discrete linear superposition of kernels. In particular, the results can be applied to density fields derived by smoothing a discrete set of points with a window function.
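
    As a hedged sketch of the formalism (the notation here is assumed, not taken from the paper), a density field built from a discrete linear superposition of kernels can be written as

        \delta(\mathbf{x}) = \sum_i f(\mathbf{x} - \mathbf{x}_i;\, m_i),

    so that for uncorrelated (Poisson-distributed) seeds of number density n the power spectrum takes the shot-noise form P(k) = n \langle |\tilde f(k)|^2 \rangle, and the N-point functions and cumulants follow from the higher moments of the superposition.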

  13. Superstatistics with different kinds of distributions in the deformed formalism

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-03-01

    In this article, after first introducing superstatistics, the effective Boltzmann factor in a deformed formalism is derived for modified Dirac delta, uniform, two-level and Gamma distributions. We then apply superstatistics to four important problems in physics and calculate the thermodynamic properties of the systems. In the limiting case, all results reduce to those of ordinary statistical mechanics. Furthermore, the effects of all parameters in the problems are calculated and shown graphically.
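
    For orientation, the standard superstatistics construction (conventional notation, not necessarily the paper's deformed version) replaces the Boltzmann factor by an average over a fluctuating inverse temperature \beta with distribution f(\beta):

        B(E) = \int_0^\infty f(\beta)\, e^{-\beta E}\, d\beta,

    which reduces to the ordinary factor e^{-\beta_0 E} when f(\beta) = \delta(\beta - \beta_0); the modified Dirac delta, uniform, two-level and Gamma cases in the abstract correspond to different choices of f(\beta).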

  14. Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism

    NASA Astrophysics Data System (ADS)

    Parish, Eric; Duraisamy, Karthik

    2017-11-01

    The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closure models within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project "LES Modeling of Non-local effects using Statistical Coarse-graining" with Dr. Jean-Luc Cambier as the technical monitor.
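
    As a hedged reminder of the structure the abstract refers to (standard M-Z notation, not the authors'), projecting the dynamics onto the resolved variables \varphi yields a generalized Langevin equation

        \frac{d\varphi}{dt} = M(\varphi) + \int_0^t K(t - s,\, \varphi(s))\, ds + F(t),

    where M is the Markovian resolved-scale term, the convolution with the memory kernel K carries the non-local effect of the unresolved scales, and F is the orthogonal-dynamics term; closure modeling amounts to approximating the memory integral.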

  15. Probability density function formalism for optical coherence tomography signal analysis: a controlled phantom study.

    PubMed

    Weatherbee, Andrew; Sugita, Mitsuro; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex

    2016-06-15

    The distribution of backscattered intensities as described by the probability density function (PDF) of tissue-scattered light contains information that may be useful for tissue assessment and diagnosis, including characterization of its pathology. In this Letter, we examine the PDF description of the light scattering statistics in a well-characterized tissue-like particulate medium using optical coherence tomography (OCT). It is shown that for low scatterer density, the governing statistics depart considerably from a Gaussian description and follow the K distribution for both OCT amplitude and intensity. The PDF formalism is shown to be independent of the scatterer flow conditions; this is expected from theory, and suggests robustness and motion independence of the OCT amplitude (and OCT intensity) PDF metrics in the context of potential biomedical applications.
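
    A minimal numerical sketch of K-distributed statistics, using the standard compound representation (gamma-modulated exponential intensity); this illustrates the distribution itself, not the authors' phantom analysis, and the parameter value is arbitrary.

        import numpy as np

        rng = np.random.default_rng(1)
        alpha = 1.5  # shape parameter; small alpha = low effective scatterer density

        # Compound model: mean intensity s ~ Gamma(alpha) with unit mean,
        # intensity I | s ~ Exponential(s); I is then K-distributed.
        s = rng.gamma(alpha, 1.0 / alpha, size=200_000)
        intensity = rng.exponential(s)
        amplitude = np.sqrt(intensity)  # K-distributed amplitude

        # Intensity contrast: 1 for Gaussian speckle, 1 + 2/alpha for K statistics.
        print(intensity.var() / intensity.mean() ** 2, 1 + 2 / alpha)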

  16. Location priority for non-formal early childhood education school based on promethee method and map visualization

    NASA Astrophysics Data System (ADS)

    Ayu Nurul Handayani, Hemas; Waspada, Indra

    2018-05-01

    Non-formal Early Childhood Education (non-formal ECE) is education provided for children under 4 years old. In the District of Banyumas, non-formal ECE is monitored by the District Government of Banyumas with the support of Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government has a program to extend ECE to all villages in Indonesia; however, the locations where ECE schools should be constructed in the coming years have not yet been determined. To support that program, a decision support system was developed to recommend villages for constructing ECE buildings. The data are projected with Brown's Double Exponential Smoothing method, and the Preference Ranking Organization Method for Enrichment Evaluation (Promethee) is used to generate the priority order. The system produces a map visualization colored according to the priority level of each sub-district and village area. The system was tested with black-box testing, Promethee testing, and usability testing. The results showed that the system functionality and the Promethee algorithm worked properly and that users were satisfied.
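
    A minimal sketch of the projection step, Brown's double exponential smoothing, assuming a yearly series of under-4 child counts for one village; the smoothing constant and the numbers are illustrative.

        def brown_des_forecast(series, alpha=0.4, horizon=3):
            """Brown's double exponential smoothing; forecast `horizon` steps ahead."""
            s1 = s2 = series[0]                     # initialize both smoothing stages
            for x in series[1:]:
                s1 = alpha * x + (1 - alpha) * s1   # first smoothing
                s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing
            level = 2 * s1 - s2
            trend = alpha / (1 - alpha) * (s1 - s2)
            return [level + trend * m for m in range(1, horizon + 1)]

        # Illustrative yearly counts for one village.
        print(brown_des_forecast([120, 132, 141, 155, 160]))

    The Promethee step would then rank villages by pairwise preference flows over criteria such as these projected counts.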

  17. AAC intervention using a VOCA for deaf children with multiple disabilities who received cochlear implantation.

    PubMed

    Lee, Youngmee; Jeong, Sung-Wook; Kim, Lee-Suk

    2013-12-01

    The aim of this study was to examine the efficacy of a new habilitation approach, augmentative and alternative communication (AAC) intervention using a voice output communication aid (VOCA), in improving speech perception, speech production, receptive vocabulary skills, and communicative behaviors in children with cochlear implants (CIs) who had multiple disabilities. Five children with mental retardation and/or cerebral palsy who had used CIs for over two years were included in this study. Five children in the control group were matched to the children who received the AAC intervention on the basis of the type/severity of their additional disabilities and chronological age. They had limited oral communication skills after cochlear implantation because of their limited cognition and oromotor function. The children attended the AAC intervention with their parents once a week for 6 months. We evaluated their performance using formal tests, including the monosyllabic word tests, the articulation test, and the receptive vocabulary test. We also assessed parent-child interactions. We analyzed the data using a one-group pretest and posttest design. The mean scores of the formal tests improved from 26% to 48% in the phoneme scores of the monosyllabic word tests, from 17% to 35% in the articulation test, and from 11 to 18.4 in the receptive vocabulary test after AAC intervention (all p < .05). Some children in the control group showed improvement in the speech perception, speech production, and receptive vocabulary tests over the 6 months, but the differences did not achieve statistical significance (all p > .05). The frequency of spontaneous communicative behaviors (i.e., vocalization, gestures, and words) and imitative words significantly increased after AAC intervention (p < .05). AAC intervention using a VOCA was very useful and effective in improving communicative skills in children with multiple disabilities who had very limited oral communication skills after cochlear implantation. Copyright © 2013. Published by Elsevier Ireland Ltd.

  18. [EVALUATION OF THE EFFECTIVENESS OF ADDITIONAL PROFESSIONAL EDUCATION ON THE BASIS OF HEALTH CARE FACILITY].

    PubMed

    Bohomaz, V M; Rymarenko, P V

    2014-01-01

    In this study we tested methods of facility-based training of health care workers as part of a modern model of quality management of medical services. We carried out a statistical and qualitative analysis of the effectiveness of additional training in emergency medical care at the health facility, using an adapted curriculum and special mannequins. Under the guidance of a certified instructor, a focus group of 53 physicians and junior medical specialists studied for 22 hours. According to a survey of the trained employees, their self-assessed level of knowledge and skills significantly increased. The proportion of correct answers in formalized testing also increased significantly for both categories of workers. Using an andragogical learning model, mannequin simulators, and small-group training at the workplace creates the most favorable conditions for the effective development of individual and group practical skills in emergency medicine.

  19. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  20. The Effect of Instruction on the Acquisition of Conservation of Volume.

    ERIC Educational Resources Information Center

    Butts, David P.; Howe, Ann C.

    Tested was the hypothesis that science instruction based on task analysis will lead to the acquisition of the ability to perform certain Piaget volume tasks which have been characterized as requiring formal operations for their solutions. A Test on Formal Operations and a Learning Hierarchies Test were given to fourth- and sixth-grade students in…

  1. Tsallis and Kaniadakis statistics from a point of view of the holographic equipartition law

    NASA Astrophysics Data System (ADS)

    Abreu, Everton M. C.; Ananias Neto, Jorge; Mendes, Albert C. R.; Bonilla, Alexander

    2018-02-01

    In this work, we have illustrated the difference between the Tsallis and Kaniadakis entropies through cosmological models obtained from the formalism proposed by Padmanabhan, known as the holographic equipartition law. Similarly to the formalism proposed by Komatsu, we have obtained an extra constant driving term in the Friedmann equation if we deform the Tsallis entropy by Kaniadakis' formalism. We initially considered the Tsallis entropy as the black-hole (BH) area entropy. This constant term may lead the universe to an accelerated or decelerated mode. On the other hand, if we start with the Kaniadakis entropy as the BH area entropy and modify the κ expression by Tsallis' formalism, the same absolute value but with the opposite sign is obtained. In the opposite limit, neither deformation yields a driving inflation term for the early universe.
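
    For reference (standard definitions, not specific to this paper), the two entropies being contrasted are

        S_q = k_B\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad
        S_\kappa = -k_B \sum_i p_i \ln_\kappa p_i, \quad
        \ln_\kappa x = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa},

    both of which recover the Boltzmann-Gibbs entropy in the limits q -> 1 and \kappa -> 0.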

  2. A new and trustworthy formalism to compute entropy in quantum systems

    NASA Astrophysics Data System (ADS)

    Ansari, Mohammad

    Entropy is nonlinear in the density matrix, and as such its evaluation in open quantum systems has not been fully understood. Recently a quantum formalism was proposed by Ansari and Nazarov that evaluates entropy using parallel time evolutions of multiple worlds. We can use this formalism to evaluate entropy flow in photovoltaic cells coupled to thermal reservoirs and cavity modes. Recently we studied the full counting statistics of energy transfers in such systems. This rigorously proves a nontrivial correspondence between energy exchanges and entropy changes in quantum systems, which can be simplified to the textbook second law of thermodynamics only in systems without entanglement. We evaluate the flow of entropy using this formalism. In the presence of entanglement, however, interestingly much less information is exchanged than expected. This increases the upper limit capacity for information transfer and its conversion to energy for next-generation devices in mesoscopic physics.

  3. Cost implications of organizing nursing home workforce in teams.

    PubMed

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-08-01

    To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. A retrospective statistical analysis: we estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Formal and self-managed team penetration (i.e., the percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined with increasing formal-team penetration up to 13 percent and increased above that level. Formal teams in nursing homes in the upward-sloping range of the curve were more diverse, with a larger number of participating disciplines, and were more likely to include physicians. Organization of the workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs.
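
    A hedged sketch of a hybrid cost function with a quadratic formal-team-penetration term, consistent with the decline-then-rise around 13 percent reported above; the formula, column names and data are hypothetical, and statsmodels' HC1 covariance stands in for the paper's robust standard errors.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 135  # facilities, as in the study sample
        df = pd.DataFrame({
            "cost": rng.normal(10.0, 1.0, n),           # annual variable costs (illustrative)
            "formal_pen": rng.uniform(0, 0.4, n),       # share of staff in formal teams
            "selfmanaged_pen": rng.uniform(0, 0.4, n),  # share in self-managed teams
            "casemix_days": rng.normal(50.0, 5.0, n),
        })

        model = smf.ols(
            "cost ~ formal_pen + I(formal_pen**2) + selfmanaged_pen + casemix_days",
            data=df,
        ).fit(cov_type="HC1")  # inference based on robust standard errors
        print(model.summary())

    A negative coefficient on formal_pen with a positive coefficient on its square would reproduce the U-shaped cost curve described in the abstract.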

  4. "Of Course I'm Communicating; I Lecture Every Day": Enhancing Teaching and Learning in Introductory Statistics. Scholarship of Teaching and Learning

    ERIC Educational Resources Information Center

    Wulff, Shaun S.; Wulff, Donald H.

    2004-01-01

    This article focuses on one instructor's evolution from formal lecturing to interactive teaching and learning in a statistics course. Student perception data are used to demonstrate the instructor's use of communication to align the content, students, and instructor throughout the course. Results indicate that the students learned, that…

  5. Impact of formal training on agreement of videofluoroscopic swallowing study interpretation across and within disciplines.

    PubMed

    Silbergleit, Alice K; Cook, Diana; Kienzle, Scott; Boettcher, Erica; Myers, Daniel; Collins, Denise; Peterson, Edward; Silbergleit, Matthew A; Silbergleit, Richard

    2018-04-04

    Formal agreement studies on interpretation of the videofluoroscopic swallowing study (VFSS) procedure among speech-language pathologists, radiology house officers, and staff radiologists have not been pursued. Each of these professions participates in the procedure, interprets the examination, and writes a separate report on the findings. The aim of this study was to determine the reliability of interpretation between and within the disciplines and to determine whether structured training improved reliability. Thirteen speech-language pathologists (SLPs), ten diagnostic radiologists (RADs) and twenty-one diagnostic radiology house officers (HOs) participated in this study. Each group viewed 24 VFSS samples and rated the presence or absence of seven aberrant swallowing features as well as the presence of dysphagia and identification of oral dysphagia, pharyngeal dysphagia, or both. During part two, the groups were provided with a training session on normal and abnormal swallowing, using different VFSS samples from those in part one, followed by re-rating of the original 24 VFSS samples. A generalized estimating equations (GEE) approach with a binomial link function was used to examine each question separately. For each cluster of tests (for example, all pairwise comparisons between the three groups in the pretraining period), a Hochberg correction for multiple testing was used to determine significance. A GEE approach with a binomial link function was used to compare the premeasure to the postmeasure for each of the three groups of raters, stratified by experience. The primary result revealed that the HO group scored significantly lower than the SLP and RAD groups on identification of the presence of dysphagia (p = 0.008; p = 0.001, respectively), identification of oral phase dysphagia (p = 0.003; p = 0.001, respectively), and identification of both oral and pharyngeal phase dysphagia (p = 0.014; p = 0.001, respectively) pretraining. Post-training, there was no statistically significant difference between the three groups on identification of dysphagia and identification of combined oral and pharyngeal dysphagia. Formal training to identify oropharyngeal dysphagia characteristics appears to improve the accuracy of interpretation of the VFSS procedure for radiology house officers. Consideration of including formal training in this area in radiology residency training programs is recommended.
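
    A minimal sketch of the GEE-with-binomial-link analysis described above, clustering repeated ratings within raters; the column names are hypothetical and the data are simulated, not the study's ratings.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n_raters, n_clips = 44, 24  # 13 SLPs + 10 RADs + 21 HOs, 24 VFSS samples
        df = pd.DataFrame({
            "rater_id": np.repeat(np.arange(n_raters), n_clips),
            "group": np.repeat(rng.choice(["SLP", "RAD", "HO"], n_raters), n_clips),
            "correct": rng.integers(0, 2, n_raters * n_clips),  # 1 = feature identified
        })

        # Binomial-family GEE; repeated ratings are clustered by rater.
        model = sm.GEE.from_formula("correct ~ C(group)", groups="rater_id",
                                    data=df, family=sm.families.Binomial())
        print(model.fit().summary())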

  6. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  7. A formal protocol test procedure for the Survivable Adaptable Fiber Optic Embedded Network (SAFENET)

    NASA Astrophysics Data System (ADS)

    High, Wayne

    1993-03-01

    This thesis focuses upon a new method for verifying the correct operation of a complex, high speed fiber optic communication network. These networks are of growing importance to the military because of their increased connectivity, survivability, and reconfigurability. With the introduction and increased dependence on sophisticated software and protocols, it is essential that their operation be correct. Because of the speed and complexity of fiber optic networks being designed today, they are becoming increasingly difficult to test. Previously, testing was accomplished by application of conformance test methods which had little connection with an implementation's specification. The major goal of conformance testing is to ensure that the implementation of a profile is consistent with its specification. Formal specification is needed to ensure that the implementation performs its intended operations while exhibiting desirable behaviors. The new conformance test method presented is based upon the System of Communicating Machine model which uses a formal protocol specification to generate a test sequence. The major contribution of this thesis is the application of the System of Communicating Machine model to formal profile specifications of the Survivable Adaptable Fiber Optic Embedded Network (SAFENET) standard which results in the derivation of test sequences for a SAFENET profile. The results applying this new method to SAFENET's OSI and Lightweight profiles are presented.

  8. Test particle propagation in magnetostatic turbulence. 2: The local approximation method

    NASA Technical Reports Server (NTRS)

    Klimas, A. J.; Sandri, G.; Scudder, J. D.; Howell, D. R.

    1976-01-01

    An approximation method for statistical mechanics is presented and applied to a class of problems which contains a test particle propagation problem. All of the available basic equations used in statistical mechanics are cast in the form of a single equation which is integrodifferential in time and which is then used as the starting point for the construction of the local approximation method. Simplification of the integrodifferential equation is achieved through approximation to the Laplace transform of its kernel. The approximation is valid near the origin in the Laplace space and is based on the assumption of a small Laplace variable. No other small parameter is necessary for the construction of this approximation method. The nth level of approximation is constructed formally, and the first five levels of approximation are calculated explicitly. It is shown that each level of approximation is governed by an inhomogeneous partial differential equation in time with time-independent operator coefficients. The order in time of these partial differential equations is found to increase as n does. At n = 0, the most local first-order partial differential equation, which governs the Markovian limit, is regained.

  9. Status of the use and compliance with malaria rapid diagnostic tests in formal private health facilities in Nigeria.

    PubMed

    Mokuolu, Olugbenga A; Ntadom, Godwin N; Ajumobi, Olufemi O; Alero, Roberts A; Wammanda, Robinson D; Adedoyin, Olanrewaju T; Okafor, Henrietta U; Alabi, Adekunle D; Odey, Friday A; Agomo, Chimere O; Edozieh, Kate U; Fagbemi, Tolulope O; Njidda, Ahmad M; Babatunde, Seye; Agbo, Emmanuel C; Nwaneri, Nnamdi B; Shekarau, Emmanuel D; Obasa, Temitope O; Ezeigwe, Nnenna M

    2016-01-04

    Nigeria has the largest number of malaria-related deaths, accounting for a third of global malaria deaths. It is important that the country attains universal coverage of key malaria interventions, one of which is the policy of universal testing before treatment, which the country has recently adopted. However, there is a dearth of data on its implementation in formal private health facilities, where close to a third of the population seek health care. This study identified the level of use of malaria rapid diagnostic testing (RDT), compliance with test results and associated challenges in formal private health facilities in Nigeria. A cross-sectional study involving a multi-stage, random sampling of 240 formal private health facilities from the country's six geo-political zones was conducted from July to August 2014. Data were collected using health facility records, healthcare workers' interviews and an exit survey of febrile patients seen at the facilities, in order to determine fever prevalence, the level of testing of febrile patients, compliance with test results, and health workers' perceptions of RDT use. Data from the 201 health facilities analysed indicated a fever prevalence of 38.5% (112,521/292,430). Of the 2077 exit interviews for febrile patients, malaria testing was ordered in 73.8% (95% CI 71.7-75.7%). Among the 1270 tested, 61.8% (719/1270) were tested with microscopy and 38.2% (445/1270) with RDT. Compliance with malaria test results [administering artemisinin-based combination therapy (ACT) to positive patients and withholding ACT from negative patients] was 80.9% (95% CI 78.7-83%). Compliance was not influenced by the age of patients or the type of malaria test. The health facilities had various cadres of health workers knowledgeable about RDT: 70% knew its meaning and 84.5% knew what it assesses. However, there was clearly a preference for microscopy, as only 20% reported performing only RDT. In formal private health facilities in Nigeria there is a high rate of malaria testing for febrile patients and a high level of compliance with test results, but a relatively low level of RDT utilization. This calls for improved engagement of the formal private health sector with a view to achieving universal coverage targets on malaria testing.

  10. Condensate statistics in interacting and ideal dilute bose gases

    PubMed

    Kocharovsky; Kocharovsky; Scully

    2000-03-13

    We obtain analytical formulas for the statistics, in particular, for the characteristic function and all cumulants, of the Bose-Einstein condensate in dilute weakly interacting and ideal equilibrium gases in the canonical ensemble via the particle-number-conserving operator formalism of Girardeau and Arnowitt. We prove that the ground-state occupation statistics is not Gaussian even in the thermodynamic limit. We calculate the effect of Bogoliubov coupling on suppression of ground-state occupation fluctuations and show that they are governed by a pair-correlation, squeezing mechanism.
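
    For reference (standard notation, not the paper's), the characteristic function of the condensate occupation n_0 and its cumulants are related by

        \Theta(u) = \langle e^{i u n_0} \rangle, \qquad
        \kappa_m = \left. \frac{d^m}{d(iu)^m} \ln \Theta(u) \right|_{u=0},

    so the non-Gaussian statistics asserted in the abstract corresponds to cumulants \kappa_m \neq 0 for m > 2 surviving the thermodynamic limit.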

  11. School Socioeconomic Compositional Effect on Shadow Education Participation: Evidence from Japan

    ERIC Educational Resources Information Center

    Matsuoka, Ryoji

    2015-01-01

    While shadow education, organized learning activities outside formal school, has grown greatly around the world, the relationship between formal schooling and shadow education has not been well investigated. This study is therefore intended to empirically test whether formal education's structure (i.e. tracking) affects students' shadow education…

  12. Conceptual Questions and Lack of Formal Reasoning: Are They Mutually Exclusive?

    ERIC Educational Resources Information Center

    Igaz, Csaba; Proksa, Miroslav

    2012-01-01

    Using specially designed conceptual question pairs, 9th grade students were tested on tasks (presented as experimental situations in pictorial form) that involved controlling the variables' scheme of formal reasoning. The question topics focused on these three chemical contexts: chemistry in everyday life, chemistry without formal concepts, and…

  13. Clinical Applicability and Cutoff Values for an Unstructured Neuropsychological Assessment Protocol for Older Adults with Low Formal Education

    PubMed Central

    de Paula, Jonas Jardim; Bertola, Laiss; Ávila, Rafaela Teixeira; Moreira, Lafaiete; Coutinho, Gabriel; de Moraes, Edgar Nunes; Bicalho, Maria Aparecida Camargos; Nicolato, Rodrigo; Diniz, Breno Satler; Malloy-Diniz, Leandro Fernandes

    2013-01-01

    Background and Objectives The neuropsychological exam plays a central role in the assessment of elderly patients with cognitive complaints. It is particularly relevant to differentiate patients with mild dementia from those with mild cognitive impairment. Formal education is a critical factor in neuropsychological performance; however, few studies have evaluated the psychometric properties, especially criterion-related validity, of neuropsychological tests for patients with low formal education. The present study aims to investigate the validity of an unstructured neuropsychological assessment protocol for this population and to develop cutoff values for clinical use. Methods and Results A protocol composed of the Rey Auditory Verbal Learning Test, Frontal Assessment Battery, Category and Letter Fluency, Stick Design Test, Clock Drawing Test, Digit Span, Token Test and TN-LIN was administered to 274 older adults (96 normal aging, 85 mild cognitive impairment and 93 mild Alzheimer's disease) with predominantly low formal education. Factor analysis showed a four-factor structure related to Executive Functions, Language/Semantic Memory, Episodic Memory and Visuospatial Abilities, accounting for 65% of the explained variance. Most of the tests showed good sensitivity and specificity for differentiating the diagnostic groups. The neuropsychological protocol showed significant ecological validity, as three of the cognitive factors explained 31% of the variance in Instrumental Activities of Daily Living. Conclusion The study presents evidence of the construct, criterion and ecological validity of this protocol. The neuropsychological tests and the proposed cutoff values may be used for the clinical assessment of older adults with low formal education. PMID:24066031
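
    A minimal sketch of how cutoff values of this kind can be derived, using the Youden index on an ROC curve; scikit-learn stands in for whatever software the authors used, and the simulated scores are only loosely modeled on the group sizes above.

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(4)
        # Simulated test scores: normal aging scores higher than mild AD.
        y = np.r_[np.zeros(96), np.ones(93)]  # 0 = normal aging, 1 = mild AD
        scores = np.r_[rng.normal(30, 5, 96), rng.normal(22, 5, 93)]

        # roc_curve treats higher values as more "positive", so negate the scores.
        fpr, tpr, thr = roc_curve(y, -scores)
        best = np.argmax(tpr - fpr)  # Youden index J = sensitivity + specificity - 1
        print(f"cutoff = {-thr[best]:.1f}, sensitivity = {tpr[best]:.2f}, "
              f"specificity = {1 - fpr[best]:.2f}")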

  14. Chemodetection in fluctuating environments: receptor coupling, buffering, and antagonism.

    PubMed

    Lalanne, Jean-Benoît; François, Paul

    2015-02-10

    Variability in the chemical composition of the extracellular environment can significantly degrade the ability of cells to detect rare cognate ligands. Using concepts from statistical detection theory, we formalize the generic problem of detection of small concentrations of ligands in a fluctuating background of biochemically similar ligands binding to the same receptors. We discover that in contrast with expectations arising from considerations of signal amplification, inhibitory interactions between receptors can improve detection performance in the presence of substantial environmental variability, providing an adaptive interpretation to the phenomenon of ligand antagonism. Our results suggest that the structure of signaling pathways responsible for chemodetection in fluctuating and heterogeneous environments might be optimized with respect to the statistics and dynamics of environmental composition. The developed formalism stresses the importance of characterizing nonspecific interactions to understand function in signaling pathways.
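
    As a hedged sketch of the detection-theoretic framing (generic notation, not the authors'), deciding whether a rare cognate ligand is present amounts to a likelihood-ratio test against the fluctuating background,

        \Lambda(r) = \frac{P(r \mid \text{ligand present})}{P(r \mid \text{background only})} \gtrless \eta,

    where r is the receptor-output readout and the threshold \eta sets the trade-off between false alarms and misses; on this reading, receptor antagonism can improve detection by reshaping the background distribution so the two hypotheses are easier to separate.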

  15. Testing and Contrasting Road Safety Education, Deterrence, and Social Capital Theories: A Sociological Approach to the Understanding of Male Drink-Driving in Chile's Metropolitan Region.

    PubMed

    Nazif, José Ignacio

    2011-01-01

    Three theories offer different explanations of male drink-driving. In order to test road safety education, deterrence, and social capital theories, logistic regression analysis was applied to predict respondents' statements of having or not having engaged in actual drink-driving (DD). The variable for road safety education theory was whether a driver had graduated from a professional driving school or not. Deterrence theory was operationalized with a variable of whether a driver had been issued a traffic ticket or not. Social capital theory was operationalized with two variables: having children or not and having a religious identification or not. Since the variables 'years of formal education' and 'years of driving experience' have been reported to be correlated with alcohol consumption and DD, respectively, these were introduced as controls. In order to assess the statistical significance of each variable, Wald tests were applied in seven models. Results indicate, on the one hand, that the road safety education variable is not statistically significant; on the other, the deterrence theory variable and the social capital theory variable 'having children' were both statistically significant at the .01 level. Findings are discussed in reference to Chile's context. Data were taken from the "Road Users Attitudes and Behaviors towards Traffic Safety" survey from the National Commission of Road Safety of the Government of Chile (2005). The sample size was reported to be 2,118 (N of male drivers was 396). This survey was representative of the road users' population of Chile's Metropolitan Region.
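
    A minimal sketch of the logistic-regression-with-Wald-tests analysis described above; the column names mirror the abstract's variables, but the data are simulated, not the Chilean survey.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n = 396  # male drivers, as reported in the abstract
        df = pd.DataFrame({
            "dd": rng.integers(0, 2, n),              # self-reported drink-driving
            "driving_school": rng.integers(0, 2, n),  # road safety education theory
            "traffic_ticket": rng.integers(0, 2, n),  # deterrence theory
            "has_children": rng.integers(0, 2, n),    # social capital theory
            "religion": rng.integers(0, 2, n),        # social capital theory
            "educ_years": rng.integers(6, 18, n),     # control
            "driving_years": rng.integers(1, 40, n),  # control
        })

        fit = smf.logit("dd ~ driving_school + traffic_ticket + has_children"
                        " + religion + educ_years + driving_years", data=df).fit()
        print(fit.summary())  # the z column reports the per-coefficient Wald tests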

  16. Testing and Contrasting Road Safety Education, Deterrence, and Social Capital Theories: A Sociological Approach to the Understanding of Male Drink-Driving in Chile’s Metropolitan Region

    PubMed Central

    Nazif, José Ignacio

    2011-01-01

    Three theories offer different explanations of male drink-driving. In order to test road safety education, deterrence, and social capital theories, logistic regression analysis was applied to predict respondents' statements of having or not having engaged in actual drink-driving (DD). The variable for road safety education theory was whether a driver had graduated from a professional driving school or not. Deterrence theory was operationalized with a variable of whether a driver had been issued a traffic ticket or not. Social capital theory was operationalized with two variables: having children or not and having a religious identification or not. Since the variables 'years of formal education' and 'years of driving experience' have been reported to be correlated with alcohol consumption and DD, respectively, these were introduced as controls. In order to assess the statistical significance of each variable, Wald tests were applied in seven models. Results indicate, on the one hand, that the road safety education variable is not statistically significant; on the other, the deterrence theory variable and the social capital theory variable 'having children' were both statistically significant at the .01 level. Findings are discussed in reference to Chile's context. Data were taken from the "Road Users Attitudes and Behaviors towards Traffic Safety" survey from the National Commission of Road Safety of the Government of Chile (2005). The sample size was reported to be 2,118 (N of male drivers was 396). This survey was representative of the road users' population of Chile's Metropolitan Region. PMID:22105406

  17. An assessment of the validity and discrimination of the intensive time-series design by monitoring learning differences between students with different cognitive tendencies

    NASA Astrophysics Data System (ADS)

    Farnsworth, Carolyn H.; Mayer, Victor J.

    Intensive time-series designs for classroom investigations have been under development since 1975. Studies have been conducted to determine their feasibility (Mayer & Lewis, 1979), their potential for monitoring knowledge acquisition (Mayer & Kozlow, 1980), and the potential threat to validity of the frequency of testing inherent in the design (Mayer & Rojas, 1982). This study, an extension of those previous studies, is an attempt to determine the degree of discrimination the design allows in collecting data on achievement. It also serves as a replication of the Mayer and Kozlow study, an attempt to determine design validity for collecting achievement data. The investigator used her eighth-grade earth science students from a suburban Columbus (Ohio) junior high school. A multiple-group single intervention time-series design (Glass, Willson, & Gottman, 1975) was adapted to the collection of daily data on achievement in the topic of the intervention, a unit on plate tectonics. Single multiple-choice items were randomly assigned to each of three groups of students, identified on the basis of their ranking on a written test of cognitive level (Lawson, 1978). The top third, or those with formal cognitive tendencies, were compared on the basis of knowledge achievement and understanding achievement with the lowest third of the students, or those with concrete cognitive tendencies, to determine if the data collected in the design would discriminate between the two groups. Several studies (Goodstein & Howe, 1978; Lawson & Renner, 1975) indicated that students with formal cognitive tendencies should learn a formal concept such as plate tectonics with greater understanding than should students with concrete cognitive tendencies. Analyses used were a comparison of regression lines in each of the three study stages: baseline, intervention, and follow-up; t-tests of means of days summed across each stage; and a time-series analysis program. Statistically significant differences were found between the two groups both in slopes of regression lines (p = 0.0001) and in t-tests (p = 0.0005) on both knowledge and understanding levels of learning. These differences confirm the discrimination of the intensive time-series design in showing that it can distinguish differences in learning between students with formal cognitive tendencies and those with concrete cognitive tendencies. The time-series analysis model with a trend in the intervention was better than a model with no trend for both groups of students, in that it accounted for a greater amount of variance in the data from both knowledge and understanding levels of learning. This finding adds additional confidence in the validity of the design for obtaining achievement data. When the analysis model with trend was used on data from the group with formal cognitive tendencies, it accounted for a greater degree of variance than the same model applied to the data from the group with concrete cognitive tendencies. This more conservative analysis, therefore, gave results consistent with those from the more usual linear regression techniques and t-tests, further adding to the confidence in the discrimination of the design.
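
    A hedged sketch of one common way to compare slopes across baseline, intervention, and follow-up stages, segmented regression on a daily series; this is a generic illustration, not the Glass-Willson-Gottman time-series procedure the study used, and the data are simulated.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        days = np.arange(45)
        stage = np.where(days < 15, "baseline",
                         np.where(days < 30, "intervention", "followup"))
        # Simulated daily achievement: a steeper slope during the intervention.
        score = 0.2 * days + 0.6 * (stage == "intervention") * (days - 14).clip(0) \
                + rng.normal(0, 1, 45)

        df = pd.DataFrame({"day": days, "stage": stage, "score": score})
        # Separate intercept and slope per stage; the interaction terms test
        # whether the slopes differ between stages.
        print(smf.ols("score ~ C(stage) * day", data=df).fit().summary())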

  18. Nowcasting sunshine number using logistic modeling

    NASA Astrophysics Data System (ADS)

    Brabec, Marek; Badescu, Viorel; Paulescu, Marius

    2013-04-01

    In this paper, we present a formalized approach to statistical modeling of the sunshine number, a binary indicator of whether the Sun is covered by clouds, introduced previously by Badescu (Theor Appl Climatol 72:127-136, 2002). Our statistical approach is based on Markov chains and logistic regression and yields fully specified probability models that are relatively easily identified (and their unknown parameters estimated) from a set of empirical data (observed sunshine number and sunshine stability number series). We discuss the general structure of the model and its advantages, demonstrate its performance on real data, and compare its results to a classical ARIMA approach as a competitor. Since the model parameters have a clear interpretation, we also illustrate how, e.g., their inter-seasonal stability can be tested. We conclude with an outlook on future developments oriented toward constructing models that allow a practically desirable smooth transition between data observed at different frequencies, and with a short discussion of technical problems that such a goal brings.
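
    A minimal sketch of the Markov-chain-plus-logistic-regression idea: the probability that the next sunshine number is 1 is modeled from the current value (further covariates could be added); the series here is simulated, not observed data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        # Simulate a persistent binary sunshine-number series (1 = Sun not covered).
        ssn = [1]
        for _ in range(999):
            p_one = 0.85 if ssn[-1] == 1 else 0.25  # first-order Markov persistence
            ssn.append(int(rng.random() < p_one))
        ssn = np.array(ssn)

        # Logistic regression of SSN(t) on SSN(t-1): a first-order Markov/logit model.
        X = sm.add_constant(ssn[:-1].astype(float))
        fit = sm.Logit(ssn[1:], X).fit(disp=False)
        print(fit.predict([[1, 0], [1, 1]]))  # P(sunny | cloudy), P(sunny | sunny)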

  19. Physiological time-series analysis: what does regularity quantify?

    NASA Technical Reports Server (NTRS)

    Pincus, S. M.; Goldberger, A. L.

    1994-01-01

    Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
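
    A compact implementation of ApEn following the multistep description referenced above (Pincus' algorithm with self-matches included); the parameter choices m = 2 and r = 0.2 x SD are common conventions, not values mandated by the paper.

        import numpy as np

        def approximate_entropy(x, m=2, r=None):
            """Approximate entropy of a 1-D series; r defaults to 0.2 * std(x)."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            r = 0.2 * np.std(x) if r is None else r

            def phi(m):
                # Overlapping template vectors of length m.
                emb = np.array([x[i:i + m] for i in range(n - m + 1)])
                # Chebyshev distance between all pairs of templates.
                dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
                # Fraction of templates within tolerance r (self-matches included).
                c = np.mean(dist <= r, axis=1)
                return np.mean(np.log(c))

            return phi(m) - phi(m + 1)

        # A regular sine is less "complex" than white noise, so its ApEn is lower.
        t = np.linspace(0, 8 * np.pi, 300)
        rng = np.random.default_rng(8)
        print(approximate_entropy(np.sin(t)))                 # low ApEn
        print(approximate_entropy(rng.standard_normal(300)))  # higher ApEn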

  20. Adaptive clinical trial design.

    PubMed

    Chow, Shein-Chung

    2014-01-01

    In recent years, the use of adaptive design methods in clinical trials based on accumulated data at interim has received much attention because of its flexibility and efficiency in pharmaceutical/clinical development. In practice, adaptive design may provide the investigators a second chance to modify or redesign the trial while the study is still ongoing. However, it is a concern that a shift in target patient population may occur after significant adaptations are made. In addition, the overall type I error rate may not be preserved. Moreover, the results may not be reliable and hence are difficult to interpret. As indicated by the US Food and Drug Administration draft guidance on adaptive design clinical trials, the adaptive design has to be a prospectively planned opportunity and should be based on information collected within the study, with or without formal statistical hypothesis testing. This article reviews the relative advantages, limitations, and feasibility of commonly considered adaptive designs in clinical trials. Statistical concerns when implementing adaptive designs are also discussed.

  1. The biometric menagerie.

    PubMed

    Yager, Neil; Dunstone, Ted

    2010-02-01

    It is commonly accepted that users of a biometric system may have differing degrees of accuracy within the system. Some people may have trouble authenticating, while others may be particularly vulnerable to impersonation. Goats, wolves, and lambs are labels commonly applied to these problem users. These user types are defined in terms of verification performance when users are matched against themselves (goats) or when matched against others (lambs and wolves). The relationship between a user's genuine and impostor match results suggests four new user groups: worms, doves, chameleons, and phantoms. We establish formal definitions for these animals and a statistical test for their existence. A thorough investigation is conducted using a broad range of biometric modalities, including 2D and 3D faces, fingerprints, iris, speech, and keystroke dynamics. Patterns that emerge from the results expose novel, important, and encouraging insights into the nature of biometric match results. A new framework for the evaluation of biometric systems based on the biometric menagerie, as opposed to collective statistics, is proposed.

  2. 11.2 YIP Human In the Loop Statistical RelationalLearners

    DTIC Science & Technology

    2017-10-23

    Learning formalisms including inverse reinforcement learning [4] and statistical relational learning [7, 5, 8]. We have also applied our algorithms in... one introduced for label preferences. [Figure 2: Active Advice Seeking for Inverse Reinforcement Learning.] Active advice seeking is in selecting the... learning tasks. 1.2.1 Sequential Decision-Making: Our previous work on advice for inverse reinforcement learning (IRL) defined advice as action

  3. Efficacy of changing physics misconceptions held by ninth grade students at varying developmental levels through teacher addition of a prediction phase to the learning cycle

    NASA Astrophysics Data System (ADS)

    Oglesby, Michael L.

    This study examines the efficacy of correcting student misconceptions about science concepts by asking students to make a prediction in science laboratory lessons, for students at the pre-formal, transitional, or formal stages of cognitive development. The subjects were students (n = 235) enrolled in ninth grade physical science classes (n = 15) in one high school of an urban-profile school district. The four freshman physical science teachers who took part in the study routinely taught the concepts in the study as part of the normal curriculum during the time of the school year in which the research was conducted. Classrooms representing approximately half of the students were presented with a prediction phase at the start of each of ten learning cycle lessons; the other classrooms were not. Students were pre- and post-tested using a 40-question instrument based on the Force Concept Inventory, augmented with questions on the concepts taught during the study period. Students were also tested using the Test of Scientific Reasoning to determine their cognitive developmental level. Results showed 182 of the students to be cognitively pre-formal, 50 to be transitional, and only 3 to be cognitively formal. There were significantly higher gains (p < .05) for the formal group over the transitional group and for the transitional group over the pre-formal group. However, gains were not significantly higher (p > .05) for students having a prediction phase compared with those not having one, nor were there significant gains (p > .05) within the pre-formal group or within the transitional group. There were too few students in the formal group for meaningful results.

  4. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has largely been determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.

  5. Evaluation of the efficacy of animal-assisted therapy based on the reality orientation therapy protocol in Alzheimer's disease patients: a pilot study.

    PubMed

    Menna, Lucia Francesca; Santaniello, Antonio; Gerardi, Federica; Di Maggio, Annamaria; Milan, Graziella

    2016-07-01

    The aim of this study was to evaluate the efficacy of animal-assisted therapy (AAT) in elderly patients affected by Alzheimer's disease, based on the formal reality orientation therapy (ROT) protocol. Our study was carried out at an Alzheimer's centre over 6 months. A homogeneous sample (age, Mini-Mental State Examination (MMSE), 15-item Geriatric Depression Scale (GDS)) of 50 patients was selected randomly and consecutively. Patients were divided into three groups: (i) 20 patients received a course of AAT based on the ROT protocol (AAT group); (ii) 20 patients engaged exclusively in activities based on the ROT protocol (ROT group); and (iii) 10 patients (control group) received no stimulation. The MMSE and GDS were administered at time 0 (T0) and time 1 (T1) to all three groups. Differences within groups between T0 and T1 for GDS and MMSE scores were analyzed by Student's t-test. Differences between group means were analyzed using an ANOVA test with the Bonferroni-Dunn test for post-hoc comparisons. Both the AAT group and the ROT group had improved GDS scores and showed a slight improvement in terms of mood. On the GDS, the AAT group improved from 11.5 (T0) to 9.5 (T1), and the ROT group improved from 11.6 (T0) to 10.5 (T1). At the same time, a slight improvement in cognitive function, as measured by the MMSE, was observed. In the AAT group, mean MMSE was 20.2 at T0 and 21.5 at T1, and in the ROT group, it was 19.9 at T0 and 20.0 at T1. In the control group, the average values of both the GDS and MMSE remained unchanged. The Bonferroni-Dunn results showed statistically significant differences between groups, particularly between the AAT group and the other two (P < 0.001). Pet therapy interventions based on the formal ROT protocol were effective and, compared with ROT alone, provided encouraging and statistically significant results. © 2015 The Authors. Psychogeriatrics © 2015 Japanese Psychogeriatric Society.

  6. Can tactile sensory processing differentiate between children with autistic disorder and asperger's disorder?

    PubMed

    Ghanizadeh, Ahmad

    2011-05-01

    There are debates about whether autistic disorder (autism) and Asperger's disorder are two distinct disorders. Moreover, interventional sensory occupational therapy should consider the clinical characteristics of patients. To date, commonalities and differences between Asperger's disorder and autistic disorder have not been well studied. The aim of this study is to compare the tactile sensory function of children with autistic disorder and children with Asperger's disorder. Tactile sensory function was compared between 36 children with autism and 19 children with Asperger's disorder. The two disorders were diagnosed based on the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision. The parent-reported Tactile Dysfunction Checklist was used to assess the three aspects of hypersensitivity, hyposensitivity, and poor tactile perception and discrimination. Developmental coordination was also assessed. The total developmental coordination problems score was not associated with group. The mean (standard deviation) score of tactile hyper-responsivity was not different between the groups. Tactile hyporesponsivity and poor tactile perception and discrimination scores were statistically higher in the autistic disorder group than in the Asperger's disorder group. These results indicate for the first time that at least some aspects of tactile perception can differentiate these two disorders. Children with autistic disorder have more tactile sensory-seeking behaviors than children with Asperger's disorder. Moreover, the ability of children with autistic disorder for tactile discrimination and sensory perception is lower than that of children with Asperger's disorder. Interventional sensory therapy for children with autistic disorder should therefore have some characteristics that are different and specific relative to therapy for children with Asperger's disorder. Formal intelligence quotient testing was not performed on all of the children evaluated, which is a limitation of this study. In some cases, a clinical estimation of intelligence quotient was given, which limits the conclusions that can be drawn from the data. Additional research using formal intelligence quotient testing on all of the subjects should be performed in order to draw more concrete conclusions.

  7. Can Tactile Sensory Processing Differentiate Between Children with Autistic Disorder and Asperger's Disorder?

    PubMed Central

    2011-01-01

    Objective There are debates about whether autistic disorder (autism) and Asperger's disorder are two distinct disorders. Moreover, interventional sensory occupational therapy should consider the clinical characteristics of patients. To date, commonalities and differences between Asperger's disorder and autistic disorder have not been well studied. The aim of this study is to compare the tactile sensory function of children with autistic disorder and children with Asperger's disorder. Methods Tactile sensory function was compared between 36 children with autism and 19 children with Asperger's disorder. The two disorders were diagnosed based on the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision. The parent-reported Tactile Dysfunction Checklist was used to assess the three aspects of hypersensitivity, hyposensitivity, and poor tactile perception and discrimination. Developmental coordination was also assessed. Results The total developmental coordination problems score was not associated with group. The mean (standard deviation) score of tactile hyper-responsivity was not different between the groups. Tactile hyporesponsivity and poor tactile perception and discrimination scores were statistically higher in the autistic disorder group than in the Asperger's disorder group. Conclusion These results indicate for the first time that at least some aspects of tactile perception can differentiate these two disorders. Children with autistic disorder have more tactile sensory-seeking behaviors than children with Asperger's disorder. Moreover, the ability of children with autistic disorder for tactile discrimination and sensory perception is lower than that of children with Asperger's disorder. Interventional sensory therapy for children with autistic disorder should therefore have some characteristics that are different and specific relative to therapy for children with Asperger's disorder. Formal intelligence quotient testing was not performed on all of the children evaluated, which is a limitation of this study. In some cases, a clinical estimation of intelligence quotient was given, which limits the conclusions that can be drawn from the data. Additional research using formal intelligence quotient testing on all of the subjects should be performed in order to draw more concrete conclusions. PMID:21686145

  8. The Effects of Using Space to Teach Standard Elementary School Curriculum

    NASA Technical Reports Server (NTRS)

    Ewell, Robert N.

    1996-01-01

    This brief report and recommendation for further research brings this effort to a formal close; its original purpose is described in detail in The effects of using space to teach standard elementary school curriculum, Volume 1, included here as the Appendix. Volume 1 describes the project as a 3-year research program to determine the effectiveness of using space to teach. The research design is quasi-experimental, using standardized test data on students from Aldrin Elementary School and a District-identified 'control' school, referred to here as 'School B.' Students now in fourth through sixth grades will be compared (after one year at Aldrin) and tracked at least until the present sixth graders have completed the eighth grade. Appropriate statistical tests will be applied to standardized test scores to see whether Aldrin students are 'better' than School B students in areas such as: overall academic performance; performance in math/science; and enrollments in math/science in middle school.

  9. Innovative approach to teaching communication skills to nursing students.

    PubMed

    Zavertnik, Jean Ellen; Huff, Tanya A; Munro, Cindy L

    2010-02-01

    This study assessed the effectiveness of a learner-centered simulation intervention designed to improve the communication skills of preprofessional sophomore nursing students. An innovative teaching strategy in which communication skills are taught to nursing students by using trained actors who served as standardized family members in a clinical learning laboratory setting was evaluated using a two-group posttest design. In addition to current standard education, the intervention group received a formal training session presenting a framework for communication and a 60-minute practice session with the standardized family members. Four domains of communication (introduction, gathering of information, imparting information, and clarifying goals and expectations) were evaluated in the control and intervention groups in individual testing sessions with a standardized family member. The intervention group performed better than the control group in all four tested domains related to communication skills, and the difference was statistically significant in the domain of gathering information (p = 0.0257). Copyright 2010, SLACK Incorporated.

  10. EFL Teachers' Formal Assessment Practices Based on Exam Papers

    ERIC Educational Resources Information Center

    Kiliçkaya, Ferit

    2016-01-01

    This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…

  11. Formal Methods of V&V of Partial Specifications: An Experience Report

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe an experiment in the application of the SCR method to testing for consistency properties of a partial model of requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insight gained from formalizing a specification is valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  12. Continuing professional development for volunteers working in palliative care in a tertiary care cancer institute in India: a cross-sectional observational study of educational needs.

    PubMed

    Deodhar, Jayita Kedar; Muckaden, Mary Ann

    2015-01-01

    Training programs for volunteers prior to their working in palliative care are well established in India. However, few studies report on continuing professional development programs for this group. The aim was to conduct a preliminary assessment of the educational needs of volunteers working in palliative care, with a view to developing a structured formal continuing professional development program for this group. A cross-sectional observational study was conducted in the Department of Palliative Medicine of a tertiary care cancer institute in India. Participant volunteers completed a questionnaire noting previous training, years of experience, and a comprehensive list of topics for inclusion in the program, rated in order of importance. Descriptive statistics for the overall data, and chi-square tests for group comparisons on categorical variables, were computed using the Statistical Package for Social Sciences version 18. Fourteen of 17 volunteers completed the questionnaire, seven of whom had 5-10 years' experience of working in palliative care. All participants felt the need for a continuing professional development program. Communication skills, particularly around issues specific to children and the elderly, were given the highest priority. Spiritual-existential aspects and self-care were rated lower in importance than the psychological, physical, and social aspects of palliative care. More experienced volunteers (>5 years of experience) were more likely than less experienced volunteers (<5 years) to endorse self-care as a topic for the program (P < 0.05). Understanding palliative care volunteers' educational needs is essential for developing a structured formal continuing professional development program, which should include self-care as a significant component.
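
    The group comparison described above (experienced versus less-experienced volunteers endorsing self-care) is the kind of comparison a chi-square test on a contingency table performs. A minimal sketch with invented counts, not the study's data:

      from scipy.stats import chi2_contingency

      #                endorse  do not endorse
      table = [[6, 1],   # > 5 years of experience
               [2, 5]]   # < 5 years of experience
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, p = {p:.3f}")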

  13. Medication errors of nurses and factors in refusal to report medication errors among nurses in a teaching medical center of iran in 2012.

    PubMed

    Mostafaei, Davoud; Barati Marnani, Ahmad; Mosavi Esfahani, Haleh; Estebsari, Fatemeh; Shahzaidi, Shiva; Jamshidi, Ensiyeh; Aghamiri, Seyed Samad

    2014-10-01

    About one-third of reported unwanted medication consequences are due to medication errors, and these account for one-fifth of hospital injuries. The aim of this study was to determine the formal and informal medication errors of nurses, and the relative importance of factors behind nurses' refusal to report medication errors. This cross-sectional study was conducted on the nursing staff of Shohada Tajrish Hospital, Tehran, Iran in 2012. The data were gathered through a questionnaire constructed by the researchers. The questionnaire's face and content validity was confirmed by experts, and test-retest was used to measure its reliability. The data were analyzed by descriptive statistics, using SPSS for the related statistical analyses. The most important factors in refusal to report medication errors were, respectively: lack of a medication error recording and reporting system in the hospital (3.3%), error reporting to hospital authorities being treated as insignificant and lacking appropriate feedback (3.1%), and lack of a clear definition of a medication error (3%). Both formal and informal reporting of medication errors occurred in this study. Factors pertaining to hospital management, as well as fear of the consequences of reporting, are two broad fields among the factors that keep nurses from reporting their medication errors. In this regard, providing adequate education to nurses, boosting nurses' job security, management support, and revising the related processes and definitions can help decrease medication errors and increase their reporting when they occur.

  14. Global developmental delay in guanidinoacetate methyltransferase deficiency: differences in formal testing and clinical observation.

    PubMed

    Verbruggen, Krijn T; Knijff, Wilma A; Soorani-Lunsing, Roelineke J; Sijens, Paul E; Verhoeven, Nanda M; Salomons, Gajja S; Goorhuis-Brouwer, Siena M; van Spronsen, Francjan J

    2007-09-01

    Guanidinoacetate N-methyltransferase (GAMT) deficiency is a defect in the biosynthesis of creatine (Cr). So far, reports have not focused on the description of developmental abilities in this disorder. Here, we present the results of formal testing of developmental abilities in a GAMT-deficient patient. Our patient, a 3-year-old boy with GAMT deficiency, presented clinically with a severe language production delay and nearly normal nonverbal development. Treatment with oral Cr supplementation led to partial restoration of the cerebral Cr concentration and a clinically remarkable acceleration of language production development. In contrast to the clinical observation, formal testing showed a rather uniform developmental delay before therapy and a general improvement, but no specific acceleration of language development, after therapy. From our case, we conclude that in GAMT deficiency language delay is not always more prominent than delays in other developmental areas. The discrepancy between the clinical impression and formal testing underscores the importance of applying standardized tests in children with developmental delays. Screening for Cr deficiency by metabolite analysis of body fluids or by proton magnetic resonance spectroscopy of the brain should be considered in any child with global developmental delay/mental retardation lacking clues for an alternative etiology.

  15. Cost Implications of Organizing Nursing Home Workforce in Teams

    PubMed Central

    Mukamel, Dana B; Cai, Shubing; Temkin-Greener, Helena

    2009-01-01

    Objective To estimate the costs associated with formal and self-managed daily practice teams in nursing homes. Data Sources/Study Setting Medicaid cost reports for 135 nursing homes in New York State in 2006 and survey data for 6,137 direct care workers. Study Design A retrospective statistical analysis: We estimated hybrid cost functions that include team penetration variables. Inference was based on robust standard errors. Data Collection Formal and self-managed team penetration (i.e., percent of staff working in a team) were calculated from survey responses. Annual variable costs, beds, case mix-adjusted days, admissions, home care visits, outpatient clinic visits, day care days, wages, and ownership were calculated from the cost reports. Principal Findings Formal team penetration was significantly associated with costs, while self-managed team penetration was not. Costs declined with increasing formal-team penetration up to 13 percent, and increased above this level. Formal teams in nursing homes in the upward-sloping range of the curve were more diverse, with a larger number of participating disciplines, and were more likely to include physicians. Conclusions Organization of the workforce in formal teams may offer nursing homes a cost-saving strategy. More research is required to understand the relationship between team composition and costs. PMID:19486181
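
    A cost curve that falls and then rises with penetration is usually captured by a quadratic term, whose turning point can be read off the fitted coefficients. A sketch of such a hybrid cost regression with robust standard errors, using synthetic data and invented coefficients rather than the study's cost reports:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(8)
      pen = rng.uniform(0.0, 0.4, 135)              # formal-team penetration
      beds = rng.uniform(60, 300, 135)
      cost = (5e6 + 2e4 * beds - 4e6 * pen + 1.5e7 * pen ** 2
              + rng.normal(0, 2e5, 135))            # synthetic annual costs

      X = sm.add_constant(np.column_stack([beds, pen, pen ** 2]))
      fit = sm.OLS(cost, X).fit(cov_type="HC1")     # robust (sandwich) SEs
      turning = -fit.params[2] / (2 * fit.params[3])
      print(f"cost-minimizing penetration ~ {turning:.2f}")  # ~0.13 here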

  16. The Profile Envision and Splicing Tool (PRESTO): Developing an Atmospheric Wind Analysis Tool for Space Launch Vehicles Using Python

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.

    2017-01-01

    Launch vehicle programs require vertically complete atmospheric profiles. Many systems at the ER make the necessary measurements, but all have different EVR, vertical coverage, and temporal coverage. The MSFC Natural Environments Branch developed a tool to create a vertically complete profile from multiple inputs using Python. Forward work: finish formal testing (acceptance testing and end-to-end testing), then formal release.
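
    The abstract gives no implementation detail; as a rough illustration of the splicing idea in the tool's own language, the hypothetical helper below interpolates two wind profiles onto a common altitude grid and prefers the finer low-altitude source wherever it has coverage:

      import numpy as np

      def splice_profiles(alt_lo, wind_lo, alt_hi, wind_hi, grid):
          # Interpolate both sources onto grid; prefer the low-altitude one.
          hi = np.interp(grid, alt_hi, wind_hi, left=np.nan, right=np.nan)
          lo = np.interp(grid, alt_lo, wind_lo, left=np.nan, right=np.nan)
          return np.where(np.isnan(lo), hi, lo)    # fall back to coarse source

      grid = np.arange(0.0, 20001.0, 100.0)        # 0-20 km in 100 m steps
      alt_lo, wind_lo = np.array([0., 500., 3000.]), np.array([3.0, 7.5, 12.0])
      alt_hi, wind_hi = np.array([2000., 1e4, 2e4]), np.array([10.0, 30.0, 25.0])
      print(splice_profiles(alt_lo, wind_lo, alt_hi, wind_hi, grid))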

  17. Thomas-Fermi model for a bulk self-gravitating stellar object in two dimensions

    NASA Astrophysics Data System (ADS)

    De, Sanchari; Chakrabarty, Somenath

    2015-09-01

    In this article we have solved a hypothetical problem related to the stability and gross properties of two-dimensional self-gravitating stellar objects using the Thomas-Fermi model. The formalism presented here is an extension of the standard three-dimensional problem discussed in Statistical Physics, Part I, by Landau and Lifshitz. Further, the formalism presented in this article may be considered a class problem for post-graduate-level students of physics or may be assigned as part of a dissertation project.

  18. Teaching medical students ultrasound-guided vascular access - which learning method is best?

    PubMed

    Lian, Alwin; Rippey, James C R; Carr, Peter J

    2017-05-15

    Ultrasound is recommended to guide insertion of peripheral intravenous vascular cannulae (PIVC) where difficulty is experienced. Ultrasound machines are now commonplace, and junior doctors are often expected to be able to use them. The educational standards for this skill are highly varied, ranging from no education, to self-guided internet-based education, to formal, face-to-face traditional education. To help decide which educational technique our institution should introduce, a small pilot trial comparing educational techniques was designed. Thirty medical students were enrolled and allocated to one of three groups. PIVC placing ability was then observed, tested and graded on vascular access phantoms. Formal, face-to-face traditional education was rated best by the students and had the highest success rate in PIVC placement, with the improvement statistically significant compared to no education (p = 0.01) and trending towards significance compared to self-directed internet-based education (p < 0.06). The group receiving traditional face-to-face teaching on ultrasound-guided vascular access performed significantly better than those not receiving education. As the number of ultrasound machines in clinical areas increases, it is important that education programs to support their safe and appropriate use are developed.

  19. Performances on the CogState and standard neuropsychological batteries among HIV patients without dementia.

    PubMed

    Overton, Edgar Turner; Kauwe, John S K; Paul, Robert; Tashima, Karen; Tate, David F; Patel, Pragna; Carpenter, Charles C J; Patty, David; Brooks, John T; Clifford, David B

    2011-11-01

    HIV-associated neurocognitive disorders remain prevalent but challenging to diagnose, particularly among non-demented individuals. To determine whether a brief computerized battery correlates with formal neurocognitive testing, we identified 46 HIV-infected persons who had undergone both formal neurocognitive testing and a brief computerized battery. Simple detection tests correlated best with formal neuropsychological testing. In a multivariable regression model, 53% of the variance in the composite Global Deficit Score was accounted for by elements from the brief computerized tool (P < 0.01). These data confirm previous correlation data with the computerized battery. Using the five significant parameters from the regression model in a Receiver Operating Characteristic curve, 90% of persons were accurately classified as being cognitively impaired or not. The test battery requires additional evaluation, specifically for identifying persons with mild impairment, a state in which interventions may be effective.

  20. The development and initial validation of a sensitive bedside cognitive screening test.

    PubMed

    Faust, D; Fogel, B S

    1989-01-01

    Brief bedside cognitive examinations such as the Mini-Mental State Examination are designed to detect delirium and dementia but not more subtle or delineated cognitive deficits. Formal neuropsychological evaluation provides greater sensitivity and detects a wider range of cognitive deficits but is too lengthy for efficient use at the bedside or in epidemiological studies. The authors developed the High Sensitivity Cognitive Screen (HSCS), a 20-minute interview-based test, to identify patients who show disorder on formal neuropsychological evaluation. An initial study demonstrated satisfactory test-retest and interrater reliability. The HSCS was then administered to 60 psychiatric and neurological patients with suspected cognitive deficits but without gross impairment, who also completed formal neuropsychological testing. Results of both tests were independently classified as either normal, borderline, or abnormal. The HSCS correctly classified 93% of patients across the normal-abnormal dichotomy and showed promise for characterizing the extent and severity of cognitive dysfunction.

  1. Beyond "objective" and "projective": a logical system for classifying psychological tests: comment on Meyer and Kurtz (2006).

    PubMed

    Wagner, Edwin E

    2008-07-01

    I present a formal system that accounts for the misleading distinction between tests formerly termed objective and projective, duly noted by Meyer and Kurtz (2006). Three principles of Response Rightness, Response Latitude and Stimulus Ambiguity are shown to govern, in combination, the formal operating characteristics of tests, producing inevitable overlap between "objective" and "projective" tests and creating at least three "types" of tests historically regarded as being projective in nature. The system resolves many past issues regarding test classification and can be generalized to include all psychological tests.

  2. Cost model validation: a technical and cultural approach

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  3. Unified quantitative characterization of epithelial tissue development

    PubMed Central

    Guirao, Boris; Rigaud, Stéphane U; Bosveld, Floris; Bailles, Anaïs; López-Gay, Jesús; Ishihara, Shuji; Sugimura, Kaoru

    2015-01-01

    Understanding the mechanisms regulating development requires a quantitative characterization of cell divisions, rearrangements, cell size and shape changes, and apoptoses. We developed a multiscale formalism that relates the characterizations of each cell process to tissue growth and morphogenesis. Having validated the formalism on computer simulations, we quantified separately all morphogenetic events in the Drosophila dorsal thorax and wing pupal epithelia to obtain comprehensive statistical maps linking cell and tissue scale dynamics. While globally cell shape changes, rearrangements and divisions all significantly participate in tissue morphogenesis, locally, their relative participations display major variations in space and time. By blocking division we analyzed the impact of division on rearrangements, cell shape changes and tissue morphogenesis. Finally, by combining the formalism with mechanical stress measurement, we evidenced unexpected interplays between patterns of tissue elongation, cell division and stress. Our formalism provides a novel and rigorous approach to uncover mechanisms governing tissue development. DOI: http://dx.doi.org/10.7554/eLife.08519.001 PMID:26653285

  4. Quantum-like model for the adaptive dynamics of the genetic regulation of E. coli's metabolism of glucose/lactose.

    PubMed

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2012-06-01

    We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in the bacterium Escherichia coli. Our quantum-like model can be considered a kind of operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in fine detail, we propose a formal operator description. Such a description may be very useful in situations in which a detailed description of the processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by their metabolic systems. We demonstrate that the same type of E. coli can be described by well-determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.

  5. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  6. On testing for spatial correspondence between maps of human brain structure and function.

    PubMed

    Alexander-Bloch, Aaron F; Shou, Haochang; Liu, Siyuan; Satterthwaite, Theodore D; Glahn, David C; Shinohara, Russell T; Vandekar, Simon N; Raznahan, Armin

    2018-06-01

    A critical issue in many neuroimaging studies is the comparison between brain maps. Nonetheless, it remains unclear how one should test hypotheses focused on the overlap or spatial correspondence between two or more brain maps. This "correspondence problem" affects, for example, the interpretation of comparisons between task-based patterns of functional activation, resting-state networks or modules, and neuroanatomical landmarks. To date, this problem has been addressed with remarkable variability in terms of methodological approaches and statistical rigor. In this paper, we address the correspondence problem using a spatial permutation framework to generate null models of overlap by applying random rotations to spherical representations of the cortical surface, an approach for which we also provide a theoretical statistical foundation. We use this method to derive clusters of cognitive functions that are correlated in terms of their functional neuroanatomical substrates. In addition, using publicly available data, we formally demonstrate the correspondence between maps of task-based functional activity, resting-state fMRI networks and gyral-based anatomical landmarks. We provide open-access code to implement the methods presented for two commonly used tools for surface-based cortical analysis (https://www.github.com/spin-test). This spatial permutation approach constitutes a useful advance over widely used methods for the comparison of cortical maps, thereby opening new possibilities for the integration of diverse neuroimaging data. Copyright © 2018 Elsevier Inc. All rights reserved.
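
    As a toy illustration of the spin idea (not the released spin-test code at the URL above), the sketch below rotates a spherical point set at random, resamples one map at each vertex's nearest rotated neighbour, and builds a permutation p-value for the correlation between two maps:

      import numpy as np
      from scipy.spatial.transform import Rotation
      from scipy.stats import pearsonr

      rng = np.random.default_rng(0)
      n = 1000
      xyz = rng.normal(size=(n, 3))
      xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)  # vertices on unit sphere
      map_a = xyz[:, 2] + 0.1 * rng.normal(size=n)       # two toy cortical maps
      map_b = xyz[:, 2] + 0.1 * rng.normal(size=n)

      observed = pearsonr(map_a, map_b)[0]
      null = []
      for _ in range(100):
          rotated = Rotation.random(random_state=rng).apply(xyz)
          nearest = np.argmax(rotated @ xyz.T, axis=1)   # nearest original vertex
          null.append(pearsonr(map_a, map_b[nearest])[0])
      p = (1 + sum(abs(r) >= abs(observed) for r in null)) / (1 + len(null))
      print(f"observed r = {observed:.2f}, spin p = {p:.3f}")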

  7. Strong correlations between the exponent α and the particle number for a Renyi monoatomic gas in Gibbs' statistical mechanics.

    PubMed

    Plastino, A; Rocca, M C

    2017-06-01

    Appealing to the 1902 Gibbs formalism for classical statistical mechanics (SM), the first axiomatic SM theory ever to successfully explain equilibrium thermodynamics, we show that already at the classical level there is a strong correlation between Renyi's exponent α and the number of particles for very simple systems. No reference to heat baths is needed for such a purpose.

  8. Equitability, mutual information, and the maximal information coefficient.

    PubMed

    Kinney, Justin B; Atwal, Gurinder S

    2014-03-04

    How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical "equitability" has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to the Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518-1524], which proposed an alternative definition of equitability and introduced a new statistic, the "maximal information coefficient" (MIC), said to satisfy equitability in contradistinction to mutual information. These conclusions, however, were supported only with limited simulation evidence, not with mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.
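
    As a small, self-contained sketch of the quantity being defended (a plug-in histogram estimate of mutual information; the paper's estimators are more careful), consider:

      import numpy as np

      def mutual_information(x, y, bins=20):
          # Plug-in MI estimate, in nats, from a 2-D histogram.
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = pxy / pxy.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0                              # avoid log(0)
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      rng = np.random.default_rng(1)
      x = rng.normal(size=5000)
      print(mutual_information(x, x + 0.5 * rng.normal(size=5000)))  # high
      print(mutual_information(x, rng.normal(size=5000)))            # near zero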

  9. A Practical Methodology for Quantifying Random and Systematic Components of Unexplained Variance in a Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Obara, Clifford J.; Goodman, Wesley L.

    2012-01-01

    This paper documents a check standard wind tunnel test conducted in the Langley 0.3-Meter Transonic Cryogenic Tunnel (0.3M TCT) that was designed and analyzed using the Modern Design of Experiments (MDOE). The test was designed to partition the unexplained variance of typical wind tunnel data samples into two constituent components, one attributable to ordinary random error, and one attributable to systematic error induced by covariate effects. Covariate effects in wind tunnel testing are discussed, with examples. The impact of systematic (non-random) unexplained variance on the statistical independence of sequential measurements is reviewed. The corresponding correlation among experimental errors is discussed, as is the impact of such correlation on experimental results generally. The specific experiment documented herein was organized as a formal test for the presence of unexplained variance in representative samples of wind tunnel data, in order to quantify the frequency with which such systematic error was detected, and its magnitude relative to ordinary random error. Levels of systematic and random error reported here are representative of those quantified in other facilities, as cited in the references.
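
    The partition the test targets can be pictured with a one-way random-effects decomposition: within-replicate scatter estimates ordinary random error, while drift between replicate blocks taken at different times signals systematic, covariate-induced error. A sketch on synthetic data, not the tunnel's:

      import numpy as np

      rng = np.random.default_rng(2)
      groups, reps = 10, 5                         # replicate blocks x repeats
      drift = rng.normal(0.0, 0.3, size=groups)    # systematic component
      data = drift[:, None] + rng.normal(0.0, 0.2, size=(groups, reps))

      within = data.var(axis=1, ddof=1).mean()     # random error variance
      between = data.mean(axis=1).var(ddof=1) - within / reps
      print(f"random ~ {within:.4f}, systematic ~ {max(between, 0.0):.4f}")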

  10. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  11. Formal Process Modeling to Improve Human Decision-Making in Test and Evaluation Acoustic Range Control

    DTIC Science & Technology

    2017-09-01

    Test and...ambiguities and identify high-value decision points? This thesis explores how formalization of these experience-based decisions as a process model...representing a T&E event may reveal high-value decision nodes where certain decisions carry more weight or potential for impacts to a successful test. The...

  12. Used battery collection in central Mexico: metal content, legislative/management situation and statistical analysis.

    PubMed

    Guevara-García, José Antonio; Montiel-Corona, Virginia

    2012-03-01

    A statistical analysis of a used battery collection campaign in the state of Tlaxcala, Mexico, is presented. This included a study of the metal composition of spent batteries from the formal and informal markets, and a critical discussion of the management of spent batteries in Mexico with respect to legislation. A six-month collection campaign was statistically analyzed: 77% of the battery types were "AA" and 30% of the batteries were from the informal market. A substantial percentage (36%) of batteries had residual voltage in the range 1.2-1.4 V, and 70% had more than 1.0 V; this may reflect underutilization. Metal content analysis and recovery experiments were performed on the five formal and the four most frequent informal trademarks. The analysis of Hg, Cd and Pb showed no significant difference in content between formally and informally commercialized batteries. All of the analyzed trademarks were under the permissible limit levels of the proposed Mexican Official Norm (NOM) NMX-AA-104-SCFI-2006 and would be classified as non-dangerous residues (i.e., disposable with domestic rubbish); however, compared with the EU directive 2006/66/EC, 8 out of 9 of the selected battery trademarks would be rejected, since the Mexican Norm content limits are 20-, 7.5- and 5-fold higher for Hg, Cd and Pb, respectively, than the EU directive. These results underline the need for better regulatory criteria in the proposed Mexican NOM in order to minimize the impact of this type of residue on human health and the environment. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. Quantum Statistics of the Toda Oscillator in the Wigner Function Formalism

    NASA Astrophysics Data System (ADS)

    Vojta, Günter; Vojta, Matthias

    Classical and quantum mechanical Toda systems (Toda molecules, Toda lattices, Toda quantum fields) have recently found growing interest as nonlinear systems showing solitons and chaos. In this paper the statistical thermodynamics of a system of quantum mechanical Toda oscillators characterized by a potential energy V(q) = V0 cosh q is treated within the Wigner function formalism (the phase-space formalism of quantum statistics). The partition function is given as a Wigner-Kirkwood series expansion in powers of ħ2 (semiclassical expansion). The partition function and all thermodynamic functions are written, with considerable exactness, as simple closed expressions containing only the modified Hankel functions K0 and K1 of purely imaginary argument iλ, with λ = V0/kT.
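
    The classical configurational integral shows where such Bessel/Hankel functions enter: for V(q) = V0 cosh q, the integral of exp(-λ cosh q) over all q equals 2 K0(λ), with λ = V0/kT. A quick numerical check of that identity (a sketch, not taken from the paper):

      import numpy as np
      from scipy.integrate import quad
      from scipy.special import k0

      lam = 1.7                                    # plays the role of V0/kT
      numeric, _ = quad(lambda q: np.exp(-lam * np.cosh(q)), -np.inf, np.inf)
      print(numeric, 2 * k0(lam))                  # agree to quadrature accuracy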

  14. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
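
    A compact sketch of the parallel-analysis recipe the paper examines, under the usual independent-noise null (illustrative only; the Tracy-Widom alternative discussed above is analytic):

      import numpy as np

      def parallel_analysis(X, n_sim=200, quantile=95, seed=0):
          # Count eigenvalues of cov(X) exceeding the simulated null percentile.
          rng = np.random.default_rng(seed)
          n, p = X.shape
          obs = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]
          sim = np.empty((n_sim, p))
          for i in range(n_sim):
              R = rng.normal(size=(n, p))          # null: independent noise
              sim[i] = np.linalg.eigvalsh(np.cov(R, rowvar=False))[::-1]
          return int(np.sum(obs > np.percentile(sim, quantile, axis=0)))

      rng = np.random.default_rng(1)
      scores = rng.normal(size=(300, 2))           # two planted components
      X = scores @ rng.normal(size=(2, 10)) + 0.5 * rng.normal(size=(300, 10))
      print(parallel_analysis(X))                  # typically prints 2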

  15. Revolutionizing volunteer interpreter services: an evaluation of an innovative medical interpreter education program.

    PubMed

    Hasbún Avalos, Oswaldo; Pennington, Kaylin; Osterberg, Lars

    2013-12-01

    In our increasingly multicultural, multilingual society, medical interpreters serve an important role in the provision of care. Though it is known that using untrained interpreters leads to decreased quality of care for limited English proficiency patients, because of a short supply of professionals and a lack of formalized, feasible education programs for volunteers, community health centers and internal medicine practices continue to rely on untrained interpreters. The aim was to develop and formally evaluate a novel medical interpreter education program that encompasses major tenets of interpretation, tailored to the needs of volunteer medical interpreters. One-armed, quasi-experimental retro-pre-post study using survey ratings and feedback correlated by assessment scores to determine educational intervention effects. Thirty-eight students: 24 Spanish, nine Mandarin, and five Vietnamese. The majority had prior interpreting experience but no formal medical interpreter training. Students completed retrospective pre-test and post-test surveys measuring confidence in and perceived knowledge of key skills of interpretation. Primary outcome measures were a 10-point Likert scale for survey questions of knowledge, skills, and confidence, written and oral assessments of interpreter skills, and qualitative evidence of newfound knowledge in written reflections. Analyses showed a statistically significant (P < 0.001) change of about two points in mean self-ratings on knowledge, skills, and confidence, with large effect sizes (d > 0.8). The second half of the program was also quantitatively and qualitatively shown to be a vital learning experience, resulting in 18% more students passing the oral assessments, a 19% increase in mean scores for written assessments, and a newfound understanding of interpreter roles and ways to navigate them. This innovative program was successful in increasing volunteer interpreters' skills and knowledge of interpretation, as well as confidence in their own abilities. Additionally, the program effectively taught how to navigate the roles of the interpreter to maintain clear communication.

  16. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    PubMed

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs (SCDs) that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments, and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between-case variance to total variance (between-case plus within-case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
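
    In rough outline, the d scales the average phase difference by the square root of between-case plus within-case variance. The sketch below drops the autocorrelation and small-sample corrections that the published estimator (and its SPSS macros) include, so it is illustrative only:

      import numpy as np

      baseline  = [np.array([2., 3., 2., 4.]), np.array([5., 4., 6.]),
                   np.array([3., 2., 3.])]         # per-case baseline phases
      treatment = [np.array([6., 7., 8.]), np.array([9., 8., 9., 10.]),
                   np.array([6., 7., 6.])]         # per-case treatment phases

      diffs = np.array([t.mean() - b.mean() for b, t in zip(baseline, treatment)])
      within = np.mean([np.concatenate([b - b.mean(), t - t.mean()]).var(ddof=1)
                        for b, t in zip(baseline, treatment)])
      between = diffs.var(ddof=1)                  # crude between-case spread
      print(f"d ~ {diffs.mean() / np.sqrt(between + within):.2f}")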

  17. A threshold method for immunological correlates of protection

    PubMed Central

    2013-01-01

    Background Immunological correlates of protection are biological markers such as disease-specific antibodies which correlate with protection against disease and which are measurable with immunological assays. It is common in vaccine research and in setting immunization policy to rely on threshold values for the correlate where the accepted threshold differentiates between individuals who are considered to be protected against disease and those who are susceptible. Examples where thresholds are used include development of a new generation 13-valent pneumococcal conjugate vaccine which was required in clinical trials to meet accepted thresholds for the older 7-valent vaccine, and public health decision making on vaccination policy based on long-term maintenance of protective thresholds for Hepatitis A, rubella, measles, Japanese encephalitis and others. Despite widespread use of such thresholds in vaccine policy and research, few statistical approaches have been formally developed which specifically incorporate a threshold parameter in order to estimate the value of the protective threshold from data. Methods We propose a 3-parameter statistical model called the a:b model which incorporates parameters for a threshold and constant but different infection probabilities below and above the threshold estimated using profile likelihood or least squares methods. Evaluation of the estimated threshold can be performed by a significance test for the existence of a threshold using a modified likelihood ratio test which follows a chi-squared distribution with 3 degrees of freedom, and confidence intervals for the threshold can be obtained by bootstrapping. The model also permits assessment of relative risk of infection in patients achieving the threshold or not. Goodness-of-fit of the a:b model may be assessed using the Hosmer-Lemeshow approach. The model is applied to 15 datasets from published clinical trials on pertussis, respiratory syncytial virus and varicella. Results Highly significant thresholds with p-values less than 0.01 were found for 13 of the 15 datasets. Considerable variability was seen in the widths of confidence intervals. Relative risks indicated around 70% or better protection in 11 datasets and relevance of the estimated threshold to imply strong protection. Goodness-of-fit was generally acceptable. Conclusions The a:b model offers a formal statistical method of estimation of thresholds differentiating susceptible from protected individuals which has previously depended on putative statements based on visual inspection of data. PMID:23448322
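
    A bare-bones sketch of the a:b idea with invented data: grid-search candidate thresholds t and fit constant infection probabilities a (below t) and b (above t) by maximum likelihood. The profile-likelihood confidence intervals, modified likelihood ratio test and goodness-of-fit assessment described above are omitted:

      import numpy as np

      def fit_ab(titer, infected):
          # Return (threshold, a, b) maximizing the binomial log-likelihood.
          best, best_ll = None, -np.inf
          for t in np.unique(titer)[1:]:
              lo, hi = infected[titer < t], infected[titer >= t]
              ll = 0.0
              for grp in (lo, hi):
                  p = min(max(grp.mean(), 1e-9), 1 - 1e-9)   # guard log(0)
                  ll += np.sum(grp * np.log(p) + (1 - grp) * np.log(1 - p))
              if ll > best_ll:
                  best, best_ll = (t, lo.mean(), hi.mean()), ll
          return best

      rng = np.random.default_rng(3)
      titer = rng.lognormal(mean=2.0, sigma=1.0, size=400)
      infected = (rng.random(400) < np.where(titer < 10.0, 0.45, 0.08)).astype(float)
      print(fit_ab(titer, infected))               # threshold near 10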

  18. Influence of formal maternal education on the use of maternity services in Enugu, Nigeria.

    PubMed

    Ikeako, L C; Onah, H E; Iloabachie, G C

    2006-01-01

    Although some previous studies have suggested formal maternal education as the most potent tool for reducing the maternal mortality ratio in Nigeria, other studies found that the depressed Nigerian economy since 1986 has marginalised the benefits of education, with the result that educated women stopped making use of existing health facilities because they could not afford the cost of health services. This study was carried out to determine the current influence of formal maternal education and other factors on the choice of place of delivery by pregnant women in Enugu, south-eastern Nigeria. It was a pre-tested, interviewer-administered questionnaire study of women who had delivered within 3 months before the date of data collection in the study area. In increasing order of level of care, the outcome variable (place where the last delivery took place) was grouped into seven categories, with home deliveries representing the lowest category and private hospitals run by specialist obstetricians the highest. These were further sub-categorised into non-institutional and institutional deliveries. Maternal educational level was the main predictor variable; other predictor variables were sociodemographic factors. Data analysis was by means of descriptive and inferential statistics, including means, frequencies and chi-square tests at the 95% confidence level. Out of a total of 1,450 women to whom the questionnaires were administered, 1,095 responded (a response rate of 75.5%). A total of 579 (52.9%) of the respondents delivered outside health institutions, while the remaining 516 (47.1%) delivered within health institutions. Regarding educational levels, 301 (27.5%) of the respondents had no formal education, 410 (37.4%) had primary education, 148 (13.5%) secondary education and 236 (21.5%) post-secondary education. There was a significant positive correlation between the educational levels of the respondents and their husbands (r=0.86, p=0.000). With respect to occupational categories, 88 (8.0%) of the respondents belonged to occupational class I, 158 (14.4%) to occupational class II, 107 (9.8%) to occupational class III, 14 (1.3%) to occupational class IV and 728 (66.5%) to occupational class V. There was a significant positive correlation between the respondents' and their husbands' occupational levels (r=0.89, p=0.000). There were statistically significant associations between the choice of institutional or non-institutional delivery and respondents' educational level as well as place of residence (urban/rural), religion, tribe, marital status, occupational level, husband's occupational and educational levels, age and parity (p…

  19. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  20. The Level of Creative Abilities Dimensions According to Torrance Formal Test (B) and Their Relationship with Some Variables (Sex, Age, GPA)

    ERIC Educational Resources Information Center

    Awamleh, Habis; Al Farah, Yacoub; El-Zraigat, Ibrahim

    2012-01-01

    This study aimed to identify the level of dimensions for creative abilities (fluency, flexibility, originality, elaboration) among students in Al Rai Jordanian schools according to the Torrance Formal test, and to investigate the differences in these levels attributable to the study variables (gender, age, grade point average "GPA"). The…

  1. Identifying significant gene‐environment interactions using a combination of screening testing and hierarchical false discovery rate control

    PubMed Central

    Shen, Li; Saykin, Andrew J.; Williams, Scott M.; Moore, Jason H.

    2016-01-01

    ABSTRACT Although gene-environment (G×E) interactions play an important role in many biological systems, detecting these interactions within genome-wide data can be challenging due to the loss in statistical power incurred by multiple hypothesis correction. To address the challenge of poor power and the limitations of existing multistage methods, we recently developed a screening-testing approach for G×E interaction detection that combines elastic net penalized regression with joint estimation to support a single omnibus test for the presence of G×E interactions. In our original work on this technique, however, we did not assess type I error control or power and evaluated the method using just a single, small bladder cancer data set. In this paper, we extend the original method in two important directions and provide a more rigorous performance evaluation. First, we introduce a hierarchical false discovery rate approach to formally assess the significance of individual G×E interactions. Second, to support the analysis of truly genome-wide data sets, we incorporate a score statistic-based prescreening step to reduce the number of single nucleotide polymorphisms prior to fitting the first stage penalized regression model. To assess the statistical properties of our method, we compare the type I error rate and statistical power of our approach with competing techniques using both simple simulation designs as well as designs based on real disease architectures. Finally, we demonstrate the ability of our approach to identify biologically plausible SNP-education interactions relative to Alzheimer's disease status using genome-wide association study data from the Alzheimer's Disease Neuroimaging Initiative (ADNI). PMID:27578615
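
    A minimal sketch of the screening stage under stated assumptions (SNP-by-environment product terms fed to an elastic-net penalized logistic regression; illustrative, not the authors' pipeline):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n, p = 500, 50
      snps = rng.integers(0, 3, size=(n, p)).astype(float)  # 0/1/2 genotypes
      env = rng.normal(size=(n, 1))                         # environment score
      X = np.hstack([snps, env, snps * env])                # main + GxE terms
      logit = 1.2 * snps[:, 0] * env[:, 0] - 0.5            # one true interaction
      y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

      model = LogisticRegression(penalty="elasticnet", solver="saga",
                                 l1_ratio=0.5, C=0.5, max_iter=5000).fit(X, y)
      print("screened-in GxE terms:", np.flatnonzero(model.coef_[0][p + 1:]))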

  2. A study of two unsupervised data driven statistical methodologies for detecting and classifying damages in structural health monitoring

    NASA Astrophysics Data System (ADS)

    Tibaduiza, D.-A.; Torres-Arredondo, M.-A.; Mujica, L. E.; Rodellar, J.; Fritzen, C.-P.

    2013-12-01

    This article is concerned with the practical use of Multiway Principal Component Analysis (MPCA), the Discrete Wavelet Transform (DWT), Squared Prediction Error (SPE) measures and Self-Organizing Maps (SOM) to detect and classify damages in mechanical structures. The formalism is based on a distributed piezoelectric active sensor network for the excitation and detection of structural dynamic responses. Statistical models are built using PCA when the structure is known to be healthy, either directly from the dynamic responses or from wavelet coefficients at different scales representing time-frequency information. Different damages on the tested structures are simulated by adding masses at different positions. The data from the structure in different states (damaged or not) are then projected into the different principal component models by each actuator in order to obtain the input feature vectors for a SOM from the scores and the SPE measures. An aircraft fuselage from an Airbus A320 and a multi-layered carbon fiber reinforced plastic (CFRP) plate are used as examples to test the approaches. Results are presented, compared and discussed in order to determine their potential in structural health monitoring. These results showed that all the simulated damages were detectable, and the selected features proved capable of separating all damage conditions from the undamaged state for both approaches.
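
    The healthy-baseline PCA plus SPE step can be sketched in a few lines (illustrative only; the article couples this with wavelet features and self-organizing maps):

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)
      healthy = rng.normal(size=(200, 30)) @ rng.normal(size=(30, 30))
      pca = PCA(n_components=5).fit(healthy)       # baseline (healthy) model

      def spe(X):                                  # squared prediction error
          recon = pca.inverse_transform(pca.transform(X))
          return np.sum((X - recon) ** 2, axis=1)

      limit = np.percentile(spe(healthy), 99)      # empirical control limit
      damaged = healthy[:5] + 3.0 * rng.normal(size=(5, 30))  # simulated damage
      print(spe(damaged) > limit)                  # True -> flagged as damaged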

  3. Precision, Reliability, and Effect Size of Slope Variance in Latent Growth Curve Models: Implications for Statistical Power Analysis

    PubMed Central

    Brandmaier, Andreas M.; von Oertzen, Timo; Ghisletta, Paolo; Lindenberger, Ulman; Hertzog, Christopher

    2018-01-01

    Latent Growth Curve Models (LGCM) have become a standard technique to model change over time. Prediction and explanation of inter-individual differences in change are major goals in lifespan research. The major determinants of statistical power to detect individual differences in change are the magnitude of true inter-individual differences in linear change (LGCM slope variance), design precision, alpha level, and sample size. Here, we show that design precision can be expressed as the inverse of effective error. Effective error is determined by instrument reliability and the temporal arrangement of measurement occasions. However, it also depends on another central LGCM component, the variance of the latent intercept and its covariance with the latent slope. We derive a new reliability index for LGCM slope variance—effective curve reliability (ECR)—by scaling slope variance against effective error. ECR is interpretable as a standardized effect size index. We demonstrate how effective error, ECR, and statistical power for a likelihood ratio test of zero slope variance formally relate to each other and how they function as indices of statistical power. We also provide a computational approach to derive ECR for arbitrary intercept-slope covariance. With practical use cases, we argue for the complementary utility of the proposed indices of a study's sensitivity to detect slope variance when making a priori longitudinal design decisions or communicating study designs. PMID:29755377
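
    As a toy rendering under strong simplifying assumptions (a slope-only, OLS-style effective error; the paper's derivation also involves the intercept variance and intercept-slope covariance), spreading measurement occasions lowers effective error and raises ECR:

      import numpy as np

      def ecr(slope_var, occasions, resid_var):
          t = np.asarray(occasions, dtype=float)
          eff_err = resid_var / np.sum((t - t.mean()) ** 2)  # effective error
          return slope_var / (slope_var + eff_err)           # reliability-like

      print(ecr(0.25, [0, 1, 2, 3], 1.0))          # tighter design, lower ECR
      print(ecr(0.25, [0, 2, 4, 6], 1.0))          # wider spacing, higher ECR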

  4. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 1: Theoretical development and application to yearly predictions for selected cities in the United States

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1986-01-01

    A rain attenuation prediction model is described for use in calculating satellite communication link availability for any specific location in the world that is characterized by an extended record of rainfall. Such a formalism is necessary for the accurate assessment of such availability predictions in the case of the small user-terminal concept of the Advanced Communication Technology Satellite (ACTS) Project. The model employs the theory of extreme value statistics to generate the necessary statistical rainrate parameters from rain data in the form compiled by the National Weather Service. These location dependent rain statistics are then applied to a rain attenuation model to obtain a yearly prediction of the occurrence of attenuation on any satellite link at that location. The predictions of this model are compared to those of the Crane Two-Component Rain Model and some empirical data and found to be very good. The model is then used to calculate rain attenuation statistics at 59 locations in the United States (including Alaska and Hawaii) for the 20 GHz downlinks and 30 GHz uplinks of the proposed ACTS system. The flexibility of this modeling formalism is such that it allows a complete and unified treatment of the temporal aspects of rain attenuation that leads to the design of an optimum stochastic power control algorithm, the purpose of which is to efficiently counter such rain fades on a satellite link.
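
    The extreme-value ingredient can be illustrated by fitting a Gumbel law to annual-maximum rain rates; the data below are synthetic stand-ins for the National Weather Service records the model actually consumes:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      annual_max = stats.gumbel_r.rvs(loc=40.0, scale=12.0,
                                      size=30, random_state=rng)   # mm/h
      loc, scale = stats.gumbel_r.fit(annual_max)
      rate_99 = stats.gumbel_r.ppf(0.99, loc=loc, scale=scale)
      print(f"rain rate exceeded in 1% of years: {rate_99:.1f} mm/h")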

  5. Differential gene expression detection and sample classification using penalized linear regression models.

    PubMed

    Wu, Baolin

    2006-02-15

    Differential gene expression detection and sample classification using microarray data have received much research interest recently. Owing to the large number of genes p and small number of samples n (p > n), microarray data analysis poses big challenges for statistical analysis. An obvious problem owing to the 'large p, small n' setting is over-fitting: just by chance, we are likely to find some non-differentially expressed genes that can classify the samples very well. The idea of shrinkage is to regularize the model parameters to reduce the effects of noise and produce reliable inferences. Shrinkage has been successfully applied in microarray data analysis. The SAM statistics proposed by Tusher et al. and the 'nearest shrunken centroid' proposed by Tibshirani et al. are ad hoc shrinkage methods. Both methods are simple, intuitive and prove to be useful in empirical studies. Recently Wu proposed the penalized t/F-statistics with shrinkage by formally using L1-penalized linear regression models for two-class microarray data, showing good performance. In this paper we systematically discuss the use of penalized regression models for analyzing microarray data. We generalize the two-class penalized t/F-statistics proposed by Wu to multi-class microarray data. We formally derive the ad hoc shrunken centroid used by Tibshirani et al. using the L1-penalized regression models. And we show that the penalized linear regression models provide a rigorous and unified statistical framework for sample classification and differential gene expression detection.
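
    A minimal sketch of L1-penalized selection in the two-class, 'large p, small n' setting (a plain logistic lasso here, standing in for the paper's penalized t/F-statistics):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(7)
      n, p = 40, 2000                              # few samples, many genes
      X = rng.normal(size=(n, p))
      y = np.repeat([0, 1], n // 2)
      X[y == 1, :5] += 1.0                         # 5 truly differential genes

      lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
      print("genes kept:", np.flatnonzero(lasso.fit(X, y).coef_[0]))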

  6. General practitioners’ views of clinically led commissioning: cross-sectional survey in England

    PubMed Central

    Moran, Valerie; Checkland, Kath; Coleman, Anna; Spooner, Sharon; Gibson, Jonathan; Sutton, Matt

    2017-01-01

    Objectives Involving general practitioners (GPs) in the commissioning/purchasing of services has been an important element in English health policy for many years. The Health and Social Care Act 2012 handed responsibility for commissioning of the majority of care for local populations to GP-led Clinical Commissioning Groups (CCGs). In this paper, we explore GP attitudes to involvement in commissioning and future intentions for engagement. Design and setting Survey of a random sample of GPs across England in 2015. Method The Eighth National GP Worklife Survey was distributed to GPs in spring 2015. Responses were received from 2611 respondents (response rate = 46%). We compared responses across different GP characteristics and conducted two-sample tests of proportions to identify statistically significant differences in responses across groups. We also used multivariate logistic regression to identify the characteristics associated with wanting a formal CCG role in the future. Results While GPs generally agree that they can add value to aspects of commissioning, only a minority feel that this is an important part of their role. Many current leaders intend to quit in the next 5 years, and there is limited appetite among those not currently in a formal role to take up such a role in the future. CCGs were set up as ‘membership organisations’ but only a minority of respondents reported feeling that they had ‘ownership’ of their local CCG and these were often GPs with formal CCG roles. However, respondents generally agree that the CCG has a legitimate role in influencing the work that they do. Conclusion CCGs need to engage in active succession planning to find the next generation of GP leaders. GPs believe that CCGs have a legitimate role in influencing their work, suggesting that there may be scope for CCGs to involve GPs more fully in roles short of formal leadership. PMID:28596217

  7. General practitioners' views of clinically led commissioning: cross-sectional survey in England.

    PubMed

    Moran, Valerie; Checkland, Kath; Coleman, Anna; Spooner, Sharon; Gibson, Jonathan; Sutton, Matt

    2017-06-08

    Involving general practitioners (GPs) in the commissioning/purchasing of services has been an important element in English health policy for many years. The Health and Social Care Act 2012 handed responsibility for commissioning of the majority of care for local populations to GP-led Clinical Commissioning Groups (CCGs). In this paper, we explore GP attitudes to involvement in commissioning and future intentions for engagement. Survey of a random sample of GPs across England in 2015. The Eighth National GP Worklife Survey was distributed to GPs in spring 2015. Responses were received from 2611 respondents (response rate = 46%). We compared responses across different GP characteristics and conducted two-sample tests of proportions to identify statistically significant differences in responses across groups. We also used multivariate logistic regression to identify the characteristics associated with wanting a formal CCG role in the future. While GPs generally agree that they can add value to aspects of commissioning, only a minority feel that this is an important part of their role. Many current leaders intend to quit in the next 5 years, and there is limited appetite among those not currently in a formal role to take up such a role in the future. CCGs were set up as 'membership organisations' but only a minority of respondents reported feeling that they had 'ownership' of their local CCG, and these were often GPs with formal CCG roles. However, respondents generally agree that the CCG has a legitimate role in influencing the work that they do. CCGs need to engage in active succession planning to find the next generation of GP leaders. GPs believe that CCGs have a legitimate role in influencing their work, suggesting that there may be scope for CCGs to involve GPs more fully in roles short of formal leadership. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  9. Superstatistics of the Klein-Gordon equation in deformed formalism for modified Dirac delta distribution

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-04-01

    The Klein-Gordon equation is extended in the presence of an Aharonov-Bohm magnetic field for the Cornell potential, and the corresponding wave functions as well as the spectra are obtained. After introducing superstatistics in statistical mechanics, we first derive the effective Boltzmann factor in the deformed formalism with a modified Dirac delta distribution. We then use the concepts of superstatistics to calculate the thermodynamic properties of the system. The well-known results are recovered when the deformation parameter vanishes, and some graphs are plotted for the clarity of our results.
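
    A small numerical illustration of the superstatistics recipe described here, assuming SciPy; since the paper's modified Dirac delta distribution is not reproduced in this record, a Gamma (chi-squared-type) distribution of the inverse temperature stands in for f(β):

        import numpy as np
        from scipy import integrate, stats

        beta0, q = 1.0, 1.1                    # mean inverse temperature; q -> 1 removes fluctuations
        n = 2.0 / (q - 1.0)                    # Gamma shape for chi-squared-type superstatistics
        f = stats.gamma(a=n, scale=beta0 / n)  # fluctuating beta with mean beta0

        def effective_boltzmann(E):
            # B(E) = integral of f(beta) * exp(-beta * E) over beta
            val, _ = integrate.quad(lambda b: f.pdf(b) * np.exp(-b * E), 0.0, np.inf)
            return val

        for E in (0.5, 1.0, 2.0):
            print(E, effective_boltzmann(E), np.exp(-beta0 * E))  # vs ordinary factor

    As the deformation vanishes (q → 1, so f(β) collapses onto β0), the effective factor reduces to the ordinary Boltzmann factor, mirroring the limit discussed in the abstract.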

  10. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.

  11. Incompleteness of Bluetooth protocol conformance test cases

    NASA Astrophysics Data System (ADS)

    Wu, Peng; Gao, Qiang

    2001-10-01

    This paper describes a formal method to verify the completeness of conformance testing, in which not only the Implementation Under Test (IUT) but also the conformance tester is formalized in SDL, so that conformance testing can be performed in a simulator provided with a CASE tool. The protocol set considered is Bluetooth, an open wireless communication technology. Our research results show that the Bluetooth conformance test specification is not complete, in that it has only limited coverage and many important capabilities defined in the Bluetooth core specification are not tested. We also give a detailed report on the missing test cases against the Bluetooth core specification, and provide guidance for future test case generation.

  12. Effects of traditional and discovery instructional approaches on learning outcomes for learners of different intellectual development: A study of chemistry students in Zambia

    NASA Astrophysics Data System (ADS)

    Mulopo, Moses M.; Seymour Fowler, H.

    This study examined the differential effectiveness of traditional and discovery methods of instruction for the teaching of science concepts, understandings about science, and scientific attitudes, to learners at the concrete and formal levels of cognitive development. The dependent variables were achievement, understanding science, and scientific attitude, assessed through the ACS Achievement Test (high school chemistry, Form 1979), the Test on Understanding Science (Form W), and the Test on Scientific Attitude, respectively. Mode of instruction and cognitive development were the independent variables. Subjects were 120 Form IV (11th grade) males enrolled in chemistry classes in Lusaka, Zambia. Sixty of these were concrete reasoners (mean age = 18.23) randomly selected from one of two schools. The remaining 60 subjects were formal reasoners (mean age = 18.06) randomly selected from a second boys' school. Each of these two groups was randomly split into two subgroups of 30 subjects each. Traditional and discovery approaches were randomly assigned to the two subgroups of concrete reasoners and to the two subgroups of formal reasoners. Prior to instruction, the subjects were pretested using the ACS Achievement Test, the Test on Understanding Science, and the Test on Scientific Attitude. Subjects received instruction covering eight chemistry topics during approximately 10 weeks. Posttests followed using the same standard tests. Two-way analysis of covariance, with pretest scores serving as covariates, was used, and the 0.05 level of significance was adopted. Tukey's WSD technique was used as a follow-up test where applicable. It was found that (1) for the formal reasoners, the discovery group earned significantly higher understanding-science scores than the traditional group, while for the concrete reasoners mode of instruction did not make a difference; (2) overall, formal reasoners earned significantly higher achievement scores than concrete reasoners; and (3) in general, subjects taught by the discovery approach earned significantly higher scientific attitude scores than those taught by the traditional approach, while the traditional group outperformed the discovery group in achievement scores. It was concluded that the traditional approach might be an efficient instructional mode for the teaching of scientific facts and principles to high school students, while the discovery approach seemed to be more suitable for teaching scientific attitudes and for promoting understanding about science and scientists among formal operational learners.
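
    A minimal sketch of the two-way analysis of covariance used in this design, assuming statsmodels and invented scores; pretest scores enter as the covariate, with instruction method and reasoning level as crossed factors:

        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        df = pd.DataFrame({
            "posttest": [62, 58, 71, 66, 55, 49, 60, 52],   # illustrative scores
            "pretest":  [40, 38, 45, 42, 39, 35, 41, 37],
            "method":   ["discovery"] * 4 + ["traditional"] * 4,
            "level":    ["formal", "concrete"] * 4,
        })

        # Two-way ANCOVA: posttest adjusted for pretest, with method x level factors.
        model = smf.ols("posttest ~ pretest + C(method) * C(level)", data=df).fit()
        print(anova_lm(model, typ=2))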

  13. Maximum entropy production principle for geostrophic turbulence

    NASA Astrophysics Data System (ADS)

    Sommeria, J.; Bouchet, F.; Chavanis, P. H.

    2003-04-01

    In 2D turbulence, complex stirring leads to the formation of steady organized states, once fine-scale fluctuations have been filtered out. This self-organization can be explained in terms of statistical equilibrium for vorticity, as the most likely outcome of vorticity parcel rearrangements under the constraints of the conservation laws. A mixing entropy describing the vorticity rearrangements is introduced. An extension to the shallow water system has been proposed by Chavanis P.H. and Sommeria J. (2002), Phys. Rev. E. Generalization to multi-layer geostrophic flows is formally straightforward. Outside equilibrium, eddy fluxes should drive the system toward equilibrium, in the spirit of non-equilibrium linear thermodynamics. This can be formalized in terms of a principle of maximum entropy production (MEP), as shown by Robert and Sommeria (1991), Phys. Rev. Lett. 69. A parameterization of eddy fluxes is then obtained, involving an eddy diffusivity plus a drift term acting at larger scale. These two terms balance each other at equilibrium, resulting in a non-trivial steady flow, which is the mean state of the statistical equilibrium. Applications of this eddy parameterization will be presented, in the context of oceanic circulation and Jupiter's Great Red Spot. Quantitative tests will be discussed, obtained by comparisons with direct numerical simulations. Kinetic models, inspired by plasma physics, provide a more precise description of the relaxation toward equilibrium, as shown by Chavanis P.H. (2000), "Quasilinear theory of the 2D Euler equation", Phys. Rev. Lett. 84. This approach provides relaxation equations with a form similar to the MEP, but not identical. In conclusion, the MEP provides the right trends of the system, but its precise justification remains elusive.

  14. Twenty-five years of maximum-entropy principle

    NASA Astrophysics Data System (ADS)

    Kapur, J. N.

    1983-04-01

    The strengths and weaknesses of the maximum entropy principle (MEP) are examined and some challenging problems that remain outstanding at the end of the first quarter century of the principle are discussed. The original formalism of the MEP is presented and its relationship to statistical mechanics is set forth. The use of MEP for characterizing statistical distributions, in statistical inference, nonlinear spectral analysis, transportation models, population density models, models for brand-switching in marketing and vote-switching in elections is discussed. Its application to finance, insurance, image reconstruction, pattern recognition, operations research and engineering, biology and medicine, and nonparametric density estimation is considered.
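
    A compact worked example of the MEP machinery this review surveys, assuming SciPy: the maximum-entropy distribution on {1, ..., 6} subject to a mean constraint has the exponential-family form p_i ∝ exp(λ x_i), with λ chosen to satisfy the constraint (Jaynes' dice problem):

        import numpy as np
        from scipy.optimize import brentq

        x = np.arange(1, 7)        # die faces
        target_mean = 4.5          # observed average constraint

        def mean_given_lambda(lam):
            p = np.exp(lam * x)
            p /= p.sum()
            return p @ x

        lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -5.0, 5.0)
        p = np.exp(lam * x); p /= p.sum()
        print("lambda =", round(lam, 4), "p =", np.round(p, 4))

    With target_mean = 3.5 the solver returns λ = 0 and the uniform distribution, the maximum-entropy answer when the constraint carries no information.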

  15. Quality assurance software inspections at NASA Ames: Metrics for feedback and modification

    NASA Technical Reports Server (NTRS)

    Wenneson, G.

    1985-01-01

    Software inspections--a set of formal technical review procedures held at selected key points during software development in order to find defects in software documents--are described in terms of history, participants, tools, procedures, statistics, and database analysis.

  16. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    NASA Astrophysics Data System (ADS)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift-independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over- or under-massive regions.

  17. The predictive ability of the CHADS2 and CHA2DS2-VASc scores for bleeding risk in atrial fibrillation: the MAQI(2) experience.

    PubMed

    Barnes, Geoffrey D; Gu, Xiaokui; Haymart, Brian; Kline-Rogers, Eva; Almany, Steve; Kozlowski, Jay; Besley, Dennis; Krol, Gregory D; Froehlich, James B; Kaatz, Scott

    2014-08-01

    Guidelines recommend the assessment of stroke and bleeding risk before initiating warfarin anticoagulation in patients with atrial fibrillation. Many of the elements used to predict stroke also overlap with bleeding risk in atrial fibrillation patients, and it is tempting to use stroke risk scores to efficiently estimate bleeding risk. The comparison of stroke risk scores with bleeding risk scores for predicting bleeding has not been thoroughly assessed. 2600 patients at seven anticoagulation clinics were followed from October 2009 to May 2013. Five risk models (CHADS2, CHA2DS2-VASc, HEMORR2HAGES, HAS-BLED and ATRIA) were retrospectively applied to each patient. The primary outcome was the first major bleeding event. Areas under the ROC curves were compared using the C statistic, and net reclassification improvement (NRI) analysis was performed. 110 patients experienced a major bleeding event in 2581.6 patient-years (4.5%/year). Mean follow-up was 1.0 ± 0.8 years. All of the formal bleeding risk scores had a modest predictive value for first major bleeding events (C statistic 0.66-0.69), performing better than the CHADS2 and CHA2DS2-VASc scores (C statistic difference 0.10-0.16). NRI analysis demonstrated a 52-69% and 47-64% improvement of the formal bleeding risk scores over the CHADS2 score and CHA2DS2-VASc score, respectively. The CHADS2 and CHA2DS2-VASc scores did not perform as well as formal bleeding risk scores for prediction of major bleeding in non-valvular atrial fibrillation patients treated with warfarin. All three bleeding risk scores (HAS-BLED, ATRIA and HEMORR2HAGES) performed moderately well. Copyright © 2014 Elsevier Ltd. All rights reserved.
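
    For a binary outcome, the C statistic reported here is the area under the ROC curve. A sketch with scikit-learn on simulated data (the registry's actual scores and outcomes are not reproduced in this record):

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        bleed = rng.binomial(1, 0.05, 5000)                        # 1 = major bleeding event
        chads2 = rng.integers(0, 7, 5000) + bleed * rng.integers(0, 2, 5000)
        has_bled = rng.integers(0, 10, 5000) + bleed * rng.integers(0, 4, 5000)

        # The C statistic for a binary outcome equals the ROC AUC.
        print("CHADS2   C =", round(roc_auc_score(bleed, chads2), 3))
        print("HAS-BLED C =", round(roc_auc_score(bleed, has_bled), 3))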

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  19. 'No man is an island'. Testing the specific role of social isolation in formal thought disorder.

    PubMed

    de Sousa, Paulo; Spray, Amy; Sellwood, William; Bentall, Richard P

    2015-12-15

    Recent work has focused on the role of the environment in psychosis, with emerging evidence that specific psychotic experiences are associated with specific types of adversity. One risk factor that has often been associated with psychosis is social isolation, with studies identifying isolation as an important feature of prodromal psychosis and others reporting that the social networks of psychotic patients are smaller and less dense than those of healthy individuals. In the present study, we tested the prediction that social isolation would be specifically associated with formal thought disorder. Eighty patients diagnosed with a psychosis-spectrum disorder and 30 healthy participants were assessed for formal thought disorder with speech samples acquired during an interview that promoted personal disclosure and an interview targeting everyday topics. Social isolation was significantly associated with formal thought disorder in both the neutral and the salient interview, even when controlling for comorbid hallucinations, delusions and suspiciousness. Hallucinations, delusions and suspiciousness were not associated with social isolation when formal thought disorder was controlled for. Formal thought disorder is robustly and specifically associated with social isolation. Social cognitive mechanisms and processes that may explain this relationship are discussed, as well as implications for clinical practice and future research. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. [The impact of breastfeeding promotion in women with formal employment].

    PubMed

    Brasileiro, Aline Alves; Possobon, Rosana de Fátima; Carrascoza, Karina Camilo; Ambrosano, Gláucia Maria Bovi; Moraes, Antônio Bento Alves de

    2010-09-01

    This study focused on programs to promote breastfeeding in order to prevent early weaning of working mothers' infant children. A non-randomized intervention study was conducted using a survey of mothers who had returned to work after childbirth, including both participants and non-participants in a program to promote breastfeeding. The sample consisted of 200 mothers of infants ranging from 6 to 10 months of age. Factors associated with early weaning were analyzed with the chi-square and Fisher's exact tests and multiple logistic regression (α = 0.05). The results showed statistically significant differences between the groups in relation to exclusive breastfeeding (p < 0.0001) and breastfeeding (p < 0.0001). There was also a statistically significant difference (p = 0.0056) between the groups in relation to the time between childbirth and return to work. There was no difference between the end of maternity leave and weaning time. Mothers who were unable to nurse their infants during the work shift showed 4.98 times higher odds (95%CI: 1.27-19.61) of weaning them before the fourth month of age.

  1. Language experience changes subsequent learning

    PubMed Central

    Onnis, Luca; Thiessen, Erik

    2013-01-01

    What are the effects of experience on subsequent learning? We explored the effects of language-specific word order knowledge on the acquisition of sequential conditional information. Korean and English adults were engaged in a sequence learning task involving three different sets of stimuli: auditory linguistic (nonsense syllables), visual non-linguistic (nonsense shapes), and auditory non-linguistic (pure tones). The forward and backward probabilities between adjacent elements generated two equally probable and orthogonal perceptual parses of the elements, such that any significant preference at test must be due either to general cognitive biases or to prior language-induced biases. We found that language modulated parsing preferences with the linguistic stimuli only. Intriguingly, these preferences are congruent with the dominant word order patterns of each language, as corroborated by corpus analyses, and are driven by probabilistic preferences. Furthermore, although the Korean individuals had received extensive formal explicit training in English and lived in an English-speaking environment, they exhibited statistical learning biases congruent with their native language. Our findings suggest that mechanisms of statistical sequential learning are implicated in language across the lifespan, and that experience with language may affect cognitive processes and later learning. PMID:23200510
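
    A minimal illustration of the forward and backward transitional probabilities manipulated in such sequence-learning studies, computed over a toy syllable stream:

        from collections import Counter

        stream = "pa bi ku pa bi ku ti bu do ti bu do pa bi ku".split()
        pairs = list(zip(stream, stream[1:]))
        pair_n = Counter(pairs)
        first_n = Counter(a for a, _ in pairs)
        second_n = Counter(b for _, b in pairs)

        def forward_tp(a, b):    # P(b | a): how well does a predict the next element?
            return pair_n[(a, b)] / first_n[a]

        def backward_tp(a, b):   # P(a | b): how well does b predict the previous element?
            return pair_n[(a, b)] / second_n[b]

        print("forward P(bi|pa) =", forward_tp("pa", "bi"))   # within "word": 1.0
        print("forward P(pa|ku) =", forward_tp("ku", "pa"))   # across boundary: 0.5

    Stimuli can be constructed so that forward and backward statistics favour different parses, which is how such a study pits language-induced biases against general cognitive ones.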

  2. Stochastic modeling of sunshine number data

    NASA Astrophysics Data System (ADS)

    Brabec, Marek; Paulescu, Marius; Badescu, Viorel

    2013-11-01

    In this paper, we present a unified statistical modeling framework for the estimation and forecasting of sunshine number (SSN) data. The sunshine number was proposed earlier to describe sunshine time series in qualitative terms (Theor Appl Climatol 72 (2002) 127-136) and has since been shown to be useful not only for theoretical purposes but also for practical considerations, e.g. those related to the development of photovoltaic energy production. Statistical modeling and prediction of SSN as a binary time series has, however, been a challenging problem. Our statistical model for SSN time series is based on an underlying stochastic process formulation of Markov chain type. We show how its transition probabilities can be efficiently estimated within a logistic regression framework. In fact, our logistic Markovian model can be fitted relatively easily via a maximum likelihood approach. This is optimal in many respects, and it also enables us to use formalized statistical inference theory to obtain not only point estimates of the transition probabilities and their functions of interest, but also the related uncertainties, as well as to test various hypotheses of practical interest. It is straightforward to deal with non-homogeneous transition probabilities in this framework. Very importantly, from both physical and practical points of view, the logistic Markov model class allows us to test hypotheses about how SSN depends on various external covariates (e.g. elevation angle, solar time, etc.) and about details of the dynamic model (order and functional shape of the Markov kernel, etc.). Therefore, using the generalized additive model (GAM) approach, we can fit and compare models of various complexity that retain the physical interpretation of the statistical model and its parts. After introducing the Markovian model and a general approach for the identification of its parameters, we illustrate its use and performance on high-resolution SSN data from the Solar Radiation Monitoring Station of the West University of Timisoara.
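
    A hedged sketch of the core modeling idea, assuming statsmodels and a simulated binary series: a first-order Markov chain for SSN whose transition probabilities are estimated by logistic regression on the lagged state (external covariates such as solar elevation would enter the same formula additively, or via smooth terms in a GAM):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        ssn = [1]
        for _ in range(499):                      # simulate a sticky two-state chain
            p = 0.9 if ssn[-1] == 1 else 0.2
            ssn.append(rng.binomial(1, p))

        df = pd.DataFrame({"ssn": ssn})
        df["lag1"] = df["ssn"].shift(1)
        fit = smf.logit("ssn ~ lag1", data=df.dropna()).fit(disp=0)
        print(fit.params)   # intercept and lag coefficient define the transition matrix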

  3. Stochastic modeling of sunshine number data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brabec, Marek, E-mail: mbrabec@cs.cas.cz; Paulescu, Marius; Badescu, Viorel

    2013-11-13

    In this paper, we present a unified statistical modeling framework for the estimation and forecasting of sunshine number (SSN) data. The sunshine number was proposed earlier to describe sunshine time series in qualitative terms (Theor Appl Climatol 72 (2002) 127-136) and has since been shown to be useful not only for theoretical purposes but also for practical considerations, e.g. those related to the development of photovoltaic energy production. Statistical modeling and prediction of SSN as a binary time series has, however, been a challenging problem. Our statistical model for SSN time series is based on an underlying stochastic process formulation of Markov chain type. We show how its transition probabilities can be efficiently estimated within a logistic regression framework. In fact, our logistic Markovian model can be fitted relatively easily via a maximum likelihood approach. This is optimal in many respects, and it also enables us to use formalized statistical inference theory to obtain not only point estimates of the transition probabilities and their functions of interest, but also the related uncertainties, as well as to test various hypotheses of practical interest. It is straightforward to deal with non-homogeneous transition probabilities in this framework. Very importantly, from both physical and practical points of view, the logistic Markov model class allows us to test hypotheses about how SSN depends on various external covariates (e.g. elevation angle, solar time, etc.) and about details of the dynamic model (order and functional shape of the Markov kernel, etc.). Therefore, using the generalized additive model (GAM) approach, we can fit and compare models of various complexity that retain the physical interpretation of the statistical model and its parts. After introducing the Markovian model and a general approach for the identification of its parameters, we illustrate its use and performance on high-resolution SSN data from the Solar Radiation Monitoring Station of the West University of Timisoara.

  4. Bayesian test for colocalisation between pairs of genetic association studies using summary statistics.

    PubMed

    Giambartolomei, Claudia; Vukcevic, Damjan; Schadt, Eric E; Franke, Lude; Hingorani, Aroon D; Wallace, Chris; Plagnol, Vincent

    2014-05-01

    Genetic association studies, in particular the genome-wide association study (GWAS) design, have provided a wealth of novel insights into the aetiology of a wide range of human diseases and traits, in particular cardiovascular diseases and lipid biomarkers. The next challenge consists of understanding the molecular basis of these associations. The integration of multiple association datasets, including gene expression datasets, can contribute to this goal. We have developed a novel statistical methodology to assess whether two association signals are consistent with a shared causal variant. An application is the integration of disease scans with expression quantitative trait locus (eQTL) studies, but any pair of GWAS datasets can be integrated in this framework. We demonstrate the value of the approach by re-analysing a gene expression dataset in 966 liver samples with a published meta-analysis of lipid traits including >100,000 individuals of European ancestry. Combining all lipid biomarkers, our re-analysis supported 26 out of 38 reported colocalisation results with eQTLs and identified 14 new colocalisation results, hence highlighting the value of a formal statistical test. In three cases of reported eQTL-lipid pairs (SYPL2, IFT172, TBKBP1) for which our analysis suggests that the eQTL pattern is not consistent with the lipid association, we identify alternative colocalisation results with SORT1, GCKR, and KPNB1, indicating that these genes are more likely to be causal in these genomic intervals. A key feature of the method is the ability to derive the output statistics from single SNP summary statistics, hence making it possible to perform systematic meta-analysis type comparisons across multiple GWAS datasets (implemented online at http://coloc.cs.ucl.ac.uk/coloc/). Our methodology provides information about candidate causal genes in associated intervals and has direct implications for the understanding of complex diseases as well as the design of drugs to target disease pathways.
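
    A sketch of the kind of per-SNP Bayes factor computable from summary statistics alone, in the style of Wakefield's approximation; the prior variance W below is an assumption, and the full colocalisation method combines such per-SNP factors across a region into posterior probabilities for shared- versus distinct-variant hypotheses:

        import numpy as np

        def approx_bayes_factor(beta_hat, se, W=0.15**2):
            # Approximate Bayes factor in favour of association for one SNP,
            # from its effect estimate and standard error, with a N(0, W) effect prior.
            V = se ** 2
            z = beta_hat / se
            r = W / (V + W)
            return np.sqrt(1.0 - r) * np.exp(z ** 2 * r / 2.0)

        # Hypothetical summary statistics for one SNP in two studies (GWAS and eQTL):
        print(approx_bayes_factor(0.12, 0.02))
        print(approx_bayes_factor(0.30, 0.10))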

  5. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    NASA Astrophysics Data System (ADS)

    dell'Anno, Fabio; de Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n -mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  6. Structure of multiphoton quantum optics. I. Canonical formalism and homodyne squeezed states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dell'Anno, Fabio; De Siena, Silvio; Illuminati, Fabrizio

    2004-03-01

    We introduce a formalism of nonlinear canonical transformations for general systems of multiphoton quantum optics. For single-mode systems the transformations depend on a tunable free parameter, the homodyne local-oscillator angle; for n-mode systems they depend on n heterodyne mixing angles. The canonical formalism realizes nontrivial mixing of pairs of conjugate quadratures of the electromagnetic field in terms of homodyne variables for single-mode systems, and in terms of heterodyne variables for multimode systems. In the first instance the transformations yield nonquadratic model Hamiltonians of degenerate multiphoton processes and define a class of non-Gaussian, nonclassical multiphoton states that exhibit properties of coherence and squeezing. We show that such homodyne multiphoton squeezed states are generated by unitary operators with a nonlinear time evolution that realizes the homodyne mixing of a pair of conjugate quadratures. Tuning of the local-oscillator angle allows us to vary at will the statistical properties of such states. We discuss the relevance of the formalism for the study of degenerate (up-)down-conversion processes. In a companion paper [F. Dell'Anno, S. De Siena, and F. Illuminati, 69, 033813 (2004)], we provide the extension of the nonlinear canonical formalism to multimode systems, we introduce the associated heterodyne multiphoton squeezed states, and we discuss their possible experimental realization.

  7. A spatial exploration of informal trail networks within Great Falls Park, VA

    USGS Publications Warehouse

    Wimpey, Jeremy; Marion, Jeffrey L.

    2011-01-01

    Informal (visitor-created) trails represent a threat to the natural resources of protected natural areas around the globe. These trails can remove vegetation, displace wildlife, alter hydrology, alter habitat, spread invasive species, and fragment landscapes. This study examines informal and formal trails within Great Falls Park, VA, a sub-unit of the George Washington Memorial Parkway, managed by the U.S. National Park Service. This study sought to answer three specific questions: 1) Are the physical characteristics and topographic alignments of informal trails significantly different from those of formal trails? 2) Can landscape fragmentation metrics be used to summarize the relative impacts of formal and informal trail networks on a protected natural area? 3) What can we learn from examining the spatial distribution of informal trails within protected natural areas? Statistical comparisons between formal and informal trails in this park indicate that informal trails have less sustainable topographic alignments than their formal counterparts. Spatial summaries of the lineal and areal extent and fragmentation associated with the trail networks by park management zones compare park management goals with the assessed attributes. Hot spot analyses highlight areas of high trail density within the park, and findings provide insights regarding potential causes for the development of dense informal trail networks.

  8. Who Cares? Infant Educators' Responses to Professional Discourses of Care

    ERIC Educational Resources Information Center

    Davis, Belinda; Degotardi, Sheila

    2015-01-01

    This paper explores the construction of "care" in early childhood curriculum and practice. An increasing number of infants are attending formal early childhood settings in Australia (Australian Bureau of Statistics, 2011. "Childhood education and care, Australia, June 2011." (4402.0). Retrieved from…

  9. Tertiary Education and Training in Australia, 2010

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2012

    2012-01-01

    This publication presents information on tertiary education and training during 2010, including statistics on participation and outcomes. The definition of tertiary education and training adopted for this publication is formal study in vocational education and training (VET) and higher education, including enrolments in Australian Qualifications…

  10. Large deviation principle at work: Computation of the statistical properties of the exact one-point aperture mass

    NASA Astrophysics Data System (ADS)

    Reimberg, Paulo; Bernardeau, Francis

    2018-01-01

    We present a formalism based on the large deviation principle (LDP) applied to cosmological density fields, and more specifically to arbitrary functionals of density profiles, and we apply it to the derivation of the cumulant generating function and one-point probability distribution function (PDF) of the aperture mass (Map), a common observable for cosmic shear observations. We show that the LDP can indeed be used in practice for a much larger family of observables than previously envisioned, such as those built from continuous and nonlinear functionals of density profiles. Taking advantage of this formalism, we extend previous results, which were based on crude definitions of the aperture mass, with top-hat windows and the use of the reduced shear approximation (replacing the reduced shear with the shear itself). We are able to quantify precisely how this latter approximation affects the Map statistical properties. In particular, we derive the corrective term for the skewness of the Map and reconstruct its one-point PDF.

  11. Educating the Next Generation of Geoscientists: Strategies for Formal and Informal Settings

    NASA Astrophysics Data System (ADS)

    Burrell, S.

    2013-12-01

    ENGAGE, Educating the Next Generation of Geoscientists, is an effort funded by the National Science Foundation to provide academic opportunities for members of underrepresented groups to learn geology in formal and informal settings through collaboration with other universities and science organizations. The program design tests the hypothesis that a set of strategies will remove the academic, social and economic obstacles that have traditionally hindered members of underrepresented groups from participation in the geosciences, and will result in an increase in geoscience literacy and enrollment. These strategies include developing a culture of ongoing dialogue around science issues through special guest lectures and workshops; creating opportunities for mentorship through informal lunches; incorporating experiential learning in the field into the geoscience curriculum in lower-division courses; partnership-building through the provision of paid summer internships and research opportunities; enabling students to participate in professional conferences; and engaging family members in science education through family science nights and special presentations. Student feedback and anecdotal evidence indicate an increased interest in geology as a course of study and an increased awareness of the relevance of geology to everyday life. Preliminary statistics from two years of program implementation indicate increased student comprehension of Earth science concepts and ability to use data to identify trends in the natural environment.

  12. Apes are intuitive statisticians.

    PubMed

    Rakoczy, Hannes; Clüver, Annette; Saucke, Liane; Stoffregen, Nicole; Gräbener, Alice; Migura, Judith; Call, Josep

    2014-04-01

    Inductive learning and reasoning, as we use it both in everyday life and in science, is characterized by flexible inferences based on statistical information: inferences from populations to samples and vice versa. Many forms of such statistical reasoning have been found to develop late in human ontogeny, depending on formal education and language, and to be fragile even in adults. New revolutionary research, however, suggests that even preverbal human infants make use of intuitive statistics. Here, we conducted the first investigation of such intuitive statistical reasoning with non-human primates. In a series of 7 experiments, Bonobos, Chimpanzees, Gorillas and Orangutans drew flexible statistical inferences from populations to samples. These inferences, furthermore, were truly based on statistical information regarding the relative frequency distributions in a population, and not on absolute frequencies. Intuitive statistics in its most basic form is thus an evolutionarily more ancient rather than a uniquely human capacity. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.

    PubMed

    Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas

    2016-06-17

    Recently, semirational approaches that rely on the combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not feasible to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to design libraries of pathway variants for a DOE framework more efficiently and scalably, and uniquely provides a means to flexibly balance the design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
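
    A toy illustration of why DOE libraries can be far smaller than exhaustive ones, using a Latin-square fraction of a 3-factor, 3-level pathway library; this is generic DOE reasoning, not Double Dutch's actual grammar or algorithms:

        from itertools import product

        levels = [0, 1, 2]                                 # e.g. three variants per part
        full = list(product(levels, repeat=3))             # promoter x RBS x CDS = 27 builds

        # Latin-square fraction: third factor fixed at (i + j) mod 3. Every pair of
        # factors still covers all 9 level combinations exactly once (orthogonality),
        # so main effects are estimable from 9 constructs instead of 27.
        fraction = [(i, j, (i + j) % 3) for i in levels for j in levels]
        print(len(full), "variants in the full library")
        print(len(fraction), "variants in the orthogonal fraction")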

  14. Beyond statistical inference: a decision theory for science.

    PubMed

    Killeen, Peter R

    2006-08-01

    Traditional null hypothesis significance testing does not yield the probability of the null or its alternative and, therefore, cannot logically ground scientific decisions. The decision theory proposed here calculates the expected utility of an effect on the basis of (1) the probability of replicating it and (2) a utility function on its size. It takes significance tests--which place all value on the replicability of an effect and none on its magnitude--as a special case, one in which the cost of a false positive is revealed to be an order of magnitude greater than the value of a true positive. More realistic utility functions credit both replicability and effect size, integrating them for a single index of merit. The analysis incorporates opportunity cost and is consistent with alternate measures of effect size, such as r2 and information transmission, and with Bayesian model selection criteria. An alternate formulation is functionally equivalent to the formal theory, transparent, and easy to compute.
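
    A rough sketch of the quantities in this proposal, under common large-sample formulas; the quadratic utility on effect size is purely an illustrative assumption, not Killeen's specific choice:

        import numpy as np
        from scipy.stats import norm

        def p_rep(d, n1, n2):
            # Probability an exact replication yields an effect of the same sign:
            # Phi(d / (sd_d * sqrt(2))), with sd_d the sampling SD of Cohen's d.
            sd_d = np.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
            return norm.cdf(d / (sd_d * np.sqrt(2.0)))

        def expected_utility(d, n1, n2, utility=lambda d: d ** 2):
            # Replicability weighted by a utility function on effect size.
            return p_rep(d, n1, n2) * utility(d)

        print(round(p_rep(0.5, 30, 30), 3))
        print(round(expected_utility(0.5, 30, 30), 3))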

  15. The Ontology of Biological and Clinical Statistics (OBCS) for standardized and reproducible statistical analysis.

    PubMed

    Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun

    2016-09-14

    Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results, because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs . OBCS is a community-based open source ontology in the domain of biological and clinical statistics; it represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.

  16. Medical statistics and hospital medicine: the case of the smallpox vaccination.

    PubMed

    Rusnock, Andrea

    2007-01-01

    Between 1799 and 1806, trials of vaccination to determine its safety and efficacy were undertaken in hospitals in London, Paris, Vienna, and Boston. These trials were among the first instances of formal hospital evaluations of a medical procedure and signal a growing acceptance of a relatively new approach to medical practice. These early evaluations of smallpox vaccination also relied on descriptive and quantitative accounts, as well as probabilistic analyses, and thus occupy a significant, yet hitherto unexamined, place in the history of medical statistics.

  17. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  18. [Formal sample size calculation and its limited validity in animal studies of medical basic research].

    PubMed

    Mayer, B; Muche, R

    2013-01-01

    Animal studies are highly relevant for basic medical research, although their use is discussed controversially in public. From a biometrical point of view, an optimal sample size should therefore be the aim of these projects. Statistical sample size calculation is usually the appropriate methodology for planning medical research projects. However, the information required is often not valid, or becomes available only during the course of an animal experiment. This article critically discusses the validity of formal sample size calculation for animal studies. Within the discussion, some requirements are formulated to fundamentally regulate the process of sample size determination for animal experiments.
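
    A sketch of the formal calculation under discussion, assuming statsmodels: the required inputs (expected effect size, alpha, power) are precisely the quantities that are often invalid or unavailable when an animal experiment is planned:

        from statsmodels.stats.power import TTestIndPower

        # Two-sided two-sample t-test: animals per group for a large effect (d = 1.0).
        n_per_group = TTestIndPower().solve_power(effect_size=1.0, alpha=0.05, power=0.8)
        print(f"about {n_per_group:.1f} animals per group")   # roughly 17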

  19. Collective Bargaining: Its Impact on Educational Cost.

    ERIC Educational Resources Information Center

    Atherton, P. J.

    Since the Ontario (Canada) legislation in 1975 that formalized collective bargaining for teachers, public concern has focused on collective bargaining as the possible cause of recent enrollment declines and increases in schooling costs. However, according to Ontario provincial statistics, enrollment in elementary schools had begun to decline…

  20. Heuristic Elements of Plausible Reasoning.

    ERIC Educational Resources Information Center

    Dudczak, Craig A.

    At least some of the reasoning processes involved in argumentation rely on inferences which do not fit within the traditional categories of inductive or deductive reasoning. The reasoning processes involved in plausibility judgments have neither the formal certainty of deduction nor the imputed statistical probability of induction. When utilizing…

  1. LSD Now: 1973

    ERIC Educational Resources Information Center

    Chunko, John A.

    1973-01-01

    LSD NOW is a nationwide statistical survey and analysis of hallucinogenic drug use by individuals presently in formal educational surroundings. The analysis, concentrating on the extent of and rationale for the use of such drugs, now offers a deeper and more meaningful understanding of a particular facet of the drug culture. This understanding…

  2. Love and Sex: Can We Talk About That in School?

    ERIC Educational Resources Information Center

    Vance, Paul C.

    1985-01-01

    Gives statistical information on the "national epidemic" of teenage sexual activity and pregnancy and its consequences. Discusses social causes of this problem. Proposes that schools can help solve the problem by providing a formal sex education curriculum for pupils in kindergarten through grade 12. (CB)

  3. Approaching Bose-Einstein Condensation

    ERIC Educational Resources Information Center

    Ferrari, Loris

    2011-01-01

    Bose-Einstein condensation (BEC) is discussed at the level of an advanced course of statistical thermodynamics, clarifying some formal and physical aspects that are usually not covered by the standard pedagogical literature. The non-conventional approach adopted starts by showing that the continuum limit, in certain cases, cancels out the crucial…

  4. 49 CFR 236.923 - Task analysis and basic requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...

  5. Acquisition of Formal Operations: The Effects of Two Training Procedures.

    ERIC Educational Resources Information Center

    Rosenthal, Doreen A.

    1979-01-01

    A study of 11- and 12-year-old girls indicates that either of two training procedures, method training or dimension training, can aid in the transition from concrete operational to formal operational thought by promoting a hypothesis-testing attitude. (BH)

  6. Statistical analysis plan of the head position in acute ischemic stroke trial pilot (HEADPOST pilot).

    PubMed

    Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M

    2017-02-01

    Background: The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial, with masked outcome assessment. The trial will test whether the lying-flat head position, initiated in patients within 12 h of onset of acute ischemic stroke involving the anterior circulation, increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries with the ability to perform early transcranial Doppler. A feature of this trial was that patients were randomized to a certain position according to the month of admission to hospital. Objective: To outline in detail the predetermined statistical analysis plan for the HEADPOST Pilot study. Methods: All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item appropriate descriptive statistical analyses are planned, with comparisons made between randomized groups. For the outcomes, the statistical comparisons to be made between groups are planned and described. Results: This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study to be transparent, available, verifiable, and predetermined before data lock. Conclusions: We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration: The study is registered under HEADPOST-Pilot, ClinicalTrials.gov Identifier NCT01706094.

  7. A survey of the sociodemographic and educational characteristics of oral health technicians in public primary health care teams in Minas Gerais, Brazil.

    PubMed

    Abreu, Mauro Henrique Nogueira Guimarães; Sanglard-Oliveira, Carla Aparecida; Jaruche, Abdul Rahman Mustafá; Mambrini, Juliana Vaz de Melo; Werneck, Marcos Azeredo Furquim; Lucas, Simone Dutra

    2013-12-23

    To describe some sociodemographic and educational characteristics of oral health technicians (OHTs) in public primary health care teams in the state of Minas Gerais, Brazil. A cross-sectional descriptive study was performed based on a telephone survey of a representative sample comprising 231 individuals. A pre-tested instrument was used for the data collection, including questions on gender, age in years, years of work as an OHT, years since graduation as an OHT, formal schooling, individual monthly income, and participation in continuing education programmes. Descriptive statistics were computed, and clusters were formed using the agglomerative hierarchical technique based on the furthest neighbour (complete linkage), using age, years of work as an OHT, time since graduation as an OHT, formal schooling, individual monthly income, and participation in continuing education programmes. Most interviewees (97.1%) were female. A monthly income of USD 300.00 to 600.00 was reported by 77.5% of the sample. Having educational qualifications in excess of their role was reported by approximately 20% of the participants. The median time since graduation was six years, and half of the sample had worked for four years as an OHT. Most interviewees (67.6%) reported having participated in professional continuing education programmes. Two different clusters were identified based on the sociodemographic and educational characteristics of the sample. The Brazilian OHTs in public primary health care teams in the state of Minas Gerais are mostly female, have graduated relatively recently, have limited working experience, and have formal schooling sufficient for professional practice.
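
    A minimal sketch of the clustering technique named here, assuming SciPy and standardized synthetic features; "furthest neighbour" corresponds to complete linkage:

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(3)
        # 231 technicians x 6 standardized features (age, years worked, years since
        # graduation, schooling, income, continuing-education participation) -- simulated.
        X = np.vstack([rng.normal(0, 1, (100, 6)), rng.normal(2, 1, (131, 6))])

        Z = linkage(X, method="complete", metric="euclidean")   # furthest-neighbour linkage
        labels = fcluster(Z, t=2, criterion="maxclust")         # cut tree into two clusters
        print(np.bincount(labels)[1:])                          # cluster sizes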

  8. A survey of the sociodemographic and educational characteristics of oral health technicians in public primary health care teams in Minas Gerais, Brazil

    PubMed Central

    2013-01-01

    Background: To describe some sociodemographic and educational characteristics of oral health technicians (OHTs) in public primary health care teams in the state of Minas Gerais, Brazil. Methods: A cross-sectional descriptive study was performed based on a telephone survey of a representative sample comprising 231 individuals. A pre-tested instrument was used for the data collection, including questions on gender, age in years, years of work as an OHT, years since graduation as an OHT, formal schooling, individual monthly income, and participation in continuing education programmes. Descriptive statistics were computed, and clusters were formed using the agglomerative hierarchical technique based on the furthest neighbour (complete linkage), using age, years of work as an OHT, time since graduation as an OHT, formal schooling, individual monthly income, and participation in continuing education programmes. Results: Most interviewees (97.1%) were female. A monthly income of USD 300.00 to 600.00 was reported by 77.5% of the sample. Having educational qualifications in excess of their role was reported by approximately 20% of the participants. The median time since graduation was six years, and half of the sample had worked for four years as an OHT. Most interviewees (67.6%) reported having participated in professional continuing education programmes. Two different clusters were identified based on the sociodemographic and educational characteristics of the sample. Conclusions: The Brazilian OHTs in public primary health care teams in the state of Minas Gerais are mostly female, have graduated relatively recently, have limited working experience, and have formal schooling sufficient for professional practice. PMID:24365451

  9. Representation of research hypotheses

    PubMed Central

    2011-01-01

    Background: Hypotheses are now being produced automatically on an industrial scale by computers in biology; e.g., the annotation of a genome is essentially a large set of hypotheses generated by sequence similarity programs, and robot scientists enable the full automation of a scientific investigation, including the generation and testing of research hypotheses. Results: This paper proposes a logically defined way of recording automatically generated hypotheses in a machine-amenable way. The proposed formalism allows the description of complete hypothesis sets as specified input and output for scientific investigations. The formalism supports the decomposition of research hypotheses into more specialised hypotheses if that is required by an application. Hypotheses are represented in an operational way: it is possible to design an experiment to test them. The explicit formal description of research hypotheses promotes the explicit formal description of the results and conclusions of an investigation. The paper also proposes a framework for automated hypothesis generation. We demonstrate how the key components of the proposed framework are implemented in the Robot Scientist “Adam”. Conclusions: A formal representation of automatically generated research hypotheses can help to improve the way humans produce, record, and validate research hypotheses. Availability: http://www.aber.ac.uk/en/cs/research/cb/projects/robotscientist/results/ PMID:21624164

  10. Definition of osteoarthritis on MRI: results of a Delphi exercise.

    PubMed

    Hunter, D J; Arden, N; Conaghan, P G; Eckstein, F; Gold, G; Grainger, A; Guermazi, A; Harvey, W; Jones, G; Hellio Le Graverand, M P; Laredo, J D; Lo, G; Losina, E; Mosher, T J; Roemer, F; Zhang, W

    2011-08-01

    Despite a growing body of Magnetic Resonance Imaging (MRI) literature in osteoarthritis (OA), there is little uniformity in its diagnostic application. The objective of our research was to develop an MRI definition of structural OA; we envisage, in the first instance, the definition requiring further validation and testing in the research setting before implementation/feasibility testing in the clinical setting is considered. We undertook a multistage process consisting of a number of different steps. The intent was to develop testable definitions of OA (knee, hip and/or hand) on MRI. This was an evidence-driven approach, with the results of a systematic review provided to the group prior to a Delphi exercise. Each participant of the steering group was allowed to submit independently up to five propositions related to key aspects of the MRI diagnosis of knee OA. The steering group then participated in a Delphi exercise to reach consensus on which propositions we would recommend for a definition of structural OA on MRI. For each round of voting, ≥60% of votes led to inclusion and ≤20% of votes led to exclusion of a proposition. After the propositions were developed, one of the definitions was tested for its validity against radiographic OA in an extant database. For the systematic review we identified 25 studies which met all of our inclusion criteria and contained relevant diagnostic measure and performance data. At the completion of the Delphi voting exercise, 11 propositions were accepted for the definition of structural OA on MRI. We assessed the diagnostic performance of the tibiofemoral MRI definition against a radiographic reference standard. The diagnostic performance for individual features was: osteophyte C statistic=0.61, cartilage loss C statistic=0.73, bone marrow lesions C statistic=0.72, and meniscus tear in any region C statistic=0.78. The overall composite model for these four features had a C statistic of 0.59. We detected good specificity (1) but less optimal sensitivity (0.46), likely due to detection of disease earlier on MRI. We have developed an MRI definition of knee OA that requires further formal testing with regard to its diagnostic performance (especially in datasets of persons with early disease) before it is more widely used. Our current analysis suggests that further testing should focus on comparisons other than radiography that may capture later-stage disease and thus nullify the potential for detecting early disease that MRI may afford. The propositions are not intended to detract from, nor to discourage, the use of traditional means of diagnosing OA. Copyright © 2011 Osteoarthritis Research Society International. All rights reserved.
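
    The C statistic reported for each MRI feature is the area under the ROC curve of that feature against the binary radiographic reference standard. A minimal sketch, with invented labels and scores rather than the study's data:

    ```python
    # C statistic (ROC AUC) of a continuous MRI feature score against a
    # binary radiographic reference standard. Data below are made up.
    from sklearn.metrics import roc_auc_score

    radiographic_oa = [0, 0, 1, 1, 1, 0, 1, 0]            # reference standard
    mri_score = [0.1, 0.4, 0.35, 0.8, 0.7, 0.2, 0.9, 0.5]  # feature score
    print(roc_auc_score(radiographic_oa, mri_score))       # -> 0.875 here
    ```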

  11. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large-scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing, with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  12. Statistical science: a grammar for research.

    PubMed

    Cox, David R

    2017-06-01

    I greatly appreciate the invitation to give this lecture with its century-long history. The title is a warning that the lecture is rather discursive and not highly focused and technical. The theme is simple: statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful, these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.

  13. EFFICIENTLY ESTABLISHING CONCEPTS OF INFERENTIAL STATISTICS AND HYPOTHESIS DECISION MAKING THROUGH CONTEXTUALLY CONTROLLED EQUIVALENCE CLASSES

    PubMed Central

    Fienup, Daniel M; Critchfield, Thomas S

    2010-01-01

    Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904

  14. Does Formal Research Training Lead to Academic Success in Plastic Surgery? A Comprehensive Analysis of U.S. Academic Plastic Surgeons.

    PubMed

    Lopez, Joseph; Ameri, Afshin; Susarla, Srinivas M; Reddy, Sashank; Soni, Ashwin; Tong, J W; Amini, Neda; Ahmed, Rizwan; May, James W; Lee, W P Andrew; Dorafshar, Amir

    2016-01-01

    It is currently unknown whether formal research training has an influence on academic advancement in plastic surgery. The purpose of this study was to determine whether formal research training was associated with higher research productivity, academic rank, and procurement of extramural National Institutes of Health (NIH) funding in plastic surgery, comparing academic surgeons who had completed such training with those who had not. This was a cross-sectional study of full-time academic plastic surgeons in the United States. The main predictor variable was formal research training, defined as completion of a postdoctoral research fellowship or attainment of a Doctor of Philosophy (PhD). The primary outcome was scientific productivity measured by the Hirsch index (h-index: the largest number h of publications that have at least h citations each). The secondary outcomes were academic rank and NIH funding. Descriptive, bivariate, and multiple regression statistics were computed. A total of 607 academic surgeons were identified from 94 Accreditation Council for Graduate Medical Education-accredited plastic surgery training programs. In all, 179 (29.5%) surgeons had completed formal research training. The mean h-index was 11.7 ± 9.9, and 58 (9.6%) surgeons had successfully procured NIH funding. The distribution of academic rank was the following: endowed professor (5.4%), professor (23.9%), associate professor (23.4%), assistant professor (46.0%), and instructor (1.3%). In a multiple regression analysis, completion of formal research training was significantly predictive of a higher h-index and successful procurement of NIH funding. Current evidence demonstrates that formal research training is associated with higher scientific productivity and increased likelihood of future NIH funding. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
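
    The h-index outcome used here is straightforward to compute from a citation list; a minimal sketch:

    ```python
    # h-index: the largest h such that the author has h papers with at
    # least h citations each.
    def h_index(citations):
        ranked = sorted(citations, reverse=True)
        return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

    print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3
    ```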

  15. Numerical approximation abilities correlate with and predict informal but not formal mathematics abilities

    PubMed Central

    Libertus, Melissa E.; Feigenson, Lisa; Halberda, Justin

    2013-01-01

    Previous research has found a relationship between individual differences in children’s precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the present study we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of two years. Additionally, at the last time point, we tested children’s informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3; Ginsburg & Baroody, 2003). We found that children’s numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned, non-symbolic system of quantity representation and the system of mathematical reasoning that children come to master through instruction. PMID:24076381

  16. Numerical approximation abilities correlate with and predict informal but not formal mathematics abilities.

    PubMed

    Libertus, Melissa E; Feigenson, Lisa; Halberda, Justin

    2013-12-01

    Previous research has found a relationship between individual differences in children's precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the current study, we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of 2 years. In addition, at the final time point, we tested children's informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3). We found that children's numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned nonsymbolic system of quantity representation and the system of mathematics reasoning that children come to master through instruction. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Statistical Selection of Biological Models for Genome-Wide Association Analyses.

    PubMed

    Bi, Wenjian; Kang, Guolian; Pounds, Stanley B

    2018-05-24

    Genome-wide association studies have discovered many biologically important associations of genes with phenotypes. Typically, genome-wide association analyses formally test the association of each genetic feature (SNP, CNV, etc.) with the phenotype of interest and summarize the results with multiplicity-adjusted p-values. However, very small p-values only provide evidence against the null hypothesis of no association, without indicating which biological model best explains the observed data. Correctly identifying a specific biological model may improve the scientific interpretation and can be used to more effectively select and design a follow-up validation study. Thus, statistical methodology to identify the correct biological model for a particular genotype-phenotype association can be very useful to investigators. Here, we propose a general statistical method to summarize how accurately each of five biological models (null, additive, dominant, recessive, co-dominant) represents the data observed for each variant in a GWAS. We show that the new method stringently controls the false discovery rate and asymptotically selects the correct biological model. Simulations of two-stage discovery-validation studies show that the new method has these properties and that its validation power is similar to or exceeds that of simple methods that use the same statistical model for all SNPs. Example analyses of three data sets also highlight these advantages of the new method. An R package is freely available at www.stjuderesearch.org/site/depts/biostats/maew. Copyright © 2018. Published by Elsevier Inc.
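
    To make the model-selection idea concrete, the sketch below scores four of the five genotype codings for a single variant by BIC. This is a simplification for illustration only: the paper's actual method also covers the co-dominant model and stringently controls the false discovery rate across all variants, neither of which is shown here.

    ```python
    # Compare genotype codings (null/additive/dominant/recessive) for one
    # variant by BIC; the lowest BIC indicates the best-supported coding.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    g = rng.integers(0, 3, size=500)            # genotypes coded 0/1/2
    y = 0.5 * (g >= 1) + rng.normal(size=500)   # simulated dominant effect

    codings = {
        "additive":  g.astype(float),
        "dominant":  (g >= 1).astype(float),
        "recessive": (g == 2).astype(float),
    }
    bics = {"null": sm.OLS(y, np.ones((len(g), 1))).fit().bic}
    for name, x in codings.items():
        bics[name] = sm.OLS(y, sm.add_constant(x)).fit().bic
    print(min(bics, key=bics.get))              # typically "dominant" here
    ```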

  18. Null tests of the standard model using the linear model formalism

    NASA Astrophysics Data System (ADS)

    Marra, Valerio; Sapone, Domenico

    2018-04-01

    We test both the Friedmann-Lemaître-Robertson-Walker geometry and ΛCDM cosmology in a model-independent way by reconstructing the Hubble function H(z), the comoving distance D(z), and the growth of structure fσ8(z) using the most recent data available. We use the linear model formalism in order to optimally reconstruct the above cosmological functions, together with their derivatives and integrals. We then evaluate four of the null tests available in the literature that probe both background and perturbation assumptions. For all four tests, we find agreement, within the errors, with the standard cosmological model.
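
    As a concrete example of a background null test (a standard one from the literature, not necessarily one of the four evaluated in this paper), the Om(z) diagnostic built from H(z) is constant and equal to the matter density parameter for a flat ΛCDM expansion history, so any z-dependence signals a deviation:

    ```python
    # Om(z) = (E^2(z) - 1) / ((1+z)^3 - 1), with E = H/H0; this is constant
    # for flat LambdaCDM. Mock LambdaCDM data below should return 0.3.
    import numpy as np

    def om_diagnostic(z, H, H0=70.0):
        E2 = (H / H0) ** 2
        return (E2 - 1.0) / ((1.0 + z) ** 3 - 1.0)

    z = np.array([0.3, 0.6, 1.0])
    H = 70.0 * np.sqrt(0.3 * (1 + z) ** 3 + 0.7)   # mock flat LCDM H(z)
    print(om_diagnostic(z, H))                     # -> [0.3, 0.3, 0.3]
    ```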

  19. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  20. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, and the Fisher matrix calculation is performed on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, at marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of breaking this degeneracy with weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
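
    The Gaussianization idea can be illustrated on a toy sample: fit a Box-Cox parameter to a skewed distribution and summarize the transformed, approximately Gaussian variable by its first two moments. (The paper fits the Box-Cox parameters from likelihood evaluations in parameter space; the snippet below only demonstrates the transformation itself.)

    ```python
    # Box-Cox-transform a skewed sample and check that skewness shrinks.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    samples = rng.lognormal(mean=0.0, sigma=0.5, size=5000)  # skewed

    transformed, lam = stats.boxcox(samples)  # lam fitted by max. likelihood
    print(f"lambda = {lam:.2f}, skew before/after: "
          f"{stats.skew(samples):.2f} / {stats.skew(transformed):.2f}")
    mu, var = transformed.mean(), transformed.var()  # Gaussian summary
    ```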

  1. The Evolution of Organization Analysis in ASQ, 1959-1979.

    ERIC Educational Resources Information Center

    Daft, Richard L.

    1980-01-01

    During the period 1959-1979, a sharp trend toward low-variety statistical languages has taken place, which may represent an organizational mapping phase in which simple, quantifiable relationships have been formally defined and measured. A broader scope of research languages will be needed in the future. (Author/IRT)

  2. Beginning Teacher Induction: A Report on Beginning Teacher Effectiveness and Retention.

    ERIC Educational Resources Information Center

    Serpell, Zewelanji; Bozeman, Leslie A.

    National statistics show a rise in the number of beginning teachers undergoing formal induction in their first year of teaching. This report discusses the effectiveness of induction programs and resulting outcomes for beginning teacher retention, beginning teacher effectiveness, and mentor participation. The various components of induction…

  3. Statistical Knowledge and Learning in Phonology

    ERIC Educational Resources Information Center

    Dunbar, Ewan Michael

    2013-01-01

    This dissertation deals with the theory of the phonetic component of grammar in a formal probabilistic inference framework: (1) it has been recognized since the beginning of generative phonology that some language-specific phonetic implementation is actually context-dependent, and thus it can be said that there are gradient "phonetic…

  4. Mathematical Literacy--It's Become Fundamental

    ERIC Educational Resources Information Center

    McCrone, Sharon Soucy; Dossey, John A.

    2007-01-01

    The rising tide of numbers and statistics in daily life signals a need for a fundamental broadening of the concept of literacy: mathematical literacy assuming a coequal role in the curriculum alongside language-based literacy. Mathematical literacy is not about studying higher levels of formal mathematics, but about making math relevant and…

  5. Developing Sensitivity to Subword Combinatorial Orthographic Regularity (SCORe): A Two-Process Framework

    ERIC Educational Resources Information Center

    Mano, Quintino R.

    2016-01-01

    Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…

  6. Prison Clinicians' Perceptions of Antisocial Personality Disorder as a Formal Diagnosis.

    ERIC Educational Resources Information Center

    Stevens, Gail Flint

    1994-01-01

    Surveyed and interviewed 53 clinicians who work with prison inmates. Results indicated that clinicians used diagnosis of antisocial personality disorder liberally among inmates and felt majority of inmates could be so diagnosed. Large minority of clinicians went beyond Diagnostic and Statistical Manual of Mental Disorders criteria and reported…

  7. Ethical Reasoning Instruction in Non-Ethics Business Courses: A Non-Intrusive Approach

    ERIC Educational Resources Information Center

    Wilhelm, William J.

    2010-01-01

    This article discusses four confirmatory studies designed to corroborate findings from prior developmental research which yielded statistically significant improvements in student moral reasoning when specific instructional strategies and content materials were utilized in non-ethics business courses by instructors not formally trained in business…

  8. The Lay Concept of Childhood Mental Disorder

    ERIC Educational Resources Information Center

    Giummarra, Melita J.; Haslam, Nick

    2005-01-01

    The structure of lay people's concepts of childhood mental disorder was investigated in a questionnaire study and examined for convergence with the Diagnostic and Statistical Manual (DSM-IV). Eighty-four undergraduates who had no formal education in abnormal psychology rated 54 conditions--36 DSM-IV childhood disorders and 18 non-disorders--on…

  9. From Mere Coincidences to Meaningful Discoveries

    ERIC Educational Resources Information Center

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2007-01-01

    People's reactions to coincidences are often cited as an illustration of the irrationality of human reasoning about chance. We argue that coincidences may be better understood in terms of rational statistical inference, based on their functional role in processes of causal discovery and theory revision. We present a formal definition of…

  10. Structured Statistical Models of Inductive Reasoning

    ERIC Educational Resources Information Center

    Kemp, Charles; Tenenbaum, Joshua B.

    2009-01-01

    Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet…

  11. Evaluating Teachers and Schools Using Student Growth Models

    ERIC Educational Resources Information Center

    Schafer, William D.; Lissitz, Robert W.; Zhu, Xiaoshu; Zhang, Yuan; Hou, Xiaodong; Li, Ying

    2012-01-01

    Interest in Student Growth Modeling (SGM) and Value Added Modeling (VAM) arises from educators concerned with measuring the effectiveness of teaching and other school activities through changes in student performance as a companion and perhaps even an alternative to status. Several formal statistical models have been proposed for year-to-year…

  12. Predicting structural classes of proteins by incorporating their global and local physicochemical and conformational properties into general Chou's PseAAC.

    PubMed

    Contreras-Torres, Ernesto

    2018-06-02

    In this study, I introduce novel global and local 0D-protein descriptors based on a statistical quantity named Total Sum of Squares (TSS). This quantity represents the sum of squared differences of amino acid properties from the arithmetic mean property. As an extension, the amino acid-type and amino acid-group formalisms are used for describing zones of interest in proteins. To assess the effectiveness of the proposed descriptors, a Nearest Neighbor model for predicting the four major protein structural classes was built. This model has a success rate of 98.53% on the jackknife cross-validation test, a performance superior to other reported methods despite the simplicity of the predictor. Additionally, this predictor has an average success rate of 98.35% across the different cross-validation tests performed. A value of 0.98 for the Kappa statistic clearly discriminates this model from a random predictor. The results obtained by the Nearest Neighbor model demonstrate the ability of the proposed descriptors not only to reflect relevant biochemical information related to the structural classes of proteins but also to allow appropriate interpretability. It can thus be expected that the current method may play a supplementary role to other existing approaches for protein structural class prediction and other protein attributes. Copyright © 2018 Elsevier Ltd. All rights reserved.
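
    The descriptor idea is easy to state: for a chosen amino-acid property, the TSS is the sum of squared deviations of the per-residue values from their mean. A rough sketch using Kyte-Doolittle hydrophobicity values; the property choice and the sequence are illustrative assumptions, not the paper's descriptor set:

    ```python
    # Total Sum of Squares (TSS) of an amino-acid property over a sequence.
    KD = {"A": 1.8, "L": 3.8, "K": -3.9, "D": -3.5, "G": -0.4}

    def tss(seq, prop):
        vals = [prop[aa] for aa in seq]
        mean = sum(vals) / len(vals)
        return sum((v - mean) ** 2 for v in vals)

    print(tss("ALKGDA", KD))
    ```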

  13. The moral development of medical students: a pilot study of the possible influence of medical education.

    PubMed

    Self, D J; Schrader, D E; Baldwin, D C; Wolinsky, F D

    1993-01-01

    Medicine endorses a code of ethics and encourages a high moral character among doctors. This study examines the influence of medical education on the moral reasoning and development of medical students. Kohlberg's Moral Judgment Interview was given to a sample of 20 medical students (41.7% of students in that class). The students were tested at the beginning and at the end of their medical course to determine whether their moral reasoning scores had increased to the same extent as other people who extend their formal education. It was found that normally expected increases in moral reasoning scores did not occur over the 4 years of medical education for these students, suggesting that their educational experience somehow inhibited their moral reasoning ability rather than facilitating it. With a range of moral reasoning scores between 315 and 482, the finding of a mean increase from first year to fourth year of 18.5 points was not statistically significant at the P ≤ 0.05 level. Statistical analysis revealed no significant correlations at the P ≤ 0.05 level between the moral reasoning scores and age, gender, Medical College Admission Test scores, or grade point average scores. Along with a brief description of Kohlberg's cognitive moral development theory, some interpretations and explanations are given for the findings of the study.

  14. Formal mentorship in a surgical residency training program: a prospective interventional study.

    PubMed

    Zhang, Han; Isaac, Andre; Wright, Erin D; Alrajhi, Yaser; Seikaly, Hadi

    2017-02-13

    Otolaryngology-Head and Neck surgery resident physicians (OHNSR) have a high prevalence of burnout, job dissatisfaction and stress, as shown in the literature. Formal mentorship programs (FMP) have a proven track record of enhancing professional development and academic success. More importantly, FMP have an overall positive impact on residents and help improve job satisfaction. The purpose of the study was to determine the effects of a FMP on the well-being of OHNSR. A FMP was established, and all OHNSR participation was voluntary. Eight OHNSR participated in the program. The Perceived Stress Survey (PSS) and the Maslach Burnout Inventory (MBI) were administered at baseline and then at 3-, 6-, 9-, and 12-month intervals. The World Health Quality of Life-Bref Questionnaire (WH-QOL) was administered at baseline and at 12 months. Baseline statistics showed a significant burden of stress and burnout, with an average PSS of 18.5 and high MBI scores of 47.6, 50.6, and 16.5 for the emotional exhaustion, depersonalization, and personal achievement domains respectively. Quality of life was also low, with a WH-QOL score of 71.9. After implementation of the FMP, the PSS was reduced to 14.5 at 3 months (p = 0.174) and to a statistically significantly lower value of 7.9 at 12 months (p = 0.001). Participants were also found to have lower emotional exhaustion scores (14.9, p < 0.0001), lower levels of depersonalization (20.1, p < 0.0001), and higher personal achievement (42.5, p < 0.0001) on MBI testing at 12 months. Overall quality-of-life scores on the WH-QOL were also significantly improved (37.5, p = 0.003), with statistically significantly lower scores for the physical health (33.9, p = 0.003), psychological (41.1, p = 0.001), social relationship (46.9, p = 0.019), and environment (53.5, p = 0.012) domains. This is the first study to show that a FMP can potentially alleviate high levels of stress and burnout within a surgical residency program and achieve higher levels of personal satisfaction as well as overall quality of life.

  15. (Finite) statistical size effects on compressive strength.

    PubMed

    Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien

    2014-04-29

    The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
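
    The weakest-link prediction that the authors test against data is easy to reproduce numerically: if a specimen fails at the weakest of its N elements and element strengths follow a Weibull law with modulus m, the mean strength decays as N**(-1/m) and vanishes at large sizes. A small simulation with illustrative parameters:

    ```python
    # Weakest-link scaling: mean of the minimum of N Weibull(m) strengths
    # decays roughly as N**(-1/m), i.e. toward zero for large specimens.
    import numpy as np

    rng = np.random.default_rng(3)
    m = 5.0                                      # Weibull modulus (assumed)
    for N in (10, 100, 1000, 10000):
        strengths = rng.weibull(m, size=(1000, N))   # 1000 mock specimens
        print(N, strengths.min(axis=1).mean())
    ```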

  16. A Benchmark for Comparing Different Approaches for Specifying and Verifying Real-Time Systems

    DTIC Science & Technology

    1993-01-01

    To be considered correct or useful, real-time systems must deliver results within specified time intervals, either without exception or with high probability. Recently, a large number of formal methods have been invented for specifying and verifying real-time systems. It has been suggested that these formal methods need to be tested out on actual real-time systems. Such testing will allow the scalability of the methods to be assessed and also…

  17. Advanced orbiting systems test-bedding and protocol verification

    NASA Technical Reports Server (NTRS)

    Noles, James; De Gree, Melvin

    1989-01-01

    The Consultative Committee for Space Data Systems (CCSDS) has begun the development of a set of protocol recommendations for Advanced Orbiting Systems (AOS). The AOS validation program and formal definition of AOS protocols are reviewed, and the configuration control of the AOS formal specifications is summarized. Independent implementations of the AOS protocols by NASA and ESA are discussed, and cross-support/interoperability tests which will allow the space agencies of various countries to share AOS communication facilities are addressed.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.
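
    For readers unfamiliar with the formal route, the core ingredient of any MCMC scheme is the Metropolis accept/reject step; DREAM elaborates on it with multiple chains and adaptive differential-evolution proposals. A deliberately minimal single-chain sketch on a toy one-dimensional posterior:

    ```python
    # Random-walk Metropolis sampler for a toy Gaussian log-posterior.
    import numpy as np

    def log_post(theta):
        return -0.5 * (theta - 1.0) ** 2       # unnormalized, peak at 1.0

    rng = np.random.default_rng(4)
    theta, chain = 0.0, []
    for _ in range(20000):
        prop = theta + rng.normal(scale=0.5)   # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop                       # accept; otherwise keep theta
        chain.append(theta)
    print(np.mean(chain[5000:]))               # ~1.0 after burn-in
    ```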

  19. Specific features of goal setting in road traffic safety

    NASA Astrophysics Data System (ADS)

    Kolesov, V. I.; Danilov, O. F.; Petrov, A. I.

    2017-10-01

    Road traffic safety (RTS) management is inherently a branch of cybernetics and therefore requires clear formalization of the task. The paper aims at identifying the specific features of goal setting in RTS management under the system approach. The paper presents the results of cybernetic modeling of the cause-and-effect mechanism of a road traffic accident (RTA), wherein the mechanism itself is viewed as a complex system. The designed management goal function is focused on minimizing the difficulty of achieving the target goal. Optimization of the target goal has been performed using the Lagrange principle. The working algorithms created have passed software testing. The key role of the obtained solution in tactical and strategic RTS management is considered. The dynamics of the management effectiveness indicator has been analyzed based on ten-year statistics for Russia.

  20. Integrating science education and marine conservation through collaborative partnerships.

    PubMed

    Martin, Jeannie Miller; Higgins, Katie; Lee, Kristin; Stearns, Kira; Hunt, Lori

    2015-06-15

    The Georgia Sea Turtle Center has a mission of conservation-based rehabilitation, research, and education. Marine debris is a serious threat to marine species. In an effort to educate local students, the GSTC obtained a grant to provide educational opportunities to local third graders. Third- and fourth-grade classes in Glynn County, Georgia were offered a Garbage in the Water program, and 964 students were reached. After programming, students showed a statistically significant (p < .0001) increase in test scores between the pre- and post-tests. This success led to repeat funding for additional programming for first grades as well as a formalized relationship with the Glynn County School District. As part of this relationship, the Georgia Sea Turtle Center is now the official field-trip location for all third grades in the district. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A comparison of two methods for retrieving ICD-9-CM data: the effect of using an ontology-based method for handling terminology changes.

    PubMed

    Yu, Alexander C; Cimino, James J

    2011-04-01

    Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. The design was a comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). The outcome measures were recall and the interclass correlation coefficient. Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p<0.05). Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. A Comparison of Two Methods for Retrieving ICD-9-CM data: The Effect of Using an Ontology-based Method for Handling Terminology Changes

    PubMed Central

    Yu, Alexander C.; Cimino, James J.

    2012-01-01

    Objective Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Design Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Measurements Recall and interclass correlation coefficient. Results Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p < 0.05). Conclusion Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. PMID:21262390
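
    The McNemar test used in both versions of this study compares two paired retrieval methods on the same records, using only the discordant cells of the 2x2 agreement table. A minimal sketch with invented counts:

    ```python
    # McNemar test on a paired 2x2 table (counts below are made up).
    from statsmodels.stats.contingency_tables import mcnemar

    # rows: method A retrieved / missed; cols: method B retrieved / missed
    table = [[40, 3],
             [12, 45]]
    print(mcnemar(table, exact=True).pvalue)
    ```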

  3. The effect of esthetic crown lengthening on perceptions of a patient's attractiveness, friendliness, trustworthiness, intelligence, and self-confidence.

    PubMed

    Malkinson, Sam; Waldrop, Thomas C; Gunsolley, John C; Lanning, Sharon K; Sabatini, Robert

    2013-08-01

    Smile esthetics have been shown to play a major role in the perception of whether a person is attractive, and whether they are perceived as friendly, trustworthy, intelligent, and self-confident. A proposed major determinant of the esthetics of a smile is the amount of gingival display, which can be excessive in cases of altered passive eruption. The aim of this study is to see whether altering the amount of gingival display of patients would affect dental professionals' and laypersons' perceptions of the aforementioned social parameters. Patients were identified as having altered passive eruption and excessive gingival display. Smiling "control" photographs were taken and then digitally altered so as to lengthen the teeth and thus reduce the amount of gingival display. These became the "test" photographs. The control and test photographs were shown in random order. The control group of evaluators consisted of senior dental students, and the test group of evaluators comprised students who had no formal dental training. Groups were asked to rate, on a visual analog scale, each picture's attractiveness, friendliness, trustworthiness, intelligence, and self-confidence. The test pictures with less gingival display were consistently and statistically significantly rated higher for all five social parameters than were their control counterparts (P <0.0001). When analyzed as an isolated effect, there were no statistically significant differences between the control group and the test group of evaluators when rating the pictures. Pictures depicting African Americans were judged to be more trustworthy (P = 0.0467) and self-confident (P = 0.0490) than pictures depicting white individuals. Pictures depicting women were judged to be more trustworthy (P = 0.0159) and intelligent (P = 0.0329) than pictures depicting men. All the social parameters were positively and statistically significantly correlated with each other (P <0.0001). Excessive gingival display did negatively affect how attractive a person's smile is judged to be. In addition, how friendly, trustworthy, intelligent, and self-confident a person was perceived to be was inversely related to the amount of gingival display. Untrained laypeople were just as sensitive to these differences as senior dental students.

  4. The influence of shyness on children's test performance.

    PubMed

    Crozier, W Ray; Hostettler, Kirsten

    2003-09-01

    Research has shown that shy children differ from their peers not only in their use of language in routine social encounters but also in formal assessments of their language development, including psychometric tests of vocabulary. There has been little examination of factors contributing to these individual differences. To investigate cognitive-competence and social anxiety interpretations of differences in children's performance on tests of vocabulary. To examine the performance of shy and less shy children under different conditions of test administration, individually with an examiner or among their peers within the familiar classroom setting. The sample consisted of 240 Year 5 pupils (122 male, 118 female) from 24 primary schools. Shy and less shy children, identified by teacher nomination and checklist ratings, completed vocabulary and mental arithmetic tests in one of three conditions, in a between-subjects design. The conditions varied individual and group administration, and oral and written responses. The conditions of test administration influenced the vocabulary test performance of shy children. They performed significantly more poorly than their peers in the two face-to-face conditions but not in the group test condition. A comparable trend for the arithmetic test was not statistically significant. Across the sample as a whole, shyness correlated significantly with test scores. Shyness does influence children's cognitive test performance and its impact is larger when children are tested face-to-face rather than in a more anonymous group setting. The results are of significance for theories of shyness and have implications for the assessment of schoolchildren.

  5. Bottom-up approaches to strengthening child protection systems: Placing children, families, and communities at the center.

    PubMed

    Wessells, Michael G

    2015-05-01

    Efforts to strengthen national child protection systems have frequently taken a top-down approach of imposing formal, government-managed services. Such expert-driven approaches are often characterized by low use of formal services and the misalignment of the nonformal and formal aspects of the child protection system. This article examines an alternative approach of community-driven, bottom-up work that enables nonformal-formal collaboration and alignment, greater use of formal services, internally driven social change, and high levels of community ownership. The dominant approach of reliance on expert-driven Child Welfare Committees produces low levels of community ownership. Using an approach developed and tested in rural Sierra Leone, community-driven action, including collaboration and linkages with the formal system, promoted the use of formal services and achieved increased ownership, effectiveness, and sustainability of the system. The field needs less reliance on expert-driven approaches and much wider use of slower, community-driven, bottom-up approaches to child protection. Copyright © 2015 The Author. Published by Elsevier Ltd.. All rights reserved.

  6. Balancing Chemical Equations: The Role of Developmental Level and Mental Capacity.

    ERIC Educational Resources Information Center

    Niaz, Mansoor; Lawson, Anton E.

    1985-01-01

    Tested two hypotheses: (1) formal reasoning is required to balance simple one-step equations; and (2) formal reasoning plus sufficient mental capacity are required to balance many-step equations. Independent variables included intellectual development, mental capacity, and degree of field dependence/independence. With 25 subjects, significance was…

  7. Recognition of Emotions in Autism: A Formal Meta-Analysis

    ERIC Educational Resources Information Center

    Uljarevic, Mirko; Hamilton, Antonia

    2013-01-01

    Determining the integrity of emotion recognition in autistic spectrum disorder is important to our theoretical understanding of autism and to teaching social skills. Previous studies have reported both positive and negative results. Here, we take a formal meta-analytic approach, bringing together data from 48 papers testing over 980 participants…

  8. Interlanguage Variation: A Point Missed?

    ERIC Educational Resources Information Center

    Tice, Bradley Scott

    A study investigated patterns in phonological errors occurring in the speaker's second language in both formal and informal speaking situations. Subjects were three adult learners of English as a second language, including a native Spanish-speaker and two Asians. Their speech was recorded during diagnostic testing (formal speech) and in everyday…

  9. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  10. Genetic Improvement of Eastern Cottonwood

    Treesearch

    Robert E. Farmer; Carl A. Mohn

    1970-01-01

    Eastern cottonwood (Populus deltoides Bartr.) genetics research has moved during the past decade from formal statements of its promise to long-term formal tests of commercially promising material. Much of this research has been conducted in the Lower Mississippi Valley, where cottonwood has major commercial importance, but there have been…

  11. Designing Curricular Experiences that Promote Young Adolescents' Cognitive Growth

    ERIC Educational Resources Information Center

    Brown, Dave F.; Canniff, Mary

    2007-01-01

    One of the most challenging daily experiences of teaching young adolescents is helping them transition from Piaget's concrete to the formal operational stage of cognitive development during the middle school years. Students who have reached formal operations can design and test hypotheses, engage in deductive reasoning, use flexible thinking,…

  12. Piagetian Research as Applied to Teaching Science to Secondary and College Students.

    ERIC Educational Resources Information Center

    Gabel, Dorothy L.

    1979-01-01

    Piaget's formal operational stage is related to the teaching of science by focusing on the development of paper and pencil tests for determining students' cognitive level of development and on procedures for helping concrete operational students improve achievement and become more formal in their thinking. (JMF)

  13. Gonorrhea Test

    MedlinePlus

    Also known as: Neisseria gonorrhoeae NAT; Neisseria gonorrhoeae Nucleic Acid Amplification Test; Culture; Gram Stain; DNA Probe. Formal name: Neisseria gonorrhoeae.

  14. Path Integrals for Electronic Densities, Reactivity Indices, and Localization Functions in Quantum Systems

    PubMed Central

    Putz, Mihai V.

    2009-01-01

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. Yet, the use of the path integral formalism for electronic density prescription presents several advantages: it assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for the time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism are presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density are rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic localization functions, all advocate for the reliability of the PI formalism of quantum mechanics as a versatile framework, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing (density-driven) many-electronic systems. PMID:20087467

  15. Path integrals for electronic densities, reactivity indices, and localization functions in quantum systems.

    PubMed

    Putz, Mihai V

    2009-11-10

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. Yet, the use of the path integral formalism for electronic density prescription presents several advantages: it assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for the time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism are presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density are rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic localization functions, all advocate for the reliability of the PI formalism of quantum mechanics as a versatile framework, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing (density-driven) many-electronic systems.
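
    As background for both versions of this abstract, the central object can be stated compactly. This is the standard textbook form of the path-integral representation of the canonical density matrix, not notation taken from the paper:

    ```latex
    \[
      \rho(x_b, x_a; \beta)
        = \int_{x(0)=x_a}^{x(\hbar\beta)=x_b} \mathcal{D}x(\tau)\,
          e^{-S_E[x]/\hbar},
      \qquad
      Z(\beta) = \int \mathrm{d}x\, \rho(x, x; \beta),
    \]
    % S_E is the Euclidean action; the partition function Z follows by
    % tracing the density matrix over its diagonal.
    ```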

  16. Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism

    NASA Astrophysics Data System (ADS)

    Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor

    2018-02-01

    Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.
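
    For orientation, the two anisotropy types named above map onto standard Jones matrices: linear birefringence onto a retarder with phase shift delta at orientation theta, and optical activity onto a rotator with angle alpha. A small sketch with arbitrary illustrative values, not the paper's reconstruction pipeline:

    ```python
    # Jones matrices for a rotator (optical activity) and a linear retarder
    # (birefringence) acting on a polarization state.
    import numpy as np

    def rotator(alpha):
        c, s = np.cos(alpha), np.sin(alpha)
        return np.array([[c, -s], [s, c]])

    def retarder(delta, theta):
        R = rotator(theta)
        J = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
        return R @ J @ R.T          # rotate into and out of the fast axis

    E_in = np.array([1.0, 0.0])     # horizontally polarized input
    E_out = rotator(0.2) @ retarder(np.pi / 4, 0.3) @ E_in
    print(np.abs(E_out) ** 2)       # intensity in each field component
    ```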

  17. Formal Solutions for Polarized Radiative Transfer. III. Stiffness and Instability

    NASA Astrophysics Data System (ADS)

    Janett, Gioele; Paganini, Alberto

    2018-04-01

    Efficient numerical approximation of the polarized radiative transfer equation is challenging because this system of ordinary differential equations exhibits stiff behavior, which potentially results in numerical instability. This negatively impacts the accuracy of formal solvers, and small step-sizes are often necessary to retrieve physical solutions. This work presents stability analyses of formal solvers for the radiative transfer equation of polarized light, identifies instability issues, and suggests practical remedies. In particular, the assumptions and the limitations of the stability analysis of Runge–Kutta methods play a crucial role. On this basis, a suitable and pragmatic formal solver is outlined and tested. An insightful comparison to the scalar radiative transfer equation is also presented.
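
    The stiffness issue can be seen on the classical test equation y' = lambda*y with lambda large and negative: an explicit method is stable only below a step-size limit, while an implicit method damps the solution for any step size. A standard textbook illustration, not the paper's solver:

    ```python
    # Explicit vs. implicit (backward) Euler on the stiff equation y' = -50 y,
    # with a step size outside the explicit stability region (|1 + h*lam| > 1).
    lam, h, steps = -50.0, 0.05, 20
    y_exp = y_imp = 1.0
    for _ in range(steps):
        y_exp = y_exp * (1 + h * lam)   # explicit Euler: oscillates, diverges
        y_imp = y_imp / (1 - h * lam)   # implicit Euler: decays monotonically
    print(y_exp, y_imp)
    ```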

  18. Statistical manifestation of quantum correlations via disequilibrium

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2017-12-01

    The statistical notion of disequilibrium (D) was introduced by López-Ruiz, Mancini, and Calbet (LMC) (1995) [1] more than 20 years ago. D measures the amount of "correlational structure" of a system. We wish to use D to analyze one of the simplest types of quantum correlations, those present in gaseous systems due to symmetry considerations. To this end we extend the LMC formalism to the grand canonical environment and show that D displays distinctive behaviors for simple gases, that allow for interesting insights into their structural properties.
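
    For a discrete distribution {p_i} over N states, the LMC disequilibrium has a simple closed form: the squared Euclidean distance from equiprobability, D = sum_i (p_i - 1/N)^2. A minimal sketch:

    ```python
    # LMC disequilibrium: squared distance from the uniform distribution.
    import numpy as np

    def disequilibrium(p):
        p = np.asarray(p, dtype=float)
        return float(np.sum((p - 1.0 / p.size) ** 2))

    print(disequilibrium([0.25, 0.25, 0.25, 0.25]))  # 0.0: no structure
    print(disequilibrium([0.7, 0.1, 0.1, 0.1]))      # 0.27: strong structure
    ```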

  19. Bayesian Decision Support

    NASA Astrophysics Data System (ADS)

    Berliner, M.

    2017-12-01

    Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, for organizing and communicating our analyses. Bayesian hierarchical modeling is valuable for quantifying and managing uncertainty in such cases. I review some aspects of the idea, emphasizing statistical model development and use in the context of sea-level rise.

  20. Systems engineering principles for the design of biomedical signal processing systems.

    PubMed

    Faust, Oliver; Acharya U, Rajendra; Sputh, Bernhard H C; Min, Lim Choo

    2011-06-01

    Systems engineering aims to produce reliable systems which function according to specification. In this paper we follow a systems engineering approach to design a biomedical signal processing system. We discuss requirements capturing, specification definition, implementation and testing of a classification system. These steps are executed as formally as possible. The requirements, which motivate the system design, are based on diabetes research. The main requirement for the classification system is to be a reliable component of a machine which controls diabetes. Reliability is very important, because uncontrolled diabetes may lead to hyperglycaemia (raised blood sugar) and over a period of time may cause serious damage to many of the body systems, especially the nerves and blood vessels. In a second step, these requirements are refined into a formal CSP‖B model. The formal model expresses the system functionality in a clear and semantically strong way. Subsequently, the proven system model was translated into an implementation. This implementation was tested with use cases and failure cases. Formal modeling and automated model checking gave us deep insight into the system functionality. This insight enabled us to create a reliable and trustworthy implementation. With extensive tests we established trust in the reliability of the implementation. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  1. Statistical nature of infrared dynamics on de Sitter background

    NASA Astrophysics Data System (ADS)

    Tokuda, Junsei; Tanaka, Takahiro

    2018-02-01

    In this study, we formulate a systematic way of deriving an effective equation of motion (EoM) for long-wavelength modes of a massless scalar field with a general potential V(φ) on a de Sitter background, and investigate whether or not the effective EoM can be described as a classical stochastic process. Our formulation extends the usual stochastic formalism to include sub-leading secular growth coming from the nonlinearity of short-wavelength modes. Applying our formalism to λφ⁴ theory, we explicitly derive an effective EoM which correctly recovers the next-to-leading secularly growing part at late times, and show that this effective EoM can be seen as a classical stochastic process. Our extended stochastic formalism can describe all secularly growing terms which appear in all correlation functions with a specific operator ordering. The restriction on the operator ordering is not a serious drawback, because the commutator of a light scalar field becomes negligible at large scales owing to squeezing.
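
    For orientation, the leading-order stochastic formalism that this work extends can be simulated directly: in e-fold time the long-wavelength field obeys dφ = −V′(φ)/(3H²) dN + (H/2π) dW. The sketch below integrates this Langevin equation for V = λφ⁴/4; parameter values are arbitrary, and the paper's sub-leading corrections are not included.

```python
import numpy as np

rng = np.random.default_rng(0)

# Leading-order stochastic inflation: dphi = -V'(phi)/(3 H^2) dN + (H / 2 pi) dW,
# with V(phi) = lam * phi^4 / 4, integrated in e-fold time N over many trajectories.
H, lam, dN, n_steps, n_traj = 1.0, 0.1, 0.01, 5000, 2000
phi = np.zeros(n_traj)
for _ in range(n_steps):
    drift = -lam * phi**3 / (3.0 * H**2)
    noise = (H / (2.0 * np.pi)) * np.sqrt(dN) * rng.standard_normal(n_traj)
    phi += drift * dN + noise

print("late-time <phi^2> estimate:", phi.var())   # equilibrates at O(H^2 / sqrt(lam))
```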

  2. Evaluating the Effectiveness of a Large-Scale Professional Development Programme

    ERIC Educational Resources Information Center

    Main, Katherine; Pendergast, Donna

    2017-01-01

    An evaluation of the effectiveness of a large-scale professional development (PD) programme delivered to 258 schools in Queensland, Australia is presented. Formal evaluations were conducted at two stages during the programme using a tool developed from Desimone's five core features of effective PD. Descriptive statistics of 38 questions and…

  3. Making Heads or Tails of Probability: An Experiment with Random Generators

    ERIC Educational Resources Information Center

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  4. Emergent Feature Structures: Harmony Systems in Exemplar Models of Phonology

    ERIC Educational Resources Information Center

    Cole, Jennifer

    2009-01-01

    In exemplar models of phonology, phonotactic constraints are modeled as emergent from patterns of high activation between units that co-occur with statistical regularity, or as patterns of low activation or inhibition between units that co-occur less frequently or not at all. Exemplar models posit no a "priori" formal or representational…

  5. Integrated Postsecondary Education Data System Data Quality Study. Methodology Report. NCES 2005-175

    ERIC Educational Resources Information Center

    Jackson, Kenneth W.; Peecksen, Scott; Jang, Donsig; Sukasih, Amang

    2005-01-01

    The Integrated Postsecondary Education Data System (IPEDS) of the National Center for Education Statistics (NCES) was initiated in 1986 to collect data about all identified institutions whose primary purpose is to provide postsecondary education. Postsecondary education is defined within IPEDS as "the provision of a formal instructional…

  6. The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data

    ERIC Educational Resources Information Center

    Gregory, Kelvin

    2006-01-01

    The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…

  7. Bootstrapping in a Language of Thought: A Formal Model of Numerical Concept Learning

    ERIC Educational Resources Information Center

    Piantadosi, Steven T.; Tenenbaum, Joshua B.; Goodman, Noah D.

    2012-01-01

    In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful…

  8. Behavioral and Social Science Research: A National Resource. Part II.

    ERIC Educational Resources Information Center

    Adams, Robert McC., Ed.; And Others

    Areas of behavioral and social science research that have achieved significant breakthroughs in knowledge or application or that show future promise of achieving such breakthroughs are discussed in 12 papers. For example, the paper on formal demography shows how mathematical or statistical techniques can be used to explain and predict change in…

  9. Students and Courses 2002: At a Glance. Australian Vocational Education and Training Statistics.

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research, Leabrook (Australia).

    The public vocational education and training (VET) system in Australia encompasses formal learning activities intended to develop knowledge and skills that are relevant in the workplace for those past the age of compulsory schooling, but excludes bachelor and post-graduate courses and learning for leisure, recreation or personal enrichment. Some…

  10. Mechanics of Brittle Materials. Part 1. Preliminary Mechanical Properties and Statistical Representations

    DTIC Science & Technology

    1973-10-01

    intensity computation are shown in Figure 17. Using the same formal procedure outlined by Winne & Wundt, a notch geometry can be chosen to induce... Nitride at Elevated Temperatures. Winne, D.H. and Wundt, B.M., "Application of the Griffith-Irwin Theory of Crack Propagation to the Bursting Behavior

  11. Beyond Literacy: Non-Formal Education Programmes for Adults in Mozambique

    ERIC Educational Resources Information Center

    van der Linden, Josje; Manuel, Alzira Munguambe

    2011-01-01

    Thirty-five years after independence the Mozambican illiteracy rate has been reduced from 93% to just over 50% according to official statistics. Although this indicates an enormous achievement in the area of education, the challenge of today still is to design appropriate adult basic education programmes including literacy, numeracy and life…

  12. Introduction of Digital Storytelling in Preschool Education: A Case Study from Croatia

    ERIC Educational Resources Information Center

    Preradovic, Nives Mikelic; Lesin, Gordana; Boras, Damir

    2016-01-01

    Our case study from Croatia showed the benefits of digital storytelling in a preschool as a basis for the formal ICT education. The statistical analysis revealed significant differences between children aged 6-7 who learned mathematics by traditional storytelling compared to those learning through digital storytelling. The experimental group that…

  13. 77 FR 72715 - Informal Entry Limit and Removal of a Formal Entry Requirement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... required certifications, enforcement information, and statistical data. An agency may not conduct or..., 1623, 1624, 3314. * * * * * Sec. 10.1 [Amended] 2. In Sec. 10.1: a. Paragraph (a) introductory text... revising "19------" to read "20------"; c. Paragraph (a)(2) introductory text is amended in the last...

  14. The Schrödinger–Langevin equation with and without thermal fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, R., E-mail: roland.katz@subatech.in2p3.fr; Gossiaux, P.B., E-mail: Pol-Bernard.Gossiaux@subatech.in2p3.fr

    2016-05-15

    The Schrödinger–Langevin equation (SLE) is considered as an effective open quantum system formalism suitable for phenomenological applications involving a quantum subsystem interacting with a thermal bath. We focus on two open issues relative to its solutions: the stationarity of the excited states of the non-interacting subsystem when one considers the dissipation only, and the thermal relaxation toward asymptotic distributions with the additional stochastic term. We first show that a proper application of the Madelung/polar transformation of the wave function leads to a nonzero damping of the excited states of the quantum subsystem. We then study analytically and numerically the SLE's ability to bring a quantum subsystem to the thermal equilibrium of statistical mechanics. To do so, concepts about statistical mixed states and quantum noises are discussed, and a detailed analysis is carried out with two kinds of noise and potential. We show that within our assumptions the use of the SLE as an effective open quantum system formalism is possible, and discuss some of its limitations.

  15. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

    Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative aesthetic measure introduced by G. Birkhoff as order/complexity.

  16. Random dopant fluctuations and statistical variability in n-channel junctionless FETs

    NASA Astrophysics Data System (ADS)

    Akhavan, N. D.; Umana-Membreno, G. A.; Gu, R.; Antoszewski, J.; Faraone, L.

    2018-01-01

    The influence of random dopant fluctuations on the statistical variability of the electrical characteristics of the n-channel silicon junctionless nanowire transistor (JNT) has been studied using three-dimensional quantum simulations based on the non-equilibrium Green's function (NEGF) formalism. Average randomly distributed body doping densities of 2 × 10^19, 6 × 10^19 and 1 × 10^20 cm^-3 have been considered, employing an atomistic model for JNTs with gate lengths of 5, 10 and 15 nm. We demonstrate that by properly adjusting the doping density in the JNT, near-ideal statistical variability and electrical performance can be achieved, which can pave the way for the continuation of scaling in silicon CMOS technology.
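
    The statistical mechanism behind random dopant fluctuations can be illustrated with a back-of-the-envelope Monte Carlo (not the paper's NEGF model): at a fixed average density, the dopant count in a nanoscale body is Poisson-distributed, so its relative spread scales as 1/√(mean count).

```python
import numpy as np

rng = np.random.default_rng(1)

N_A = 1e20                    # average doping density, cm^-3
volume = (10e-7) ** 3         # a 10 nm cube, expressed in cm^3
mean_dopants = N_A * volume   # = 100 dopants on average
samples = rng.poisson(mean_dopants, size=100_000)
print("mean count:", samples.mean())
print("relative sigma:", samples.std() / samples.mean())   # ~ 1/sqrt(100) = 10%
```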

  17. Eye Donation Awareness and Conversion Rate in Hospital Cornea Retrieval Programme in a Tertiary Hospital of Central India

    PubMed Central

    Shrivastava, Ulka; Kumar, Kavita; Baghel, Rajendra; Khan, Farhana; Kulkarni, Shridhar

    2017-01-01

    Introduction Corneal blindness accounts for 6–8 million of the world's blind. In India, it is estimated that there are approximately 6.8 million people who have vision less than 6/60 in at least one eye due to corneal diseases. Aim This study was done to assess the awareness about eye donation amongst attendants of critically ill and deceased patients, their willingness to donate eyes, the efficacy of grief counselling by Eye Donation Counsellors (EDC), its impact on the conversion rate, and the reasons for the poor donation rate. Materials and Methods This prospective hospital-based study was done in 554 participants (guardians of critically ill and deceased subjects) to understand the awareness of eye donation. Factors related to willingness for eye donation that influenced conversion to actual donation were evaluated. Data were analysed with tests for statistical significance: Chi-square test; p<0.05 at 95% confidence interval was set as significant. Results The awareness index, particularly in males <40 years, was found to be statistically higher. In participants who were partially/fully aware of eye donation, the time taken for motivation remained less than 12 hours, which was statistically significant (Chi-square=106; p<0.001). Subjects who were aware were willing to donate, in comparison to those who were unaware, in a ratio of 2:1. Grief counsellors (57.5%) had the most influence among the facilitators of donation. Conclusion Utilizing the services of eye donation counsellors is a promising way to motivate the guardians of the deceased. Increasing awareness in society, rendering simple assistance to the next of kin, and speeding up the medico-legal formalities can go a long way in increasing the conversion rate and hence actual donation. PMID:28969171

  18. Eye Donation Awareness and Conversion Rate in Hospital Cornea Retrieval Programme in a Tertiary Hospital of Central India.

    PubMed

    Sharma, Bhavana; Shrivastava, Ulka; Kumar, Kavita; Baghel, Rajendra; Khan, Farhana; Kulkarni, Shridhar

    2017-08-01

    Corneal blindness accounts for 6-8 million of the world's blind. In India, it is estimated that there are approximately 6.8 million people who have vision less than 6/60 in at least one eye due to corneal diseases. This study was done to assess the awareness about eye donation amongst attendants of critically ill and deceased patients, their willingness to donate eyes, the efficacy of grief counselling by Eye Donation Counsellors (EDC), its impact on the conversion rate, and the reasons for the poor donation rate. This prospective hospital-based study was done in 554 participants (guardians of critically ill and deceased subjects) to understand the awareness of eye donation. Factors related to willingness for eye donation that influenced conversion to actual donation were evaluated. Data were analysed with tests for statistical significance: Chi-square test; p<0.05 at 95% confidence interval was set as significant. The awareness index, particularly in males <40 years, was found to be statistically higher. In participants who were partially/fully aware of eye donation, the time taken for motivation remained less than 12 hours, which was statistically significant (Chi-square=106; p<0.001). Subjects who were aware were willing to donate, in comparison to those who were unaware, in a ratio of 2:1. Grief counsellors (57.5%) had the most influence among the facilitators of donation. Utilizing the services of eye donation counsellors is a promising way to motivate the guardians of the deceased. Increasing awareness in society, rendering simple assistance to the next of kin, and speeding up the medico-legal formalities can go a long way in increasing the conversion rate and hence actual donation.
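
    A sketch of the kind of 2x2 chi-square test reported above; the counts are hypothetical, chosen only to mirror the roughly 2:1 willingness ratio among aware versus unaware participants.

```python
import numpy as np
from scipy.stats import chi2_contingency

#                  willing  not willing
table = np.array([[220,  90],    # aware of eye donation
                  [110, 134]])   # unaware
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```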

  19. Formal and informal home learning activities in relation to children's early numeracy and literacy skills: the development of a home numeracy model.

    PubMed

    Skwarchuk, Sheri-Lynn; Sowinski, Carla; LeFevre, Jo-Anne

    2014-05-01

    The purpose of this study was to propose and test a model of children's home numeracy experience based on Sénéchal and LeFevre's home literacy model (Child Development, 73 (2002) 445-460). Parents of 183 children starting kindergarten in the fall (median child age=58 months) completed an early home learning experiences questionnaire. Most of the children whose parents completed the questionnaire were recruited for numeracy and literacy testing 1 year later (along with 32 children from the inner city). Confirmatory factor analyses were used to reduce survey items, and hierarchical regression analyses were used to predict the relation among parents' attitudes, academic expectations for their children, reports of formal and informal numeracy, and literacy home practices on children's test scores. Parental reports of formal home numeracy practices (e.g., practicing simple sums) predicted children's symbolic number system knowledge, whereas reports of informal exposure to games with numerical content (measured indirectly through parents' knowledge of children's games) predicted children's non-symbolic arithmetic, as did numeracy attitudes (e.g., parents' enjoyment of numeracy). The home literacy results replicated past findings; parental reports of formal literacy practices (e.g., helping their children to read words) predicted children's word reading, whereas reports of informal experiences (i.e., frequency of shared reading measured indirectly through parents' storybook knowledge) predicted children's vocabulary. These findings support a multifaceted model of children's early numeracy environment, with different types of early home experiences (formal and informal) predicting different numeracy outcomes. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. The New Italian Seismic Hazard Model

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.

    2017-12-01

    In 2015 the Seismic Hazard Center (Centro Pericolosità Sismica - CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community with the aim of elaborating a new reference seismic hazard model, mainly aimed at updating the seismic code. The CPS designed a roadmap for releasing within three years a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with experts on earthquake engineering who will then participate in the revision of the building code. The activities were organized in 6 tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task selected the most up-to-date information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models were elaborated in terms of classic source areas, fault sources, and gridded seismicity, based on different approaches. The GMPEs task selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase was planned to design statistical procedures for testing, against the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy. We use a weighting scheme for the different components of the PSHA model that has been built through three independent steps: a formal experts' elicitation, the outcomes of the testing phase, and the correlation between the outcomes. Finally, we explore through different techniques the influence of the declustering procedure on seismic hazard.

  1. Japanese Americans' health concerns and depressive symptoms: implications for disaster counseling.

    PubMed

    Cheung, Monit; Leung, Patrick; Tsui, Venus

    2013-07-01

    This study examined factors contributing to depressive symptoms among Japanese Americans. Data were collected in Houston, Texas, in 2008, before the March 2011 Japan earthquake, through a community survey including demographic and mental health questions and the Hopkins Symptoms Checklist. Among 43 Japanese American respondents in this convenience sample, the depression prevalence was 11.6 percent. Chi-square results found that having anxiety symptoms and holding a master's degree had statistically significant relationships with depressive symptoms. An independent sample t test found that those having depressive symptoms experienced significantly more health issues than those without depressive symptoms. When these statistically significant variables were entered into a logistic regression model, the overall effect of having health issues, anxiety symptoms, and a master's degree collectively predicted depressive symptoms. It was also found that Japanese Americans rarely consult mental health professionals; in particular, female Japanese American respondents tend to seek help from religious leaders. As implied by these findings, the reluctance of Japanese Americans to seek formal help can be explained by social stigma, a health-oriented approach to treatment, and other cultural considerations. Practice implications focus on disaster counseling with a connection between mental health needs and health care support.

  2. An Assessment of Teaching and Learning Practices: A Questionnaire Study for Dental Educators of Karnataka.

    PubMed

    Meenakshi, S; Raghunath, N; Shreeshyla, H S

    2017-11-01

    Faculty members of dental institutions are being asked to assume new academic duties for which they have received no formal training. To succeed in new teaching tasks, faculty development through assessment of teaching skills is essential. A Self-Assessment Questionnaire consisting of 18 closed-ended questions was sent to various faculty members of dental colleges of Karnataka. A total of 210 faculty members volunteered to participate in the study. The response rate was 69.8%. Data gathered were statistically analyzed using SPSS software version 16, the Chi-square test, and descriptive statistics. In the present study, 27.3% of participants were unaware of andragogy, 33.3% were unaware of teachers' development programs, 44.6% did not obtain student feedback after teaching, 52.6% were unaware of peer review of teaching skills, and 50% were unaware of interprofessional education initiatives. By incorporating teaching and learning skills, dental faculty could acquire competencies and academic credentials to become valuable contributors to the institution. This study emphasizes the areas of improvement in the dental school learning environment, based on activation of prior knowledge, elaboration of new learning, learning in context, transfer of learning, and organization of knowledge toward learning.

  3. Language experience changes subsequent learning.

    PubMed

    Onnis, Luca; Thiessen, Erik

    2013-02-01

    What are the effects of experience on subsequent learning? We explored the effects of language-specific word order knowledge on the acquisition of sequential conditional information. Korean and English adults were engaged in a sequence learning task involving three different sets of stimuli: auditory linguistic (nonsense syllables), visual non-linguistic (nonsense shapes), and auditory non-linguistic (pure tones). The forward and backward probabilities between adjacent elements generated two equally probable and orthogonal perceptual parses of the elements, such that any significant preference at test must be due to either general cognitive biases, or prior language-induced biases. We found that language modulated parsing preferences with the linguistic stimuli only. Intriguingly, these preferences are congruent with the dominant word order patterns of each language, as corroborated by corpus analyses, and are driven by probabilistic preferences. Furthermore, although the Korean individuals had received extensive formal explicit training in English and lived in an English-speaking environment, they exhibited statistical learning biases congruent with their native language. Our findings suggest that mechanisms of statistical sequential learning are implicated in language across the lifespan, and experience with language may affect cognitive processes and later learning. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Heuristic decision making.

    PubMed

    Gigerenzer, Gerd; Gaissmaier, Wolfgang

    2011-01-01

    As reflected in the amount of controversy, few areas in psychology have undergone such dramatic conceptual changes in the past decade as the emerging science of heuristics. Heuristics are efficient cognitive processes, conscious or unconscious, that ignore part of the information. Because using heuristics saves effort, the classical view has been that heuristic decisions imply greater errors than do "rational" decisions as defined by logic or statistical models. However, for many decisions, the assumptions of rational models are not met, and it is an empirical rather than an a priori issue how well cognitive heuristics function in an uncertain world. To answer both the descriptive question ("Which heuristics do people use in which situations?") and the prescriptive question ("When should people rely on a given heuristic rather than a complex strategy to make better judgments?"), formal models are indispensable. We review research that tests formal models of heuristic inference, including in business organizations, health care, and legal institutions. This research indicates that (a) individuals and organizations often rely on simple heuristics in an adaptive way, and (b) ignoring part of the information can lead to more accurate judgments than weighting and adding all information, for instance for low predictability and small samples. The big future challenge is to develop a systematic theory of the building blocks of heuristics as well as the core capacities and environmental structures these exploit.

  5. Modelling Trial-by-Trial Changes in the Mismatch Negativity

    PubMed Central

    Lieder, Falk; Daunizeau, Jean; Garrido, Marta I.; Friston, Karl J.; Stephan, Klaas E.

    2013-01-01

    The mismatch negativity (MMN) is a differential brain response to violations of learned regularities. It has been used to demonstrate that the brain learns the statistical structure of its environment and predicts future sensory inputs. However, the algorithmic nature of these computations and the underlying neurobiological implementation remain controversial. This article introduces a mathematical framework with which competing ideas about the computational quantities indexed by MMN responses can be formalized and tested against single-trial EEG data. This framework was applied to five major theories of the MMN, comparing their ability to explain trial-by-trial changes in MMN amplitude. Three of these theories (predictive coding, model adjustment, and novelty detection) were formalized by linking the MMN to different manifestations of the same computational mechanism: approximate Bayesian inference according to the free-energy principle. We thereby propose a unifying view on three distinct theories of the MMN. The relative plausibility of each theory was assessed against empirical single-trial MMN amplitudes acquired from eight healthy volunteers in a roving oddball experiment. Models based on the free-energy principle provided more plausible explanations of trial-by-trial changes in MMN amplitude than models representing the two more traditional theories (change detection and adaptation). Our results suggest that the MMN reflects approximate Bayesian learning of sensory regularities, and that the MMN-generating process adjusts a probabilistic model of the environment according to prediction errors. PMID:23436989

  6. Formulation of state projected centroid molecular dynamics: Microcanonical ensemble and connection to the Wigner distribution.

    PubMed

    Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas

    2017-06-07

    A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.

  7. Formulation of state projected centroid molecular dynamics: Microcanonical ensemble and connection to the Wigner distribution

    NASA Astrophysics Data System (ADS)

    Orr, Lindsay; Hernández de la Peña, Lisandro; Roy, Pierre-Nicholas

    2017-06-01

    A derivation of quantum statistical mechanics based on the concept of a Feynman path centroid is presented for the case of generalized density operators using the projected density operator formalism of Blinov and Roy [J. Chem. Phys. 115, 7822-7831 (2001)]. The resulting centroid densities, centroid symbols, and centroid correlation functions are formulated and analyzed in the context of the canonical equilibrium picture of Jang and Voth [J. Chem. Phys. 111, 2357-2370 (1999)]. The case where the density operator projects onto a particular energy eigenstate of the system is discussed, and it is shown that one can extract microcanonical dynamical information from double Kubo transformed correlation functions. It is also shown that the proposed projection operator approach can be used to formally connect the centroid and Wigner phase-space distributions in the zero reciprocal temperature β limit. A Centroid Molecular Dynamics (CMD) approximation to the state-projected exact quantum dynamics is proposed and proven to be exact in the harmonic limit. The state projected CMD method is also tested numerically for a quartic oscillator and a double-well potential and found to be more accurate than canonical CMD. In the case of a ground state projection, this method can resolve tunnelling splittings of the double well problem in the higher barrier regime where canonical CMD fails. Finally, the state-projected CMD framework is cast in a path integral form.

  8. Unified phonon-based approach to the thermodynamics of solid, liquid and gas states

    NASA Astrophysics Data System (ADS)

    Bolmatov, Dima; Zav'yalov, Dmitry; Zhernenkov, Mikhail; Musaev, Edvard T.; Cai, Yong Q.

    2015-12-01

    We introduce a unified approach to states of matter (solid, liquid and gas) and describe the thermodynamics of the pressure-temperature phase diagram in terms of phonon excitations. We derive the effective Hamiltonian with low-energy cutoff in two transverse phonon polarizations (phononic band gaps) by breaking the symmetry in phonon interactions. Further, we construct the statistical mechanics of states of aggregation employing the Debye approximation. The introduced formalism covers the Debye theory of solids, the phonon theory of liquids, and thermodynamic limits such as the Dulong-Petit thermodynamic limit (cV = 3kB), the ideal gas limit (cV = 3/2 kB) and the new thermodynamic limit (cV = 2kB), dubbed here the Frenkel line thermodynamic limit. We discuss the phonon propagation and localization effects in liquids above and below the Frenkel line, and explain the "fast sound" phenomenon. As a test for our theory we calculate velocity-velocity autocorrelation and pair distribution functions within the Green-Kubo formalism. We show the consistency between the dynamics of phonons and pair correlations in the framework of the unified approach. New directions towards advancements in phononic band gap engineering, hypersound manipulation technologies and exploration of exotic behaviour of fluids relevant to geo- and planetary sciences are discussed. The presented results are equally important both for practical implications and for fundamental research.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunes, Rafael C.; Abreu, Everton M.C.; Neto, Jorge Ananias

    Based on the relationship between thermodynamics and gravity we propose, with the aid of Verlinde's formalism, an alternative interpretation of the dynamical evolution of the Friedmann-Robertson-Walker Universe. This description takes into account the entropy and temperature intrinsic to the horizon of the universe due to the information holographically stored there, through the non-gaussian statistical theories proposed by Tsallis and Kaniadakis. The effect of these non-gaussian statistics in the cosmological context is to change the strength of the gravitational constant. In this paper, we consider the wCDM model modified by the non-gaussian statistics and investigate the compatibility of these non-gaussian modifications with the cosmological observations. In order to analyze to what extent the cosmological data constrain these non-extensive statistics, we use type Ia supernovae, baryon acoustic oscillations, the Hubble expansion rate function, and the linear growth of matter density perturbations data. We show that Tsallis' statistics is favored at the 1σ confidence level.
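
    For reference, the Tsallis entropy underlying one of the two non-gaussian statistics has the standard textbook form (not quoted from this record), recovering the Boltzmann-Gibbs entropy as q → 1:

```latex
S_q = k_B \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k_B \sum_i p_i \ln p_i .
```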

  10. 20 CFR 404.1564 - Your education as a vocational factor.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... education as a vocational factor. (a) General. Education is primarily used to mean formal schooling or other... necessarily mean that you are uneducated or lack these abilities. Past work experience and the kinds of... little formal education. Your daily activities, hobbies, or the results of testing may also show that you...

  11. 20 CFR 404.1564 - Your education as a vocational factor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... education as a vocational factor. (a) General. Education is primarily used to mean formal schooling or other... necessarily mean that you are uneducated or lack these abilities. Past work experience and the kinds of... little formal education. Your daily activities, hobbies, or the results of testing may also show that you...

  12. 20 CFR 404.1564 - Your education as a vocational factor.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... education as a vocational factor. (a) General. Education is primarily used to mean formal schooling or other... necessarily mean that you are uneducated or lack these abilities. Past work experience and the kinds of... little formal education. Your daily activities, hobbies, or the results of testing may also show that you...

  13. 20 CFR 404.1564 - Your education as a vocational factor.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... education as a vocational factor. (a) General. Education is primarily used to mean formal schooling or other... necessarily mean that you are uneducated or lack these abilities. Past work experience and the kinds of... little formal education. Your daily activities, hobbies, or the results of testing may also show that you...

  14. 20 CFR 404.1564 - Your education as a vocational factor.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... little formal education. Your daily activities, hobbies, or the results of testing may also show that you... importance of your educational background may depend upon how much time has passed between the completion of your formal education and the beginning of your physical or mental impairment(s) and by what you have...

  15. Attainment Of Formal Thinking As Revealed By Solving Of Three-Term Verbal Problems By Junior School Children

    ERIC Educational Resources Information Center

    Loughran, R.

    1973-01-01

    Article considered experiments that tested the contentions of Piaget and Peel who believed that formal thinking is not established before 11-12 years of age. These studies were tied to the success achieved by pre-adolescent children in solving verbal three-term series problems. (Author/RK)

  16. Official Labeling, Criminal Embeddedness, and Subsequent Delinquency: A Longitudinal Test of Labeling Theory

    ERIC Educational Resources Information Center

    Bernburg, Jon Gunnar; Krohn, Marvin D.; Rivera, Craig J.

    2006-01-01

    This article examines the short-term impact of formal criminal labeling on involvement in deviant social networks and increased likelihood of subsequent delinquency. According to labeling theory, formal criminal intervention should affect the individual's immediate social networks. In many cases, the stigma of the criminal status may increase the…

  17. Data-based Non-Markovian Model Inference

    NASA Astrophysics Data System (ADS)

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close collaboration with M.D. Chekroun, D. Kondrashov, S. Kravtsov and A.W. Robertson.
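
    A minimal sketch of the multilayer idea (coefficients invented for illustration): the resolved variable is closed not by white noise alone but by a hidden layer carrying memory, giving a Markov approximation to a generalized Langevin equation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two-layer stochastic model: x is resolved, r is a hidden "memory" layer;
# only the bottom layer is directly driven by white noise.
a, b, c, sigma, dt, n = -0.5, 1.0, -2.0, 0.5, 0.01, 100_000
x, r = 0.0, 0.0
xs = np.empty(n)
for i in range(n):
    x += (a * x - b * x**3 + r) * dt                               # layer 0
    r += c * r * dt + sigma * np.sqrt(dt) * rng.standard_normal()  # layer 1
    xs[i] = x

print("var(x):", xs.var())
print("lag-1 autocorrelation:", np.corrcoef(xs[:-1], xs[1:])[0, 1])
```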

  18. Uncertainty and inference in the world of paleoecological data

    NASA Astrophysics Data System (ADS)

    McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.

    2017-12-01

    Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitative expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen in the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches allow an alternative approach to accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types, into estimates of ecosystem state. And HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past. Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy-based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a coarseness in statistical models of taphonomic process. Each of these features provides useful opportunities for statisticians and data-generating researchers to assess what we know about the signal and the noise in paleo data and to improve inference about past changes in ecosystem state.

  19. Improving Achievement Via Essay Exams.

    ERIC Educational Resources Information Center

    Milton, Ohmer

    1979-01-01

    The benefits of using essay tests rather than objective tests in professional education programs are discussed. Essay tests offer practice in writing, creativity and formal communications. Guidelines for using and scoring a sample essay test in biology are presented. (BH)

  20. Cognitive deficits associated with impaired awareness of hypoglycaemia in type 1 diabetes.

    PubMed

    Hansen, Tor I; Olsen, Sandra E; Haferstrom, Elise C D; Sand, Trond; Frier, Brian M; Håberg, Asta K; Bjørgaas, Marit R

    2017-06-01

    The aim of this study was to compare cognitive function in adults with type 1 diabetes who have impaired awareness of hypoglycaemia with those who have normal awareness of hypoglycaemia. A putative association was sought between cognitive test scores and a history of severe hypoglycaemia. A total of 68 adults with type 1 diabetes were included: 33 had impaired and 35 had normal awareness of hypoglycaemia, as confirmed by formal testing. The groups were matched for age, sex and diabetes duration. Cognitive tests of verbal memory, object-location memory, pattern separation, executive function, working memory and processing speed were administered. Participants with impaired awareness of hypoglycaemia scored significantly lower on the verbal and object-location memory tests and on the pattern separation test (Cohen's d -0.86 to -0.55 [95% CI -1.39, -0.05]). Participants with impaired awareness of hypoglycaemia had reduced planning ability task scores, although the difference was not statistically significant (Cohen's d 0.57 [95% CI 0, 1.14]). Frequency of exposure to severe hypoglycaemia correlated with the number of cognitive tests that had not been performed according to instructions. Impaired awareness of hypoglycaemia was associated with diminished learning, memory and pattern separation. These cognitive tasks all depend on the hippocampus, which is vulnerable to neuroglycopenia. The findings suggest that hypoglycaemia contributes to the observed correlation between impaired awareness of hypoglycaemia and impaired cognition.

  1. Semantics, pragmatics, and formal thought disorders in people with schizophrenia.

    PubMed

    Salavera, Carlos; Puyuelo, Miguel; Antoñanzas, José L; Teruel, Pilar

    2013-01-01

    The aim of this study was to analyze how formal thought disorders (FTD) affect semantics and pragmatics in patients with schizophrenia. The sample comprised subjects with schizophrenia (n = 102) who met the criteria for the disorder according to the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition Text Revision. In the research process, the following scales were used: Positive and Negative Syndrome Scale (PANSS) for psychopathology measurements; the Scale for the Assessment of Thought, Language, and Communication (TLC) for FTD, Word Accentuation Test (WAT), System for the Behavioral Evaluation of Social Skills (SECHS), the pragmatics section of the Objective Criteria Language Battery (BLOC-SR) and the verbal sections of the Wechsler Adults Intelligence Scale (WAIS) III, for assessment of semantics and pragmatics. The results in the semantics and pragmatics sections were inferior to the average values obtained in the general population. Our data demonstrated that the more serious the FTD, the worse the performances in the Verbal-WAIS tests (particularly in its vocabulary, similarities, and comprehension sections), SECHS, and BLOC-SR, indicating that FTD affects semantics and pragmatics, although the results of the WAT indicated good premorbid language skills. The principal conclusion we can draw from this study is the evidence that in schizophrenia the superior level of language structure seems to be compromised, and that this level is related to semantics and pragmatics; when there is an alteration in this level, symptoms of FTD appear, with a wide-ranging relationship between both language and FTD. The second conclusion is that the subject's language is affected by the disorder and rules out the possibility of a previous verbal impairment.

  2. Semantics, pragmatics, and formal thought disorders in people with schizophrenia

    PubMed Central

    Salavera, Carlos; Puyuelo, Miguel; Antoñanzas, José L; Teruel, Pilar

    2013-01-01

    Background: The aim of this study was to analyze how formal thought disorders (FTD) affect semantics and pragmatics in patients with schizophrenia. Methods: The sample comprised subjects with schizophrenia (n = 102) who met the criteria for the disorder according to the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition Text Revision. In the research process, the following scales were used: Positive and Negative Syndrome Scale (PANSS) for psychopathology measurements; the Scale for the Assessment of Thought, Language, and Communication (TLC) for FTD, Word Accentuation Test (WAT), System for the Behavioral Evaluation of Social Skills (SECHS), the pragmatics section of the Objective Criteria Language Battery (BLOC-SR) and the verbal sections of the Wechsler Adults Intelligence Scale (WAIS) III, for assessment of semantics and pragmatics. Results: The results in the semantics and pragmatics sections were inferior to the average values obtained in the general population. Our data demonstrated that the more serious the FTD, the worse the performances in the Verbal-WAIS tests (particularly in its vocabulary, similarities, and comprehension sections), SECHS, and BLOC-SR, indicating that FTD affects semantics and pragmatics, although the results of the WAT indicated good premorbid language skills. Conclusion: The principal conclusion we can draw from this study is the evidence that in schizophrenia the superior level of language structure seems to be compromised, and that this level is related to semantics and pragmatics; when there is an alteration in this level, symptoms of FTD appear, with a wide-ranging relationship between both language and FTD. The second conclusion is that the subject’s language is affected by the disorder and rules out the possibility of a previous verbal impairment. PMID:23430043

  3. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.

  4. Statistical mechanics of shell models for two-dimensional turbulence

    NASA Astrophysics Data System (ADS)

    Aurell, E.; Boffetta, G.; Crisanti, A.; Frick, P.; Paladin, G.; Vulpiani, A.

    1994-12-01

    We study shell models that conserve the analogs of energy and enstrophy and hence are designed to mimic fluid turbulence in two dimensions (2D). The main result is that the observed state is well described as a formal statistical equilibrium, closely analogous to the approach to two-dimensional ideal hydrodynamics of Onsager [Nuovo Cimento Suppl. 6, 279 (1949)], Hopf [J. Rat. Mech. Anal. 1, 87 (1952)], and Lee [Q. Appl. Math. 10, 69 (1952)]. In the presence of forcing and dissipation we observe a forward flux of enstrophy and a backward flux of energy. These fluxes can be understood as mean diffusive drifts from a source to two sinks in a system which is close to local equilibrium, with Lagrange multipliers ("shell temperatures") changing slowly with scale. This is clear evidence that the simplest shell models are not adequate to reproduce the main features of two-dimensional turbulence. The dimensional predictions on the power spectra from a supposed forward cascade of enstrophy and from one branch of the formal statistical equilibrium coincide in these shell models, in contrast to the corresponding predictions for the Navier-Stokes and Euler equations in 2D. This coincidence has previously led to the mistaken conclusion that shell models exhibit a forward cascade of enstrophy. We also study the dynamical properties of the models and the growth of perturbations.
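
    The formal statistical equilibrium invoked above has a closed form: with two conserved quadratic invariants (energy and enstrophy), the absolute-equilibrium energy per shell is E_n = 1/(α + β k_n²), with Lagrange multipliers α and β playing the role of inverse "shell temperatures." A quick evaluation, with illustrative values of α and β:

```python
import numpy as np

# Absolute-equilibrium spectrum for joint energy/enstrophy conservation:
# E_n = 1 / (alpha + beta * k_n^2) on shells k_n = 2^n (values illustrative).
alpha, beta = 1.0, 0.05
k = 2.0 ** np.arange(12)
E = 1.0 / (alpha + beta * k**2)
for kn, En in zip(k, E):
    print(f"k = {kn:7.0f}   E(k) = {En:.3e}")   # ~ k^-2 tail at large k
```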

  5. Number statistics for β-ensembles of random matrices: Applications to trapped fermions at zero temperature.

    PubMed

    Marino, Ricardo; Majumdar, Satya N; Schehr, Grégory; Vivo, Pierpaolo

    2016-09-01

    Let P_{β}^{(V)}(N_{I}) be the probability that an N×N β-ensemble of random matrices with confining potential V(x) has N_{I} eigenvalues inside an interval I=[a,b] on the real line. We introduce a general formalism, based on the Coulomb gas technique and the resolvent method, to compute analytically P_{β}^{(V)}(N_{I}) for large N. We show that this probability scales for large N as P_{β}^{(V)}(N_{I})≈exp[-βN^{2}ψ^{(V)}(N_{I}/N)], where β is the Dyson index of the ensemble. The rate function ψ^{(V)}(k_{I}), independent of β, is computed in terms of single integrals that can be easily evaluated numerically. The general formalism is then applied to the classical β-Gaussian (I=[-L,L]), β-Wishart (I=[1,L]), and β-Cauchy (I=[-L,L]) ensembles. Expanding the rate function around its minimum, we find that generically the number variance var(N_{I}) exhibits a nonmonotonic behavior as a function of the size of the interval, with a maximum that can be precisely characterized. These analytical results, corroborated by numerical simulations, provide the full counting statistics of many systems where random matrix models apply. In particular, we present results for the full counting statistics of zero-temperature one-dimensional spinless fermions in a harmonic trap.
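
    The β = 1 (GOE) case can be checked empirically in a few lines; the sketch below samples eigenvalue counts in I = [-L, L] and exhibits the suppressed number variance typical of eigenvalue rigidity (normalization and parameters are choices made here, not the paper's).

```python
import numpy as np

rng = np.random.default_rng(3)

# With H = (A + A^T) / 2 and standard-normal A, the GOE spectrum of an
# N x N matrix fills roughly [-sqrt(2N), sqrt(2N)].
N, trials, L = 100, 200, 5.0
counts = np.empty(trials)
for t in range(trials):
    A = rng.standard_normal((N, N))
    eig = np.linalg.eigvalsh((A + A.T) / 2.0)
    counts[t] = np.sum(np.abs(eig) <= L)

print("mean N_I:", counts.mean())
print("var N_I:", counts.var())   # far below the Poisson value counts.mean()
```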

  6. On the simulation of indistinguishable fermions in the many-body Wigner formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.

    2015-01-01

    The simulation of quantum systems consisting of interacting, indistinguishable fermions is a formidable mathematical problem which poses daunting numerical challenges. Many sophisticated methods addressing this problem are available which are based on the many-body Schrödinger formalism. Recently a Monte Carlo technique for the resolution of the many-body Wigner equation has been introduced and successfully applied to the simulation of distinguishable, spinless particles. This numerical approach presents several advantages over other methods. Indeed, it is based on an intuitive formalism in which quantum systems are described in terms of a quasi-distribution function, and it is highly scalable due to its Monte Carlo nature. In this work, we extend the many-body Wigner Monte Carlo method to the simulation of indistinguishable fermions. To this end, we first show how fermions are incorporated into the Wigner formalism. Then we demonstrate that the Pauli exclusion principle is intrinsic to the formalism. As a matter of fact, a numerical simulation of two strongly interacting fermions (electrons) is performed which clearly shows the appearance of a Fermi (or exchange–correlation) hole in the phase-space, a clear signature of the presence of the Pauli principle. To conclude, we simulate 4, 8 and 16 non-interacting fermions, isolated in a closed box, and show that, as the number of fermions increases, we gradually recover the Fermi–Dirac statistics, a clear proof of the reliability of our proposed method for the treatment of indistinguishable particles.

  7. Dietary diversity of formal and informal residents in Johannesburg, South Africa

    PubMed Central

    2013-01-01

    Background This paper considers the question of dietary diversity as a proxy for nutrition insecurity in communities living in the inner city and the urban informal periphery in Johannesburg. It argues that the issue of nutrition insecurity demands urgent and immediate attention by policy makers. Methods A cross-sectional survey was undertaken for households from urban informal (n = 195) and urban formal (n = 292) areas in Johannesburg, South Africa. Foods consumed by the respondents the previous day were used to calculate a Dietary Diversity Score; a score < 4 was considered low. Results Statistical comparisons of means between groups revealed that respondents from informal settlements consumed mostly cereals and meat/poultry/fish, while respondents in formal settlements consumed a more varied diet. Significantly more respondents living in informal settlements consumed a diet of low diversity (68.1%) versus those in formal settlements (15.4%). When grouped in quintiles, two-thirds of respondents from informal settlements fell in the lowest two, versus 15.4% living in formal settlements. Households who experienced periods of food shortages during the previous 12 months had a lower mean DDS than those from food secure households (4.00 ± 1.6 versus 4.36 ± 1.7; p = 0.026). Conclusions Respondents in the informal settlements were more nutritionally vulnerable. Achieving nutrition security requires policies, strategies and plans to include specific nutrition considerations. PMID:24088249

  8. Imbalanced target prediction with pattern discovery on clinical data repositories.

    PubMed

    Chan, Tak-Ming; Li, Yuxi; Chiau, Choo-Chiap; Zhu, Jane; Jiang, Jie; Huo, Yong

    2017-04-20

    Clinical data repositories (CDR) have great potential to improve outcome prediction and risk modeling. However, most clinical studies require careful study design, dedicated data collection efforts, and sophisticated modeling techniques before a hypothesis can be tested. We aim to bridge this gap, so that clinical domain users can perform first-hand prediction on existing repository data without complicated handling, and obtain insightful patterns of imbalanced targets for a formal study before it is conducted. We specifically target interpretability for domain users, so that the model can be conveniently explained and applied in clinical practice. We propose an interpretable pattern model which is tolerant of noise and missing values in practice data. To address the challenge of imbalanced targets of interest in clinical research, e.g., death rates of less than a few percent, the geometric mean of sensitivity and specificity (G-mean) is employed as the optimization criterion, for which a simple but effective heuristic algorithm is developed. We compared pattern discovery to clinically interpretable methods on two retrospective clinical datasets, containing 14.9% deaths within 1 year in the thoracic dataset and 9.1% deaths in the cardiac dataset. In spite of the imbalance challenge evident for other methods, pattern discovery consistently shows competitive cross-validated prediction performance. Compared to logistic regression, Naïve Bayes, and decision tree, pattern discovery achieves statistically significant (p-values < 0.01, Wilcoxon signed rank test) favorable averaged testing G-means and F1-scores (harmonic mean of precision and sensitivity). Without requiring sophisticated technical processing or tweaking of the data, the prediction performance of pattern discovery is consistently comparable to the best achievable performance. Pattern discovery has proven robust and valuable for target prediction on existing clinical data repositories with imbalance and noise. The prediction results and interpretable patterns can provide insights in an agile and inexpensive way before a potential formal study is conducted.
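
    The G-mean criterion mentioned above is simple to state in code. The sketch below is a generic illustration, not the paper's pattern-mining heuristic; the threshold search over risk scores is an assumed usage.

      import numpy as np

      def g_mean(y_true, y_pred):
          # Geometric mean of sensitivity (true-positive rate) and specificity
          # (true-negative rate); unlike accuracy, it punishes a classifier
          # that ignores the rare class.
          tp = np.sum((y_true == 1) & (y_pred == 1))
          fn = np.sum((y_true == 1) & (y_pred == 0))
          tn = np.sum((y_true == 0) & (y_pred == 0))
          fp = np.sum((y_true == 0) & (y_pred == 1))
          sens = tp / (tp + fn) if tp + fn else 0.0
          spec = tn / (tn + fp) if tn + fp else 0.0
          return np.sqrt(sens * spec)

      def best_threshold(scores, y_true):
          # Pick the decision threshold on continuous risk scores that
          # maximizes G-mean.
          grid = np.linspace(0.0, 1.0, 101)
          return max(grid, key=lambda t: g_mean(y_true, (scores >= t).astype(int)))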

  9. Can Propensity Score Analysis Approximate Randomized Experiments Using Pretest and Demographic Information in Pre-K Intervention Research?

    PubMed

    Dong, Nianbo; Lipsey, Mark W

    2017-01-01

    It is unclear whether propensity score analysis (PSA) based on pretest and demographic covariates will meet the ignorability assumption for replicating the results of randomized experiments. This study applies within-study comparisons to assess whether pre-Kindergarten (pre-K) treatment effects on achievement outcomes estimated using PSA based on a pretest and demographic covariates can approximate those found in a randomized experiment. Data: Four studies with samples of pre-K children each provided data on two math achievement outcome measures with baseline pretests and child demographic variables that included race, gender, age, language spoken at home, and mother's highest education. Research Design and Data Analysis: A randomized study of a pre-K math curriculum provided benchmark estimates of effects on achievement measures. Comparison samples from other pre-K studies were then substituted for the original randomized control and the effects were reestimated using PSA. The correspondence was evaluated using multiple criteria. Results: The effect estimates using PSA were in the same direction as the benchmark estimates, had similar but not identical statistical significance, and did not differ from the benchmarks at statistically significant levels. However, the magnitude of the effect sizes differed and displayed both absolute and relative bias larger than required to show statistical equivalence with formal tests, but those results were not definitive because of the limited statistical power. We conclude that treatment effect estimates based on a single pretest and demographic covariates in PSA correspond to those from a randomized experiment on the most general criteria for equivalence.
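
    As a rough sketch of the estimation step, the snippet below fits a propensity model on pretest and demographic covariates and forms an inverse-probability-weighted effect estimate; this is one standard flavor of PSA, not necessarily the exact weighting or matching scheme used in the study.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def ipw_treatment_effect(X, treated, outcome):
          # Propensity scores from a logistic model on covariates X, then a
          # weighted (Hajek-style) contrast of treated vs. control outcomes.
          ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
          ps = np.clip(ps, 0.01, 0.99)  # trim extreme weights
          w_treated = treated / ps
          w_control = (1 - treated) / (1 - ps)
          return (np.average(outcome, weights=w_treated)
                  - np.average(outcome, weights=w_control))

      # Tiny synthetic check: outcome depends on a confounder and on treatment (+2).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 2))
      treated = (rng.random(500) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)
      outcome = X[:, 0] + 2.0 * treated + rng.normal(size=500)
      print(ipw_treatment_effect(X, treated, outcome))  # close to 2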

  10. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. This is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  11. Hyperboloidal evolution of test fields in three spatial dimensions

    NASA Astrophysics Data System (ADS)

    Zenginoǧlu, Anıl; Kidder, Lawrence E.

    2010-06-01

    We present the numerical implementation of a clean solution to the outer boundary and radiation extraction problems within the 3+1 formalism for hyperbolic partial differential equations on a given background. Our approach is based on compactification at null infinity in hyperboloidal scri-fixing coordinates. We report numerical tests for the particular example of a scalar wave equation on Minkowski and Schwarzschild backgrounds. We address issues related to the implementation of the hyperboloidal approach for the Einstein equations, such as nonlinear source functions, matching, and evaluation of formally singular terms at null infinity.

  12. A Baseline Patient Model to Support Testing of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Perkusich, Mirko; Almeida, Hyggo O; Perkusich, Angelo; Lima, Mateus A M; Gorgônio, Kyller C

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are currently a trending topic of research. The main challenges are related to the integration and interoperability of connected medical devices, patient safety, physiologic closed-loop control, and the verification and validation of these systems. In this paper, we focus on patient safety and MCPS validation. We present a formal patient model to be used in health care systems validation without jeopardizing the patient's health. To determine the basic patient conditions, our model considers the four main vital signs: heart rate, respiratory rate, blood pressure and body temperature. To generate the vital signs we used regression models based on statistical analysis of a clinical database. Our solution should be used as a starting point for a behavioral patient model and adapted to specific clinical scenarios. We present the modeling process of the baseline patient model and show its evaluation. The conception process may be used to build different patient models. The results show the feasibility of the proposed model as an alternative to the immediate need for clinical trials to test these medical systems.
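
    To make the generation mechanism concrete, here is a toy version in which each vital sign is drawn from a linear regression on age plus Gaussian noise; the coefficients are invented for illustration and are not the values fitted from the clinical database.

      import numpy as np

      rng = np.random.default_rng(1)

      # Placeholder models: value = intercept + slope * age + noise.
      models = {
          "heart_rate":       (80.0, -0.10, 5.0),
          "respiratory_rate": (16.0, -0.02, 1.5),
          "systolic_bp":      (100.0, 0.40, 8.0),
          "body_temp_c":      (36.8,  0.00, 0.2),
      }

      def sample_vitals(age):
          # Draw one synthetic patient state from the per-sign regressions.
          return {sign: intercept + slope * age + rng.normal(0.0, sd)
                  for sign, (intercept, slope, sd) in models.items()}

      print(sample_vitals(age=60))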

  13. The Role of Formal Experiment Design in Hypersonic Flight System Technology Development

    NASA Technical Reports Server (NTRS)

    McClinton, Charles R.; Ferlemann, Shelly M.; Rock, Ken E.; Ferlemann, Paul G.

    2002-01-01

    Hypersonic airbreathing engine (scramjet) powered vehicles are being considered to replace conventional rocket-powered launch systems. Effective utilization of scramjet engines requires careful integration with the air vehicle. This integration synergistically combines aerodynamic forces with propulsive cycle functions of the engine. Due to the highly integrated nature of the hypersonic vehicle design problem, the large flight envelope, and the large number of design variables, a statistical design approach is effective. Modern Design-of-Experiments (MDOE) has been used throughout the Hyper-X program, for both systems analysis and experimental testing. Applications of MDOE fall into four categories: (1) experimental testing; (2) studies of unit phenomena; (3) refining engine design; and (4) full vehicle system optimization. The MDOE process also provides analytical models, which are used to document lessons learned, supplement low-level design tools, and accelerate future studies. This paper will discuss the design considerations for scramjet-powered vehicles, specifics of MDOE utilized for Hyper-X, and present highlights from the use of these MDOE methods within the Hyper-X Program.

  14. Alcohol consumption, beverage prices and measurement error.

    PubMed

    Young, Douglas J; Bielinska-Kwapisz, Agnieszka

    2003-03-01

    Alcohol price data collected by the American Chamber of Commerce Researchers Association (ACCRA) have been widely used in studies of alcohol consumption and related behaviors. A number of problems with these data suggest that they contain substantial measurement error, which biases conventional statistical estimators toward a finding of little or no effect of prices on behavior. We test for measurement error, assess the magnitude of the bias and provide an alternative estimator that is likely to be superior. The study utilizes data on per capita alcohol consumption across U.S. states and the years 1982-1997. State and federal alcohol taxes are used as instrumental variables for prices. Formal tests strongly confirm the hypothesis of measurement error. Instrumental variable estimates of the price elasticity of demand range from -0.53 to -1.24. These estimates are substantially larger in absolute value than ordinary least squares estimates, which sometimes are not significantly different from zero or even positive. The ACCRA price data are substantially contaminated with measurement error, but using state and federal taxes as instrumental variables mitigates the problem.
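
    A bare-bones version of the instrumental-variables estimator described here, with tax as the instrument for price, might look as follows; the variable names and single-instrument setup simplify the study's panel specification.

      import numpy as np

      def two_stage_least_squares(y, x, z):
          # 2SLS with one endogenous regressor x (log price) and one
          # instrument z (alcohol tax); both stages include an intercept.
          Z = np.column_stack([np.ones_like(z), z])
          x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # first stage
          X_hat = np.column_stack([np.ones_like(x_hat), x_hat])
          beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]    # second stage
          return beta[1]

    With y and x as log consumption and log price, the returned coefficient is the price elasticity; comparing it against the plain OLS slope makes the attenuation caused by measurement error visible.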

  15. The protective effect of marriage for survival: a review and update.

    PubMed

    Rendall, Michael S; Weden, Margaret M; Favreault, Melissa M; Waldron, Hilary

    2011-05-01

    The theory that marriage has protective effects for survival has itself lived for more than 100 years since Durkheim's groundbreaking study of suicide (Durkheim 1951 [1897]). Investigations of differences in this protective effect by gender, by age, and in contrast to different unmarried statuses, however, have yielded inconsistent conclusions. These investigations typically either use data in which marital status and other covariates are observed in cross-sectional surveys up to 10 years before mortality exposure, or use data from panel surveys with much smaller sample sizes. Their conclusions are usually not based on formal statistical tests of contrasts between men and women or between never-married, divorced/separated, and widowed statuses. Using large-scale pooled panel survey data linked to death registrations and earnings histories for U.S. men and women aged 25 and older, and with appropriate contrast tests, we find a consistent survival advantage for married over unmarried men and women, and an additional survival "premium" for married men. We find little evidence of mortality differences between never-married, divorced/separated, and widowed statuses.

  16. Student Levels of Cognitive Development: Establishing Links between Logical Thinking Skills and Success in Earth Science

    NASA Astrophysics Data System (ADS)

    Steer, D. N.; McConnell, D. A.; Owens, K.

    2003-12-01

    Students in inquiry-based, general education Earth Science courses were found to display a wide range of logical thinking skills that are known indicators of success in science courses. The Group Assessment of Logical Thinking instrument that tests six logical operations was administered on the first day of class and near the end of the course. Such tests can be used to assess a student's overall level of cognitive development (concrete, transitional or formal) and specific logical thinking strengths or weaknesses. Results from paired pre- and post-course logical thinking tests of 393 students indicated that 25% of the incoming students were concrete, 30% were transitional and 45% were formal thinkers. Concrete and transitional thinkers were far more likely to withdraw from or fail the course when compared to their formal thinking peers (35%, 25% and 10% respectively). Differences in scores between genders were significant with 210 females testing at 30% concrete, 35% transitional and 35% formal on the pretest compared to 183 males who tested 15% concrete, 25% transitional and 60% formal. Overall logical thinking scores of students increased significantly in every inquiry-based class with lecture-based classes showing overall lower increases. Post-test data indicated that there were fewer concrete thinkers (16% female, 7% male), little change in the number of transitional thinkers (30% female, 23% male) and more formal thinkers (54% female, 70% male) toward the end of the inquiry-based course. Scores on two of the logical operations, conservation and probability, were sufficient to separate those who received a high grade (A or B in course) from those who were unsuccessful (D, F or withdrew). Students who score low in conservation operations (n=46) tend to rely on intuition rather than logic when trying to understand typical Earth System concepts such as plate tectonics, atmospheric processes and climate change. Students who score low in probability skills (n=46) have difficulty distinguishing between unrelated, but possible, data and those data that confirm a supposition. Such skills are necessary to properly apply the scientific method. By the end of the course, unsuccessful concrete students improved conservation reasoning skills to the same levels as their higher performing concrete peers on the post-test but remained behind them in probability skills. Successful transitional thinkers (n=50) displayed better correlation-reasoning skills than their lower performing contemporaries (n=51). Correlation reasoning skills are necessary to understand some of the many causal relationships routinely developed in the Earth Sciences (e.g. those associated with plate tectonics and earthquakes or volcanoes; CO2 and global climate change).
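
    The reported withdrawal/failure rates by cognitive level can be checked for statistical dependence with a standard contingency-table test; the counts below are reconstructed approximately from the percentages in the abstract, purely for illustration.

      from scipy.stats import chi2_contingency

      # Outcome by incoming cognitive level (approximate counts out of 393).
      #          success  withdraw/fail
      table = [[64, 34],     # concrete     (~35% unsuccessful)
               [88, 30],     # transitional (~25% unsuccessful)
               [159, 18]]    # formal       (~10% unsuccessful)
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")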

  17. Relating Out-of-Time-Order Correlations to Entanglement via Multiple-Quantum Coherences.

    PubMed

    Gärttner, Martin; Hauke, Philipp; Rey, Ana Maria

    2018-01-26

    Out-of-time-order correlations (OTOCs) characterize the scrambling, or delocalization, of quantum information over all the degrees of freedom of a system and thus have been proposed as a proxy for chaos in quantum systems. Recent experimental progress in measuring OTOCs calls for a more thorough understanding of how these quantities characterize complex quantum systems, most importantly in terms of the buildup of entanglement. Although a connection between OTOCs and entanglement entropy has been derived, the latter only quantifies entanglement in pure systems and is hard to access experimentally. In this work, we formally demonstrate that the multiple-quantum coherence spectra, a specific family of OTOCs well known in NMR, can be used as an entanglement witness and as a direct probe of multiparticle entanglement. Our results open a path to experimentally testing the fascinating idea that entanglement is the underlying glue that links thermodynamics, statistical mechanics, and quantum gravity.

  18. Can economics be a physical science?

    NASA Astrophysics Data System (ADS)

    Foley, Duncan K.

    2016-12-01

    Economics and other social sciences stem from the same methodological scientific revolution that gave birth to the natural sciences. The natural and social sciences share a commitment to the dialectical process of theory formation on the basis of empirical findings and theory revision to incorporate empirical anomalies. Claims that the subject matter of social and natural sciences differ qualitatively in terms of mathematical formalism, statistical modeling, or reductionism are unconvincing. The notion of a "value-free" character to natural sciences fails historical and critical tests. Natural and social sciences share an ideological component in their representation of the relation between the subject and the external natural and social world. Natural sciences arise from the struggles of human beings with nature in the process of social reproduction, while social sciences arise from the struggles of human beings with each other and with the class divisions social reproduction imposes.

  19. Functional equivalency inferred from "authoritative sources" in networks of homologous proteins.

    PubMed

    Natarajan, Shreedhar; Jakobsson, Eric

    2009-06-12

    A one-on-one mapping of protein functionality across different species is a critical component of comparative analysis. This paper presents a heuristic algorithm for discovering the Most Likely Functional Counterparts (MoLFunCs) of a protein, based on simple concepts from network theory. A key feature of our algorithm is utilization of the user's knowledge to assign high confidence to selected functional identification. We show use of the algorithm to retrieve functional equivalents for 7 membrane proteins, from an exploration of almost 40 genomes from multiple online resources. We verify the functional equivalency of our dataset through a series of tests that include sequence, structure and function comparisons. Comparison is made to the OMA methodology, which also identifies one-on-one mapping between proteins from different species. Based on that comparison, we believe that incorporation of user's knowledge as a key aspect of the technique adds value to purely statistical formal methods.

  20. Relating Out-of-Time-Order Correlations to Entanglement via Multiple-Quantum Coherences

    NASA Astrophysics Data System (ADS)

    Gärttner, Martin; Hauke, Philipp; Rey, Ana Maria

    2018-01-01

    Out-of-time-order correlations (OTOCs) characterize the scrambling, or delocalization, of quantum information over all the degrees of freedom of a system and thus have been proposed as a proxy for chaos in quantum systems. Recent experimental progress in measuring OTOCs calls for a more thorough understanding of how these quantities characterize complex quantum systems, most importantly in terms of the buildup of entanglement. Although a connection between OTOCs and entanglement entropy has been derived, the latter only quantifies entanglement in pure systems and is hard to access experimentally. In this work, we formally demonstrate that the multiple-quantum coherence spectra, a specific family of OTOCs well known in NMR, can be used as an entanglement witness and as a direct probe of multiparticle entanglement. Our results open a path to experimentally testing the fascinating idea that entanglement is the underlying glue that links thermodynamics, statistical mechanics, and quantum gravity.

  1. Functional Equivalency Inferred from “Authoritative Sources” in Networks of Homologous Proteins

    PubMed Central

    Natarajan, Shreedhar; Jakobsson, Eric

    2009-01-01

    A one-on-one mapping of protein functionality across different species is a critical component of comparative analysis. This paper presents a heuristic algorithm for discovering the Most Likely Functional Counterparts (MoLFunCs) of a protein, based on simple concepts from network theory. A key feature of our algorithm is utilization of the user's knowledge to assign high confidence to selected functional identification. We show use of the algorithm to retrieve functional equivalents for 7 membrane proteins, from an exploration of almost 40 genomes from multiple online resources. We verify the functional equivalency of our dataset through a series of tests that include sequence, structure and function comparisons. Comparison is made to the OMA methodology, which also identifies one-on-one mapping between proteins from different species. Based on that comparison, we believe that incorporation of user's knowledge as a key aspect of the technique adds value to purely statistical formal methods. PMID:19521530

  2. From information processing to decisions: Formalizing and comparing psychologically plausible choice models.

    PubMed

    Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten

    2017-08-01

    Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions.
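
    For concreteness, the deterministic take-the-best comparison can be sketched in a few lines; the probabilistic variant studied in the paper additionally assigns each cue an error probability ordered by rank, which is omitted here.

      def take_the_best(cues_a, cues_b, validity_order):
          # Inspect cues from most to least valid and decide on the first cue
          # that discriminates between the two options.
          for cue in validity_order:
              if cues_a[cue] != cues_b[cue]:
                  return 'a' if cues_a[cue] > cues_b[cue] else 'b'
          return 'guess'

      order = ['recognition', 'capital', 'airport']  # hypothetical validity ranking
      a = {'recognition': 1, 'capital': 0, 'airport': 1}
      b = {'recognition': 1, 'capital': 1, 'airport': 0}
      print(take_the_best(a, b, order))  # 'b': decided by the first discriminating cue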

  3. Testing the Self-Consistency of the Excursion Set Approach to Predicting the Dark Matter Halo Mass Function

    NASA Astrophysics Data System (ADS)

    Achitouv, I.; Rasera, Y.; Sheth, R. K.; Corasaniti, P. S.

    2013-12-01

    The excursion set approach provides a framework for predicting how the abundance of dark matter halos depends on the initial conditions. A key ingredient of this formalism is the specification of a critical overdensity threshold (barrier) which protohalos must exceed if they are to form virialized halos at a later time. However, to make its predictions, the excursion set approach explicitly averages over all positions in the initial field, rather than the special ones around which halos form, so it is not clear that the barrier has physical motivation or meaning. In this Letter we show that once the statistical assumptions which underlie the excursion set approach are considered, a drifting diffusing barrier model does provide a good self-consistent description both of halo abundance and of the initial overdensities of the protohalo patches.
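
    The following toy Monte Carlo realizes the ingredients named above: random walks of the smoothed overdensity, absorbed by a barrier that both drifts and diffuses; all parameter values are arbitrary apart from the spherical-collapse threshold 1.686.

      import numpy as np

      def first_crossing_scales(n_walks, n_steps, ds, b0, drift, barrier_var, seed=0):
          # Each walk takes independent Gaussian steps in delta(S) (sharp-k
          # filter) and stops when it first exceeds its own stochastic barrier
          # B(S) = b0 + drift*S + diffusive noise.
          rng = np.random.default_rng(seed)
          crossings = np.full(n_walks, np.nan)
          for i in range(n_walks):
              delta, noise = 0.0, 0.0
              for k in range(1, n_steps + 1):
                  delta += rng.normal(0.0, np.sqrt(ds))
                  noise += rng.normal(0.0, np.sqrt(barrier_var * ds))
                  if delta >= b0 + drift * k * ds + noise:
                      crossings[i] = k * ds
                      break
          return crossings  # their histogram plays the role of the mass function

      scales = first_crossing_scales(2000, 400, ds=0.01, b0=1.686,
                                     drift=0.4, barrier_var=0.25)
      print(np.nanmean(scales))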

  4. Modelling the sensory space of varietal wines: Mining of large, unstructured text data and visualisation of style patterns.

    PubMed

    Valente, Carlo C; Bauer, Florian F; Venter, Fritz; Watson, Bruce; Nieuwoudt, Hélène H

    2018-03-21

    The increasingly large volumes of publicly available sensory descriptions of wine raise the question of whether this source of data can be mined to extract meaningful domain-specific information about the sensory properties of wine. We introduce a novel application of formal concept lattices, in combination with traditional statistical tests, to visualise the sensory attributes of a big data set of some 7,000 Chenin blanc and Sauvignon blanc wines. Complexity was identified as an important driver of style in hitherto uncharacterised Chenin blanc, and the sensory cues for specific styles were identified. This is the first study to apply these methods for the purpose of identifying styles within varietal wines. More generally, our interactive data visualisation and mining driven approach opens up new investigations towards better understanding of the complex field of sensory science.

  5. An Introduction to Distributions Using Weighted Dice

    ERIC Educational Resources Information Center

    Holland, Bart K.

    2011-01-01

    Distributions are the basis for an enormous amount of theoretical and applied work in statistics. While there are formal definitions of distributions and many formulas to characterize them, it is important that students at first get a clear introduction to this basic concept. For many of them, neither words nor formulas can match the power of a…

  6. Uncertainty in eddy covariance measurements and its application to physiological models

    Treesearch

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  7. How Can We Enhance Enjoyment of Secondary School? The Student View

    ERIC Educational Resources Information Center

    Gorard, Stephen; See, Beng Huat

    2011-01-01

    This paper considers enjoyment of formal education for young people aged 14 to 16, largely from their own perspective, based on the view of around 3000 students in England. The data include documentary analysis, official statistics, interviews and surveys with staff and students. Enjoyment of school tends to be promoted by factors such as…

  8. Empirical and Genealogical Analysis of Non-Vocational Adult Education in Europe

    ERIC Educational Resources Information Center

    Manninen, Jyri

    2017-01-01

    Non-formal, non-vocational adult education (NFNVAE) is a low-cost, low-threshold learning activity that generates many benefits for individuals and society, and it should play a more central role in educational policy. NFNVAE's challenge is that it lacks clear concepts and definitions and is, therefore, less systematically covered in statistics,…

  9. A Statistical Ontology-Based Approach to Ranking for Multiword Search

    ERIC Educational Resources Information Center

    Kim, Jinwoo

    2013-01-01

    Keyword search is a prominent data retrieval method for the Web, largely because the simple and efficient nature of keyword processing allows a large amount of information to be searched with fast response. However, keyword search approaches do not formally capture the clear meaning of a keyword query and fail to address the semantic relationships…

  10. Reliability Considerations for the Operation of Large Accelerator User Facilities

    DOE PAGES

    Willeke, F. J.

    2016-01-29

    The lecture provides an overview of considerations relevant for achieving highly reliable operation of accelerator-based user facilities. The article starts with an overview of the statistical reliability formalism, which is followed by high-reliability design considerations with examples. Finally, the article closes with operational aspects of high reliability, such as preventive maintenance and spares inventory.

  11. A Formal Derivation of the Gibbs Entropy for Classical Systems Following the Schrodinger Quantum Mechanical Approach

    ERIC Educational Resources Information Center

    Santillan, M.; Zeron, E. S.; Del Rio-Correa, J. L.

    2008-01-01

    In the traditional statistical mechanics textbooks, the entropy concept is first introduced for the microcanonical ensemble and then extended to the canonical and grand-canonical cases. However, in the authors' experience, this procedure makes it difficult for the student to see the bigger picture and, although quite ingenious, the subtleness of…

  12. Peer Coaching as an Institutionalised Tool for Professional Development: The Perceptions of Tutors in a Nigerian College

    ERIC Educational Resources Information Center

    Aderibigbe, Semiyu Adejare; Ajasa, Folorunso Adekemi

    2013-01-01

    Purpose: The purpose of this paper is to explore the perceptions of college tutors on peer coaching as a tool for professional development to determine its formal institutionalisation. Design/methodology/approach: A survey questionnaire was used for data collection, while analysis of data was done using descriptive statistics. Findings: The…

  13. Open Educational Resources: A Faculty Author's Perspective

    ERIC Educational Resources Information Center

    Illowsky, Barbara

    2012-01-01

    As the coauthor (with Susan Dean) of a formerly for-profit and now open (i.e., free on the web) textbook, "Collaborative Statistics," this author has received many questions about open educational resources (OER), which can be summarized as follows: (1) What are OER?; (2) Why do you support, actively promote, and speak about OER?; (3) If a book is…

  14. 76 FR 30306 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-25

    ...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee on June 14-15, 2011 to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.

  15. 76 FR 43266 - New England Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ...The New England Fishery Management Council (Council) is scheduling a public meeting of its Scientific and Statistical Committee, on August 9-10, 2011, to consider actions affecting New England fisheries in the exclusive economic zone (EEZ). Recommendations from this group will be brought to the full Council for formal consideration and action, if appropriate.

  16. Standing by Their Principles: Two Librarians Who Faced Challenges

    ERIC Educational Resources Information Center

    Adams, Helen; Leu, DaNae; Venuto, Dee Ann

    2015-01-01

    What do school librarians fear most? Hands down, their biggest fear is a formal challenge to a resource in the school library. There are no accurate statistics about the number of challenges to school library resources. The staff of ALA's Office for Intellectual Freedom estimates that only about 20 percent are reported to ALA annually. For the…

  17. Exploring the Implementation, Effectiveness and Costs of the Reading Partners Program

    ERIC Educational Resources Information Center

    Jacob, Robin; Elson, Dean; Bowden, Brooks; Armstrong, Catherine

    2015-01-01

    Reading skills are the key building blocks of a child's formal education. Yet, the national statistics on literacy attainment are profoundly distressing: two out of three American fourth graders are reading below grade level and almost one third of children nationwide lack even basic reading skills. This study reports on an evaluation of the…

  18. 76 FR 66875 - Informal Entry Limit and Removal of a Formal Entry Requirement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-28

    ... to properly assess duties on the merchandise and collect accurate statistics with respect to the.... In Sec. 10.1: a. Introductory paragraph (a) is amended by removing the word ``shall'' and adding in... removing the word ``shall'' and adding in its place the word ``must''; m. Introductory paragraph (h)(4) is...

  19. [Training of residents in obstetrics and gynecology: Assessment of an educational program including formal lectures and practical sessions using simulators].

    PubMed

    Jordan, A; El Haloui, O; Breaud, J; Chevalier, D; Antomarchi, J; Bongain, A; Boucoiran, I; Delotte, J

    2015-01-01

    Evaluate an educational program for the training of residents in gynecology-obstetrics (GO) combining a theory session and a practical session on simulators, and analyze the residents' learning curves. Single-center prospective study at a university hospital (CHU). Two-day sessions were led in April and July 2013. An evaluation on obstetric and gynecological surgery simulators was available to all residents. Theoretical knowledge of the principles of obstetrics was evaluated at the beginning of the session and again after formal lectures had been given. At the end of the first session, a satisfaction questionnaire was distributed to all participants. Twenty residents agreed to participate in the training sessions. Evaluation of theoretical knowledge: at the end of the session, the residents showed a significant improvement in their score on a 20-point knowledge test. Obstetrical simulator: scores on the simulated vaginal delivery assessments improved significantly between the first and second sessions. Subjectively, residents reported a larger gain in confidence after the breech delivery simulation than after the cephalic vaginal delivery; however, their confidence level following breech delivery had not improved further by the end of the second session. Simulation in gynecological surgery: a trend towards improved times on the peg-transfer task was noted between the two sessions. In the virtual simulation, no statistically significant differences were observed, with no improvement in salpingectomy times. Subjectively, the residents felt an increase in the precision of their gestures. Satisfaction: all residents tried the whole program. They considered that continuing these simulator sessions was necessary and should even be mandatory. The approach chosen by this structured educational program allowed residents to progress, both objectively and subjectively. A simulation program of this type could be used to assess residents' skills and to develop learning curves.

  20. Reasoning with Conditionals: A Test of Formal Models of Four Theories

    ERIC Educational Resources Information Center

    Oberauer, Klaus

    2006-01-01

    The four dominant theories of reasoning from conditionals are translated into formal models: The theory of mental models (Johnson-Laird, P. N., & Byrne, R. M. J. (2002). Conditionals: a theory of meaning, pragmatics, and inference. "Psychological Review," 109, 646-678), the suppositional theory (Evans, J. S. B. T., & Over, D. E. (2004). "If."…

  1. Formal and Informal Experiential Realms in German as a Foreign Language: A Preliminary Investigation

    ERIC Educational Resources Information Center

    Moyer, Alene

    2005-01-01

    In this study of German as a foreign language, formal classroom experience is compared with informal use of German outside the classroom focusing on three syntactic features: main clause word order (subject-verb-object, or SVO), topicalization (subject-verb inversion), and subordinate word order (subject-object-verb, or SOV). T tests and…

  2. Understanding Informal and Formal Mathematical Abilities in Mainland Chinese and Chinese-American Children.

    ERIC Educational Resources Information Center

    Zhou, Zheng; Cheng, Christine; Mottram, Lisa; Rosenblum, Stacey

    Informal and formal mathematical abilities were studied in preschool, kindergarten, and first grade children in Beijing, China, and Chinese-American children in New York City. The Test of Early Mathematical Abilities-2nd Edition (TEMA-2) was administered to the three groups of children (children from Beijing, Chinese-American from lower-class, and…

  3. Quantum-Like Models for Decision Making in Psychology and Cognitive Science

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2009-02-01

    We show that (in contrast to rather common opinion) the domain of applications of the mathematical formalism of quantum mechanics is not restricted to physics. This formalism can be applied to the description of various quantum-like (QL) information processing. In particular, the calculus of quantum (and more general QL) probabilities can be used to explain some paradoxical statistical data which was collected in psychology and cognitive science. The main lesson of our study is that one should sharply distinguish the mathematical apparatus of QM from QM as a physical theory. The domain of application of the mathematical apparatus is essentially wider than quantum physics.
    Keywords: quantum-like representation algorithm, formula of total probability, interference of probabilities, psychology, cognition, decision making.

  4. Relationships among selected physical science misconceptions held by preservice elementary teachers and four variables: Formal reasoning ability, working memory capacity, verbal intelligence, and field dependence/independence

    NASA Astrophysics Data System (ADS)

    Griffin, Leslie Little

    The purpose of this study was to determine the relationship between selected cognitive abilities and physical science misconceptions held by preservice elementary teachers. The cognitive abilities under investigation were: formal reasoning ability as measured by the Lawson Classroom Test of Formal Reasoning (Lawson, 1978); working memory capacity as measured by the Figural Intersection Test (Burtis & Pascual-Leone, 1974); verbal intelligence as measured by the Acorn National Academic Aptitude Test: Verbal Intelligence (Kobal, Wrightstone, & Kunze, 1944); and field dependence/independence as measured by the Group Embedded Figures Test (Witkin, Oltman, & Raskin, 1971). The number of physical science misconceptions held by preservice elementary teachers was measured by the Misconceptions in Science Questionnaire (Franklin, 1992). The data utilized in this investigation were obtained from 36 preservice elementary teachers enrolled in two sections of a science methods course at a small regional university in the southeastern United States. Multiple regression techniques were used to analyze the collected data. The following conclusions were reached after analysis of the data. The variables of formal reasoning ability and verbal intelligence were identified as having significant relationships, both individually and in combination, with the dependent variable of selected physical science misconceptions. Though the correlations were not high enough to yield strong predictors of physical science misconceptions or strong relationships, they were of sufficient magnitude to warrant further investigation. It is recommended that further investigation be conducted replicating this study with a larger sample size. In addition, experimental research should be implemented to explore the relationships suggested in this study between the cognitive variables of formal reasoning ability and verbal intelligence and the dependent variable of selected physical science misconceptions. Further research should also focus on the detection of a broad range of science misconceptions among preservice elementary teachers.

  5. Valuing the benefits of genetic testing for retinitis pigmentosa: a pilot application of the contingent valuation method.

    PubMed

    Eden, Martin; Payne, Katherine; Combs, Ryan M; Hall, Georgina; McAllister, Marion; Black, Graeme C M

    2013-08-01

    Technological advances present an opportunity for more people with, or at risk of developing, retinitis pigmentosa (RP) to be offered genetic testing. Valuation of these tests using current evaluative frameworks is problematic since benefits may be derived from diagnostic information rather than improvements in health. This pilot study aimed to explore whether the contingent valuation method (CVM) can be used to value the benefits of genetic testing for RP. CVM was used to elicit willingness-to-pay (WTP) values for (1) genetic counselling and (2) genetic counselling with genetic testing. Telephone and face-to-face interviews with a purposive sample of individuals with (n=25), and without (n=27), prior experience of RP were used to explore the feasibility and validity of CVM in this context. Faced with a hypothetical scenario, the majority of participants stated that they would seek genetic counselling and testing in the context of RP. Between participant groups, respondents offered similar justifications for stated WTP values. Overall stated WTP was higher for genetic counselling plus testing (median=£524.00) compared with counselling alone (median=£224.50). Between-group differences in stated WTP were statistically significant; participants with prior knowledge of the condition were willing to pay more for genetic ophthalmology services. Participants were able to attach a monetary value to the perceived potential benefit that genetic testing offered regardless of prior experience of the condition. This exploratory work represents an important step towards evaluating these services using formal cost-benefit analysis.

  6. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
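
    The core comparison step lends itself to a compact sketch; the function names and scalar-output assumption are illustrative, whereas the actual pipeline animates PVS models via PVSio.

      def outputs_agree(model_fn, impl_fn, test_inputs, tol=1e-9):
          # Evaluate the formal model and the software implementation on the
          # same inputs and collect cases that differ by more than tol.
          failures = []
          for x in test_inputs:
              expected, actual = model_fn(x), impl_fn(x)
              if abs(expected - actual) > tol:
                  failures.append((x, expected, actual))
          return failures  # an empty list means agreement on this suite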

  7. INVESTIGATING DIFFERENCES IN BRAIN FUNCTIONAL NETWORKS USING HIERARCHICAL COVARIATE-ADJUSTED INDEPENDENT COMPONENT ANALYSIS.

    PubMed

    Shi, Ran; Guo, Ying

    2016-12-01

    Human brains perform tasks via complex functional networks consisting of separated brain regions. A popular approach to characterize brain functional networks in fMRI studies is independent component analysis (ICA), which is a powerful method to reconstruct latent source signals from their linear mixtures. In many fMRI studies, an important goal is to investigate how brain functional networks change according to specific clinical and demographic variabilities. Existing ICA methods, however, cannot directly incorporate covariate effects in ICA decomposition. Heuristic post-ICA analysis to address this need can be inaccurate and inefficient. In this paper, we propose a hierarchical covariate-adjusted ICA (hc-ICA) model that provides a formal statistical framework for estimating covariate effects and testing differences between brain functional networks. Our method provides a more reliable and powerful statistical tool for evaluating group differences in brain functional networks while appropriately controlling for potential confounding factors. We present an analytically tractable EM algorithm to obtain maximum likelihood estimates of our model. We also develop a subspace-based approximate EM that runs significantly faster while retaining high accuracy. To test the differences in functional networks, we introduce a voxel-wise approximate inference procedure which eliminates the need of computationally expensive covariance matrix estimation and inversion. We demonstrate the advantages of our methods over the existing method via simulation studies. We apply our method to an fMRI study to investigate differences in brain functional networks associated with post-traumatic stress disorder (PTSD).

  8. Formal faculty observation and assessment of bedside skills for 3rd-year neurology clerks

    PubMed Central

    Mooney, Christopher; Wexler, Erika; Mink, Jonathan; Post, Jennifer; Jozefowicz, Ralph F.

    2016-01-01

    Objective: To evaluate the feasibility and utility of instituting a formalized bedside skills evaluation (BSE) for 3rd-year medical students on the neurology clerkship. Methods: A neurologic BSE was developed for 3rd-year neurology clerks at the University of Rochester for the 2012–2014 academic years. Faculty directly observed 189 students completing a full history and neurologic examination on real inpatients. Mock grades were calculated utilizing the BSE in the final grade, and the number of students with a grade difference was determined when compared to true grade. Correlation was explored between the BSE and clinical scores, National Board of Medical Examiners (NBME) scores, case complexity, and true final grades. A survey was administered to students to assess their clinical skills exposure and the usefulness of the BSE. Results: Faculty completed and submitted a BSE form for 88.3% of students. There was a mock final grade change for 13.2% of students. Correlation coefficients between BSE score and clinical score/NBME score were 0.36 and 0.35, respectively. A statistically significant effect of BSE was found on final clerkship grade (F(2,186) = 31.9, p < 0.0001). There was no statistical difference between BSE score and differing case complexities. Conclusions: Incorporating a formal faculty-observed BSE into the 3rd-year neurology clerkship was feasible. Low correlation between BSE score and other evaluations indicated a unique measurement to contribute to student grade. Using real patients with differing case complexity did not alter the grade. PMID:27770072

  9. Insight into structural phase transitions from the decoupled anharmonic mode approximation

    NASA Astrophysics Data System (ADS)

    Adams, Donat J.; Passerone, Daniele

    2016-08-01

    We develop a formalism (decoupled anharmonic mode approximation, DAMA) that allows calculation of the vibrational free energy using density functional theory even for materials which exhibit negative curvature of the potential energy surface with respect to atomic displacements. We investigate vibrational modes beyond the harmonic approximation and approximate the potential energy surface with the superposition of the accurate potential along each normal mode. We show that the free energy can stabilize crystal structures at finite temperatures which appear dynamically unstable at T = 0. The DAMA formalism is computationally fast because it avoids statistical sampling through molecular dynamics calculations, and is in principle completely ab initio. It is free of statistical uncertainties and independent of model parameters, but can give insight into the mechanism of a structural phase transition. We apply the formalism to the perovskite cryolite, and investigate the temperature-driven phase transition from the P21/n to the Immm space group. We calculate a phase transition temperature between 710 and 950 K, in fair agreement with the experimental value of 885 K. This can be related to the underestimation of the interaction of the vibrational states. We also calculate the main axes of the thermal ellipsoid and can explain the experimentally observed increase of its volume for fluorine by 200-300% throughout the phase transition. Our calculations suggest the appearance of tunneling states in the high-temperature phase. The convergence of the vibrational DOS and of the critical temperature with respect to reciprocal-space sampling is investigated using the polarizable-ion model.
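
    In the spirit of the mode-by-mode treatment described here, the sketch below computes one mode's free energy by diagonalizing a 1D Schrödinger problem on a grid and summing the Boltzmann series over its eigenvalues; the units (hbar = m = 1, potential read in eV) and the double-well parameters are toy choices, not the paper's DFT inputs.

      import numpy as np
      from scipy.linalg import eigh_tridiagonal

      def mode_free_energy(potential, x, temperature_k):
          # Finite-difference eigenvalues of -(1/2) d^2/dx^2 + V(x), then
          # F = E0 - kT * ln(sum_n exp(-(E_n - E0)/kT)).
          kb = 8.617333262e-5  # eV/K, assuming V is given in eV (toy units)
          dx = x[1] - x[0]
          diag = 1.0 / dx**2 + potential(x)
          off = -0.5 / dx**2 * np.ones(len(x) - 1)
          energies = eigh_tridiagonal(diag, off, eigvals_only=True)
          z = np.sum(np.exp(-(energies - energies[0]) / (kb * temperature_k)))
          return energies[0] - kb * temperature_k * np.log(z)

      # A double-well mode: negative curvature at the origin, as in a
      # dynamically unstable phonon.
      x = np.linspace(-2.0, 2.0, 800)
      print(mode_free_energy(lambda q: -0.5 * q**2 + 0.25 * q**4, x, 300.0))

    Summing such per-mode terms over all normal modes would give a DAMA-style estimate of the total vibrational free energy.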

  10. Insight into structural phase transitions from the decoupled anharmonic mode approximation.

    PubMed

    Adams, Donat J; Passerone, Daniele

    2016-08-03

    We develop a formalism (decoupled anharmonic mode approximation, DAMA) that allows calculation of the vibrational free energy using density functional theory even for materials which exhibit negative curvature of the potential energy surface with respect to atomic displacements. We investigate vibrational modes beyond the harmonic approximation and approximate the potential energy surface with the superposition of the accurate potential along each normal mode. We show that the free energy can stabilize crystal structures at finite temperatures which appear dynamically unstable at T = 0. The DAMA formalism is computationally fast because it avoids statistical sampling through molecular dynamics calculations, and is in principle completely ab initio. It is free of statistical uncertainties and independent of model parameters, but can give insight into the mechanism of a structural phase transition. We apply the formalism to the perovskite cryolite, and investigate the temperature-driven phase transition from the P21/n to the Immm space group. We calculate a phase transition temperature between 710 and 950 K, in fair agreement with the experimental value of 885 K. This can be related to the underestimation of the interaction of the vibrational states. We also calculate the main axes of the thermal ellipsoid and can explain the experimentally observed increase of its volume for fluorine by 200-300% throughout the phase transition. Our calculations suggest the appearance of tunneling states in the high-temperature phase. The convergence of the vibrational DOS and of the critical temperature with respect to reciprocal-space sampling is investigated using the polarizable-ion model.

  11. Formal faculty observation and assessment of bedside skills for 3rd-year neurology clerks.

    PubMed

    Thompson Stone, Robert; Mooney, Christopher; Wexler, Erika; Mink, Jonathan; Post, Jennifer; Jozefowicz, Ralph F

    2016-11-22

    To evaluate the feasibility and utility of instituting a formalized bedside skills evaluation (BSE) for 3rd-year medical students on the neurology clerkship. A neurologic BSE was developed for 3rd-year neurology clerks at the University of Rochester for the 2012-2014 academic years. Faculty directly observed 189 students completing a full history and neurologic examination on real inpatients. Mock grades were calculated utilizing the BSE in the final grade, and the number of students with a grade difference was determined when compared to true grade. Correlation was explored between the BSE and clinical scores, National Board of Medical Examiners (NBME) scores, case complexity, and true final grades. A survey was administered to students to assess their clinical skills exposure and the usefulness of the BSE. Faculty completed and submitted a BSE form for 88.3% of students. There was a mock final grade change for 13.2% of students. Correlation coefficients between BSE score and clinical score/NBME score were 0.36 and 0.35, respectively. A statistically significant effect of BSE was found on final clerkship grade (F(2,186) = 31.9, p < 0.0001). There was no statistical difference between BSE score and differing case complexities. Incorporating a formal faculty-observed BSE into the 3rd-year neurology clerkship was feasible. Low correlation between BSE score and other evaluations indicated a unique measurement to contribute to student grade. Using real patients with differing case complexity did not alter the grade.

  12. Perceptions of registered nurses in four state health institutions on continuing formal education.

    PubMed

    Richards, L; Potgieter, E

    2010-06-01

    This study investigated the perceptions of registered nurses in four selected state health institutions with regard to continuing formal education. The relevance of continuing formal education is being emphasised globally by the increasing quest for quality assurance and quality management systems within an ethos of continuous improvement. According to Tlholoe (2006:5), it is important to be committed to continual learning, as people's knowledge becomes less relevant because skills gained early in a career are insufficient to avoid costly mistakes made through ignorance. Continuing formal education in nursing is a key element in the maintenance of quality in health care delivery. The study described registered nurses' views on continuing formal education and their perceived barriers to continuing formal education. A quantitative descriptive survey design was chosen using a questionnaire for data collection. The sample consisted of 40 registered nurses working at four state health institutions in the Western Cape Province, South Africa. Convenience sampling was selected to include registered nurses who were on duty on the days during which the researcher visited the health institutions to distribute the questionnaires. The questionnaire contained mainly closed-ended and a few open-ended questions. Content validity of the instrument was ensured by doing a thorough literature review before construction of items and a pretest. Reliability was established by the pretest and by providing the same information to all respondents before completion of the questionnaires. The ethical considerations of informed consent, anonymity and confidentiality were adhered to, and consent to conduct the study was obtained from relevant authorities. Descriptive statistics, based on calculations using the Microsoft (MS) Excel (for Windows 2000) programme, were used to summarise and describe the research results. The research results indicated that most registered nurses perceive continuing formal education as beneficial to their personal and professional growth and that it could lead towards improving the quality of patient/client care, but barriers exist which prevent or deter them from undertaking continuing formal education programmes. The main structural barriers included lack of funding and lack of coherent staff development planning; physical barriers included job and family responsibilities.

  13. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    NASA Technical Reports Server (NTRS)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by the APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed plan of action that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.

  14. 34 CFR 462.3 - What definitions apply?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... items across pre- and post-testing. Test administrator means an individual who is trained to administer... instructional time a student needs before post-testing. Violation of these protocols often invalidates the test... defined in the Act. Test means a standardized test, assessment, or instrument that has a formal protocol...

  15. 34 CFR 462.3 - What definitions apply?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... items across pre- and post-testing. Test administrator means an individual who is trained to administer... instructional time a student needs before post-testing. Violation of these protocols often invalidates the test... defined in the Act. Test means a standardized test, assessment, or instrument that has a formal protocol...

  16. 34 CFR 462.3 - What definitions apply?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... items across pre- and post-testing. Test administrator means an individual who is trained to administer... instructional time a student needs before post-testing. Violation of these protocols often invalidates the test... defined in the Act. Test means a standardized test, assessment, or instrument that has a formal protocol...

  17. 34 CFR 462.3 - What definitions apply?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... items across pre- and post-testing. Test administrator means an individual who is trained to administer... instructional time a student needs before post-testing. Violation of these protocols often invalidates the test... defined in the Act. Test means a standardized test, assessment, or instrument that has a formal protocol...

  18. 34 CFR 462.3 - What definitions apply?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... items across pre- and post-testing. Test administrator means an individual who is trained to administer... instructional time a student needs before post-testing. Violation of these protocols often invalidates the test... defined in the Act. Test means a standardized test, assessment, or instrument that has a formal protocol...

  19. Verification of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    The emergent properties of swarms make swarm-based missions powerful but, at the same time, more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for the verification and validation of swarm-based missions. The Autonomous Nano-Technology Swarm (ANTS) mission is being used as an example and case study of swarm-based missions, in order to experiment with and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior. This paper introduces how intelligent swarm technology is being proposed for NASA missions and gives the results of a comparison of several formal methods and approaches for specifying intelligent swarm-based systems and their effectiveness for predicting emergent behavior.

  20. Virginity Testing Beyond a Medical Examination

    PubMed Central

    Robatjazi, Mehri; Simbar, Masoumeh; Nahidi, Fatemeh; Gharehdaghi, Jaber; Emamhadi, Mohammadali; Vedadhir, Abou-Ali; Alavimajd, Hamid

    2016-01-01

    Apart from religious values, virginity is important in different communities because of its prominent role in reducing sexually transmitted diseases and teen pregnancies. Even though virginity testing has been proclaimed an example of violence against women by the World Health Organization, it is still conducted in many countries, including Iran. Sixteen in-depth, semi-structured interviews were conducted with participants aged 32 to 60 years to elucidate the perceptions and experiences of Iranian examiners of virginity testing. The perceptions and experiences of the examiners were reflected in five main themes. The results of this study indicated that virginity testing is more than a medical examination, considering the cultural factors involved and its overt and covert consequences. In Iran, testing is performed for both formal and informal reasons, and examiners view such testing with ambiguity about the accuracy and certainty of the diagnosis and uncertainty about ethics and reproductive rights. Examiners are affected by the overt and covert consequences of virginity testing, the beliefs and cultural values underlying virginity testing, and the informal and formal reasons for virginity testing. PMID:26925894

  1. Kindergarten Predictors of Math Learning Disability

    PubMed Central

    Mazzocco, Michèle M. M.; Thompson, Richard E.

    2009-01-01

    The aim of the present study was to address how to effectively predict mathematics learning disability (MLD). Specifically, we addressed whether cognitive data obtained during kindergarten can effectively predict which children will have MLD in third grade, whether an abbreviated test battery could be as effective as a standard psychoeducational assessment at predicting MLD, and whether the abbreviated battery corresponded to the literature on MLD characteristics. Participants were 226 children who enrolled in a 4-year prospective longitudinal study during kindergarten. We administered measures of mathematics achievement, formal and informal mathematics ability, visual-spatial reasoning, and rapid automatized naming and examined which test scores and test items from kindergarten best predicted MLD at grades 2 and 3. Statistical models using standardized scores from the entire test battery correctly classified ~80–83 percent of the participants as having, or not having, MLD. Regression models using scores from only individual test items were less predictive than models containing the standard scores, except for models using a specific subset of test items that dealt with reading numerals, number constancy, magnitude judgments of one-digit numbers, or mental addition of one-digit numbers. These models were as accurate in predicting MLD as was the model including the entire set of standard scores from the battery of tests examined. Our findings indicate that it is possible to effectively predict which kindergartners are at risk for MLD, and thus the findings have implications for early screening of MLD. PMID:20084182
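
    A hedged illustration of the classification approach described above: a logistic regression over standardized kindergarten scores, with synthetic data and invented variable roles (the study's actual test battery and model coefficients are not reproduced here).

      # Sketch only: synthetic scores and labels; the accuracy printed reflects
      # the made-up data, not the study's reported 80-83 percent.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n = 226  # matches the study's sample size, nothing else does
      # Hypothetical standardized scores: math achievement, informal math,
      # rapid automatized naming, visual-spatial reasoning
      X = rng.normal(size=(n, 4))
      # Synthetic MLD labels loosely tied to the math-related scores
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n) < -1.5).astype(int)

      model = LogisticRegression()
      acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
      print(f"cross-validated accuracy: {acc.mean():.2f}")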

  2. BRIDGE21--Exploring the Potential to Foster Intrinsic Student Motivation through a Team-Based, Technology-Mediated Learning Model

    ERIC Educational Resources Information Center

    Lawlor, John; Marshall, Kevin; Tangney, Brendan

    2016-01-01

    It is generally accepted that intrinsic student motivation is a critical requirement for effective learning but formal learning in school places a huge reliance on extrinsic motivation to focus the learner. This reliance on extrinsic motivation is driven by the pressure on formal schooling to "deliver to the test." The experience of the…

  3. An Exploration of the Interplay of Students' Dispositions to Critical Thinking, Formal Thinking and Procedural Knowledge in Science.

    ERIC Educational Resources Information Center

    Ferguson, Nicole; Vazquez-Abad, Jesus

    This paper describes part of a study undertaken to examine the relationship between dispositions to critical thinking, procedural knowledge in science, and formal reasoning. Three tests were administered to 346 grade 7 students at the beginning and at the end of the school year: California Critical Thinking Dispositions Inventory, Group Assessment…

  4. Formal Lifelong E-Learning for Employability and Job Stability during Turbulent Times in Spain

    ERIC Educational Resources Information Center

    Martínez-Cerdá, Juan-Francisco; Torrent-Sellens, Joan

    2017-01-01

    In recent decades, international organizations have developed initiatives that incorporate lifelong learning as a tool to increase the employability of citizens. In this context, the goal of this research is to test the influence of formal e-learning on estimating employment status. The research made use of a sample of 595 citizens in 2007 and…

  5. Does Formal Assessment of Comprehension by SLT Agree with Teachers' Perceptions of Functional Comprehension Skills in the Classroom?

    ERIC Educational Resources Information Center

    Purse, Katie; Gardner, Hilary

    2013-01-01

    This study aimed to consider collaborative practice in contributing to joint assessment and producing appropriate referral of children to speech and language therapy (SLT). Results of formal testing of selected comprehension skills are compared with functional/classroom performance as rated by class teachers. Thirty children aged 6.5-8.4 years,…

  6. Age and Schooling Effects on Early Literacy and Phoneme Awareness

    ERIC Educational Resources Information Center

    Cunningham, Anna; Carroll, Julia

    2011-01-01

    Previous research on age and schooling effects is largely restricted to studies of children who begin formal schooling at 6 years of age, and the measures of phoneme awareness used have typically lacked sensitivity for beginning readers. Our study addresses these issues by testing 4 to 6 year-olds (first 2 years of formal schooling in the United…

  7. Does basing an intervention on behavioral theory enhance the efficacy/effectiveness on dietary change for obesity prevention among children? A systematic review and meta-analysis

    USDA-ARS?s Scientific Manuscript database

    Our purpose was to test whether interventions based on theory, multiple theories, or a formal planning process were more effective in changing fruit and vegetable (FV) consumption among children than interventions with no behavioral theoretical foundation or no formal planning. The authors conducted...

  8. Optical simulation of a Popescu-Rohrlich Box.

    PubMed

    Chu, Wen-Jing; Zong, Xiao-Lan; Yang, Ming; Pan, Guo-Zhu; Cao, Zhuo-Liang

    2016-06-22

    It is well known that the fair-sampling loophole in a Bell test, opened by the selection of the state to be measured, can lead to post-quantum correlations. In this paper, we make the selection of the results after measurement, which opens the fair-sampling loophole too and thus can also lead to post-quantum correlations. This kind of result-selection loophole can be realized by pre- and post-selection processes within the "two-state vector formalism", and a physical simulation of the Popescu-Rohrlich (PR) box is designed in a linear optical system. The probability distribution of the PR box has a maximal CHSH value of 4, i.e. it can maximally violate the CHSH inequality. Because the "two-state vector formalism" violates information causality, it opens the locality loophole too, which means that this kind of result selection within the "two-state vector formalism" leads to both the fair-sampling loophole and the locality loophole, so we call it a comprehensive loophole in a Bell test. The comprehensive loophole opened by the result selection within the "two-state vector formalism" may be another possible explanation of why post-quantum correlations are incompatible with quantum mechanics and seem not to exist in nature.
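
    The CHSH value of 4 quoted above can be checked directly from the PR-box distribution, P(a,b|x,y) = 1/2 when a XOR b = x AND y and 0 otherwise:

      # Verify the maximal CHSH value of the Popescu-Rohrlich box.
      def pr_prob(a, b, x, y):
          return 0.5 if (a ^ b) == (x & y) else 0.0

      def correlator(x, y):
          # E(x,y) = sum over outcomes of (-1)^(a XOR b) * P(a,b|x,y)
          return sum((-1) ** (a ^ b) * pr_prob(a, b, x, y)
                     for a in (0, 1) for b in (0, 1))

      S = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
      print(S)  # 4.0, beyond the quantum (Tsirelson) bound of 2*sqrt(2)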

  9. OI Issues: Hearing Loss

    MedlinePlus

    ... a formal audiologic assessment regardless of age. If borderline hearing is discovered, then yearly testing with a certified audiologist is recommended. Adults with borderline hearing should have yearly testing and follow up ...

  10. Effects of the Boy Scouts of America Personal Fitness Merit Badge on Cardio-Metabolic Risk, Health Related Fitness and Physical Activity in Adolescent Boys.

    PubMed

    Maxwell, Justin; Burns, Ryan D; Brusseau, Timothy A

    2017-01-01

    A growing number of adolescents are more sedentary and have fewer formal opportunities to participate in physical activity. With mounting evidence that sedentary time has a negative impact on cardio-metabolic profiles, health-related fitness and physical activity, there is a pressing need for an affordable adolescent physical activity intervention. One possible intervention that has been overlooked in the past is the Boy Scouts of America. There are nearly 900,000 adolescent boys who participate in Boy Scouts in the United States. The purpose of this research study was to evaluate the effect of the Personal Fitness merit badge system on physical activity, health-related fitness, and cardio-metabolic blood profiles in Boy Scouts 11-17 years of age. Participants were fourteen (N = 14) Boy Scouts from the Great Salt Lake Council of the Boy Scouts of America who earned their Personal Fitness merit badge. Classes were held in the spring of 2016, during which the boys received the information needed to obtain the merit badge and data were collected. Results from the related-samples Wilcoxon signed rank test showed that the median difference between VO2 peak pre-test and post-test scores was statistically significant (p = 0.004). However, the differences in Pre-MetS (metabolic syndrome) and Post-MetS scores (p = 0.917), average steps taken per day (p = 0.317), and BMI (p = 0.419) were not statistically significant. In conclusion, the merit badge program had a positive impact on cardiovascular endurance, suggesting this program has potential to improve cardiovascular fitness and should be considered for boys participating in Boy Scouts.
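
    For readers unfamiliar with the related-samples Wilcoxon signed-rank test used above, a sketch with invented VO2 peak values (n = 14 mirrors the study's sample size; the numbers themselves are made up):

      # Paired Wilcoxon signed-rank test on invented pre/post VO2 peak values.
      from scipy import stats

      vo2_pre  = [38.2, 41.5, 36.0, 44.1, 39.8, 35.6, 42.3,
                  37.9, 40.4, 43.0, 36.8, 39.1, 41.0, 38.5]
      vo2_post = [39.1, 42.7, 37.5, 45.9, 41.9, 38.0, 45.0,
                  40.9, 41.0, 43.3, 37.8, 41.3, 43.5, 41.3]

      stat, p = stats.wilcoxon(vo2_pre, vo2_post)
      print(f"W = {stat}, p = {p:.4f}")  # small p suggests a real pre/post shift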

  11. Chlorine-36 data at Yucca Mountain: Statistical tests of conceptual models for unsaturated-zone flow

    USGS Publications Warehouse

    Campbell, K.; Wolfsberg, A.; Fabryka-Martin, J.; Sweetkind, D.

    2003-01-01

    An extensive set of chlorine-36 (36Cl) data has been collected in the Exploratory Studies Facility (ESF), an 8-km-long tunnel at Yucca Mountain, Nevada, for the purpose of developing and testing conceptual models of flow and transport in the unsaturated zone (UZ) at this site. At several locations, the measured values of 36Cl/Cl ratios for salts leached from rock samples are high enough to provide strong evidence that at least a small component of bomb-pulse 36Cl, fallout from atmospheric testing of nuclear devices in the 1950s and 1960s, was measured, implying that some fraction of the water traveled from the ground surface through 200-300 m of unsaturated rock to the level of the ESF during the last 50 years. These data are analyzed here using a formal statistical approach based on log-linear models to evaluate alternative conceptual models for the distribution of such fast flow paths. The most significant determinant of the presence of bomb-pulse 36Cl in a sample from the welded Topopah Spring unit (TSw) is the structural setting from which the sample was collected. Our analysis generally supports the conceptual model that a fault that cuts through the nonwelded Paintbrush tuff unit (PTn) that overlies the TSw is required in order for bomb-pulse 36Cl to be transmitted to the sample depth in less than 50 years. Away from PTn-cutting faults, the ages of water samples at the ESF appear to be a strong function of the thickness of the nonwelded tuff between the ground surface and the ESF, due to slow matrix flow in that unit. © 2002 Elsevier Science B.V. All rights reserved.
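
    A hedged sketch of a log-linear analysis in this spirit: a Poisson GLM relating cell counts of bomb-pulse 36Cl detections to structural setting. The categories and counts are invented, not the study's data.

      # Log-linear (Poisson) model on a 2x2 table of invented counts.
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.DataFrame({
          "setting":    ["fault", "fault", "intact", "intact"],
          "bomb_pulse": ["yes",   "no",    "yes",    "no"],
          "count":      [18,      7,       3,        41],   # made-up cell counts
      })
      model = smf.glm("count ~ setting * bomb_pulse", data=df,
                      family=sm.families.Poisson()).fit()
      print(model.summary().tables[1])  # the interaction term carries the association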

  12. Line Mixing in Parallel and Perpendicular Bands of CO2: A Further Test of the Refined Robert-Bonamy Formalism

    NASA Technical Reports Server (NTRS)

    Boulet, C.; Ma, Qiancheng; Tipping, R. H.

    2015-01-01

    Starting from the refined Robert-Bonamy formalism [Q. Ma, C. Boulet, and R. H. Tipping, J. Chem. Phys. 139, 034305 (2013)], we propose here an extension of line mixing studies to infrared absorptions of linear polyatomic molecules having stretching and bending modes. The present formalism does not neglect the internal degrees of freedom of the perturbing molecules, contrary to the energy corrected sudden (ECS) modeling, and enables one to calculate the whole relaxation matrix starting from the potential energy surface. Meanwhile, similar to the ECS modeling, the present formalism properly accounts for roles played by all the internal angular momenta in the coupling process, including the vibrational angular momentum. The formalism has been applied to the important case of CO2 broadened by N2. Applications to two kinds of vibrational bands (sigma yields sigma and sigma yields pi) have shown that the present results are in good agreement with both experimental data and results derived from the ECS model.

  13. Removing interference-based effects from the infrared transflectance spectra of thin films on metallic substrates: a fast and wave optics conform solution.

    PubMed

    Mayerhöfer, Thomas G; Pahlow, Susanne; Hübner, Uwe; Popp, Jürgen

    2018-06-25

    A hybrid formalism combining elements from Kramers-Kronig based analyses and dispersion analysis was developed, which allows removing interference-based effects in the infrared spectra of layers on highly reflecting substrates. In order to enable a highly convenient application, the correction procedure is fully automatized and usually requires less than a minute with non-optimized software on a typical office PC. The formalism was tested with both synthetic and experimental spectra of poly(methyl methacrylate) on gold. The results confirmed the usefulness of the formalism: apparent peak ratios as well as the interference fringes in the original spectra were successfully corrected. Accordingly, the introduced formalism makes it possible to use inexpensive and robust highly reflecting substrates for routine infrared spectroscopic investigations of layers or films the thickness of which is limited by the imperative that reflectance absorbance must be smaller than about 1. For thicker films the formalism is still useful, but requires estimates for the optical constants.

  14. Participation Trends and Patterns in Adult Education: 1991-1999. Statistical Analysis Report.

    ERIC Educational Resources Information Center

    Creighton, Sean; Hudson, Lisa

    Participation of U.S. adults in formal learning activities during the 1990s was examined by analyzing data from the 1991, 1995, and 1999 Adult Education Surveys that were part of the National Household Education Surveys Program. Overall, participation in adult education between 1991 and 1999 increased among all but one age group (35-44 years), all…

  15. A Short Biography of Paul A. M. Dirac and Historical Development of Dirac Delta Function

    ERIC Educational Resources Information Center

    Debnath, Lokenath

    2013-01-01

    This paper deals with a short biography of Paul Dirac, his first celebrated work on quantum mechanics, his first formal systematic use of the Dirac delta function and his famous work on quantum electrodynamics and quantum statistics. Included are his first discovery of the Dirac relativistic wave equation, existence of positron and the intrinsic…

  16. Closing the gap in academic readiness and achievement: the role of early childcare

    PubMed Central

    Geoffroy, Marie-Claude; Côté, Sylvana. M.; Giguère, Charles-Édouard; Dionne, Ginette; Zelazo, Philip David; Tremblay, Richard E.; Boivin, Michel; Séguin, Jean. R.

    2012-01-01

    Background Socially disadvantaged children with academic difficulties at school entry are at increased risk for poor health and psychosocial outcomes. Our objective is to test the possibility that participation in childcare – at the population level – could attenuate the gap in academic readiness and achievement between children with and without a social disadvantage (indexed by low levels of maternal education). Methods A cohort of infants born in the Canadian province of Quebec in 1997/1998 was selected through birth registries and followed annually until 7 years of age (n = 1,863). Children receiving formal childcare (i.e., center-based or non-relative out-of-home) were distinguished from those receiving informal childcare (i.e., relative or nanny). Measures from 4 standardized tests that assessed cognitive school readiness (Lollipop Test for School Readiness), receptive vocabulary (Peabody Picture Vocabulary Test Revised), mathematics (Number Knowledge Test), and reading performance (Kaufman Assessment Battery for children) were administered at 6 and 7 years. Results Children of mothers with low levels of education showed a consistent pattern of lower scores on academic readiness and achievement tests at 6 and 7 years than those of highly educated mothers, unless they received formal childcare. Specifically, among children of mothers with low levels of education, those who received formal childcare obtained higher school readiness (d = 0.87), receptive vocabulary (d = 0.36), reading (d = 0.48) and math achievement scores (d = 0.38; although not significant at 5%) in comparison with those who were cared for by their parents. Childcare participation was not associated with cognitive outcomes among children of mothers with higher levels of education. Conclusions Public investments in early childcare are increasing in many countries with the intention of reducing cognitive inequalities between disadvantaged and advantaged children. Our findings provide further evidence suggesting that formal childcare could represent a preventative means of attenuating effects of disadvantage on children’s early academic trajectory. PMID:20883519

  17. A compliance assessment of midpoint formative assessments completed by APPE preceptors.

    PubMed

    Lea Bonner, C; Staton, April G; Naro, Patricia B; McCullough, Elizabeth; Lynn Stevenson, T; Williamson, Margaret; Sheffield, Melody C; Miller, Mindi; Fetterman, James W; Fan, Shirley; Momary, Kathryn M

    Experiential pharmacy preceptors should provide formative and summative feedback during a learning experience. Preceptors are required to provide colleges and schools of pharmacy with assessments or evaluations of students' performance. Students and experiential programs value on-time completion of midpoint evaluations by preceptors. The objective of this study was to determine the number of on-time, electronically documented formative midpoint evaluations completed by preceptors during advanced pharmacy practice experiences (APPEs). Compliance rates of on-time, electronically documented formative midpoint evaluations were reviewed by the Office of Experiential Education of a five-member consortium during the two-year study period prior to the adoption of Standards 2016. The Pearson chi-square test and generalized linear models were used to determine whether statistically significant differences were present. Average midpoint compliance rates for the two-year research period were 40.7% and 41%, respectively. No statistical significance was noted when comparing compliance rates for year one versus year two. However, statistical significance was present when comparing compliance rates between schools during year two. Feedback from students and preceptors pointed to the need for brief formal midpoint evaluations that require minimal time to complete, user-friendly experiential management software, and methods for documenting verbal feedback through student self-reflection. Additional education and training for both affiliate and faculty preceptors on the importance of written formative feedback at midpoint is critical to remaining in compliance with Standards 2016. Copyright © 2017 Elsevier Inc. All rights reserved.
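
    A minimal sketch of the school-by-school comparison described above, assuming a Pearson chi-square test on an on-time/late contingency table; all counts are invented.

      # Chi-square test of independence across consortium schools.
      from scipy.stats import chi2_contingency

      # rows = schools, columns = (on-time, late) midpoint evaluation counts
      table = [[120, 180],
               [ 95, 140],
               [150, 160],
               [ 80, 150],
               [110, 155]]
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")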

  18. Basic statistics (the fundamental concepts).

    PubMed

    Lim, Eric

    2014-12-01

    An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers. This is because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively, nor evaluate the validity of published evidence (usually making an assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage have been highlighted. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.

  19. Condensate statistics and thermodynamics of weakly interacting Bose gas: Recursion relation approach

    NASA Astrophysics Data System (ADS)

    Dorfman, K. E.; Kim, M.; Svidzinsky, A. A.

    2011-03-01

    We study condensate statistics and thermodynamics of a weakly interacting Bose gas with a fixed total number N of particles in a cubic box. We find the exact recursion relation for the canonical ensemble partition function. Using this relation, we calculate the distribution function of condensate particles for N=200. We also calculate the distribution function based on a multinomial expansion of the characteristic function. Similar to the ideal gas, both approaches give exact statistical moments for all temperatures in the framework of the Bogoliubov model. We compare them with the results of the unconstrained canonical-ensemble quasiparticle formalism and the hybrid master equation approach. The present recursion relation can be used for any external potential and boundary conditions. We investigate the temperature dependence of the first few statistical moments of condensate fluctuations, as well as thermodynamic potentials and heat capacity, analytically and numerically in the whole temperature range.
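
    The paper's exact recursion is derived for the interacting gas; as a hedged illustration, the standard canonical recursion for an ideal Bose gas, Z(N) = (1/N) * sum_{k=1..N} z(k*beta) * Z(N-k) with z the single-particle partition function, can be coded directly. The energy units and level cutoff below are arbitrary choices for the sketch.

      # Ideal-gas canonical recursion for a Bose gas in a cubic box (sketch).
      import math

      def z1(beta, nmax=30):
          # Single-particle partition function; eps(nx,ny,nz) = nx^2+ny^2+nz^2
          # in hypothetical units, truncated at nmax levels per direction.
          return sum(math.exp(-beta * (nx*nx + ny*ny + nz*nz))
                     for nx in range(1, nmax + 1)
                     for ny in range(1, nmax + 1)
                     for nz in range(1, nmax + 1))

      def canonical_Z(N, beta):
          z = [0.0] + [z1(k * beta) for k in range(1, N + 1)]  # z(k*beta), k = 1..N
          Z = [1.0]                                            # Z(0) = 1
          for n in range(1, N + 1):
              Z.append(sum(z[k] * Z[n - k] for k in range(1, n + 1)) / n)
          return Z

      beta = 0.05
      Z = canonical_Z(200, beta)                 # N = 200, as in the abstract
      print(-math.log(Z[-1]) / beta)             # Helmholtz free energy F = -(1/beta) ln Z_N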

  20. Saturn Radiation (SATRAD) Model

    NASA Technical Reports Server (NTRS)

    Garrett, H. B.; Ratliff, J. M.; Evans, R. W.

    2005-01-01

    The Saturnian radiation belts have not received as much attention as the Jovian radiation belts because they are not nearly as intense; the famous Saturnian particle rings tend to deplete the belts near where their peak would occur. As a result, there has not been a systematic development of engineering models of the Saturnian radiation environment for mission design. A primary exception is that of Divine (1990). That study used published data from several charged-particle experiments aboard the Pioneer 11, Voyager 1, and Voyager 2 spacecraft during their flybys at Saturn to generate numerical models for the electron and proton radiation belts between 2.3 and 13 Saturn radii. The Divine Saturn radiation model described the electron distributions at energies between 0.04 and 10 MeV and the proton distributions at energies between 0.14 and 80 MeV. The model was intended to predict particle intensity, flux, and fluence for the Cassini orbiter. Divine carried out hand calculations using the model but never formally developed a computer program that could be used for general mission analyses. This report seeks to fill that void by formally developing a FORTRAN version of the model that can be used as a computer design tool for missions to Saturn that require estimates of the radiation environment around the planet. The results of that effort and the program listings are presented here, along with comparisons with the original estimates carried out by Divine. In addition, Pioneer and Voyager data were scanned in from the original references and compared with the FORTRAN model's predictions. The results were statistically analyzed in a manner consistent with Divine's approach to provide estimates of the ability of the model to reproduce the original data. Results of a formal review of the model by a panel of experts are also presented. Their recommendations for further tests, analyses, and extensions to the model are discussed.

  1. Work and heat fluctuations in two-state systems: a trajectory thermodynamics formalism

    NASA Astrophysics Data System (ADS)

    Ritort, F.

    2004-10-01

    Two-state models provide phenomenological descriptions of many different systems, ranging from physics to chemistry and biology. We investigate work fluctuations in an ensemble of two-state systems driven out of equilibrium under the action of an external perturbation. We calculate the probability density P_N(W) that work equal to W is exerted upon the system (of size N) along a given non-equilibrium trajectory and introduce a trajectory thermodynamics formalism to quantify work fluctuations in the large-N limit. We then define a trajectory entropy S_N(W), which counts the number of non-equilibrium trajectories with work equal to W, P_N(W) = exp(S_N(W)/k_B T), and characterizes fluctuations of work trajectories around the most probable value W_mp. A trajectory free energy F_N(W) can also be defined, which has a minimum at W = W†, this being the value of the work that has to be efficiently sampled to quantitatively test the Jarzynski equality. Within this formalism a Lagrange multiplier is also introduced, the inverse of which plays the role of a trajectory temperature. Our general solution for P_N(W) exactly satisfies the fluctuation theorem of Crooks and allows us to investigate heat fluctuations for a protocol that is invariant under time reversal. The heat distribution is then characterized by a Gaussian component (describing small and frequent heat-exchange events) and exponential tails (describing the statistics of large deviations and rare events). For the latter, the width of the exponential tails is related to the aforementioned trajectory temperature. Finite-size effects to the large-N theory and the recovery of work distributions for finite N are also discussed. Finally, we pay particular attention to the case of magnetic nanoparticle systems under the action of a magnetic field H, where work and heat fluctuations are predicted to be observable in ramping experiments in micro-SQUIDs.

  2. The impact of conservative discourses in family policies, population politics, and gender rights in Poland and Turkey.

    PubMed

    Korkut, Umut; Eslen-Ziya, Hande

    2011-01-01

    This article uses childcare as a case study to test the impact of ideas that embody a traditional understanding of gender relations in relation to childcare. Conservative ideas, which regard increasing female labor market participation as a cause of decreasing fertility, bear on the functioning of a set of general policies to increase fertility rates. The article looks into the Polish and Turkish contexts for empirical evidence. The Polish context shows a highly institutionalized system of family policies, in contrast to almost unessential institutions in Turkey. Formally, the labor market participation of women is much lower in Turkey than in Poland; yet, given the size of the informal market in Turkey, women's labor participation is obviously higher than what appears in the statistics. Bearing in mind this divergence, the article suggests Poland and Turkey as two typologies for studying population politics in contexts where socially conservative ideas regarding gender remain paramount. We qualify ideas as conservative if they enforce a traditional understanding of gender relations in care-giving and underline women's role in the labor market as an element of declining fertility. In order to delineate ideational impact, this article looks into how ideas (a) supplant and (b) substitute formal institutions. Therefore, we argue that there are two mechanisms pertaining to the dominance of conservative conventions: conservative ideas may either supplant the institutional impact on family policies, or substitute for them thanks to a superior reasoning which societies assign to them. Furthermore, conservative conventions prevail alongside women's customary unpaid work as care-givers regardless of the level of their formal workforce participation. We propose as our major finding for the literature of population politics that ideas, as ubiquitous belief systems, are more powerful than institutions, since they provide what is perceived as legitimate, acceptable, and good for the societies under study. In the end, irrespective of the presence of institutions, socially conservative ideas prevail.

  3. The Statistical Consulting Center for Astronomy (SCCA)

    NASA Technical Reports Server (NTRS)

    Akritas, Michael

    2001-01-01

    The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools, and matched these tools with specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks at meetings, including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.

  4. Expert judgement and uncertainty quantification for climate change

    NASA Astrophysics Data System (ADS)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  5. Thermodynamics of adaptive molecular resolution

    NASA Astrophysics Data System (ADS)

    Delgado-Buscalioni, R.

    2016-11-01

    A relatively general thermodynamic formalism for adaptive molecular resolution (AMR) is presented. The description is based on the approximation of local thermodynamic equilibrium and considers the alchemic parameter λ as the conjugate variable of the potential energy difference between the atomistic and coarse-grained model Φ=U(1)-U(0). The thermodynamic formalism recovers the relations obtained from statistical mechanics of H-AdResS (Español et al., J. Chem. Phys. 142, 064115, 2015 (doi:10.1063/1.4907006)) and provides relations between the free energy compensation and thermodynamic potentials. Inspired by this thermodynamic analogy, several generalizations of AMR are proposed, such as the exploration of new Maxwell relations and how to treat λ and Φ as `real' thermodynamic variables. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  6. A Formal Approach to Domain-Oriented Software Design Environments

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    This paper describes a formal approach to domain-oriented software design environments, based on declarative domain theories, formal specifications, and deductive program synthesis. A declarative domain theory defines the semantics of a domain-oriented specification language and its relationship to implementation-level subroutines. Formal specification development and reuse is made accessible to end-users through an intuitive graphical interface that guides them in creating diagrams denoting formal specifications. The diagrams also serve to document the specifications. Deductive program synthesis ensures that end-user specifications are correctly implemented. AMPHION has been applied to the domain of solar system kinematics through the development of a declarative domain theory, which includes an axiomatization of JPL's SPICELIB subroutine library. Testing over six months with planetary scientists indicates that AMPHION's interactive specification acquisition paradigm enables users to develop, modify, and reuse specifications at least an order of magnitude more rapidly than manual program development. Furthermore, AMPHION synthesizes one to two page programs consisting of calls to SPICELIB subroutines from these specifications in just a few minutes. Test results obtained by metering AMPHION's deductive program synthesis component are examined. AMPHION has been installed at JPL and is currently undergoing further refinement in preparation for distribution to hundreds of SPICELIB users worldwide. Current work to support end-user customization of AMPHION's specification acquisition subsystem is briefly discussed, as well as future work to enable domain-expert creation of new AMPHION applications through development of suitable domain theories.

  7. 40 CFR 75.20 - Initial certification and recertification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... all applicable initial certification tests under paragraph (c) of this section are completed by the... installed, initial certification is required. (1) Notification of initial certification test dates. The...) tests. (4) Certification (or recertification) application formal approval process. The Administrator...

  8. 40 CFR 75.20 - Initial certification and recertification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... all applicable initial certification tests under paragraph (c) of this section are completed by the... installed, initial certification is required. (1) Notification of initial certification test dates. The...) tests. (4) Certification (or recertification) application formal approval process. The Administrator...

  9. 40 CFR 75.20 - Initial certification and recertification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... all applicable initial certification tests under paragraph (c) of this section are completed by the... installed, initial certification is required. (1) Notification of initial certification test dates. The...) tests. (4) Certification (or recertification) application formal approval process. The Administrator...

  10. Don't abandon hope all ye who enter here: The protective role of formal mentoring and learning processes on burnout in correctional officers.

    PubMed

    Farnese, M L; Barbieri, B; Bellò, B; Bartone, P T

    2017-01-01

    Within a Job Demands-Resources Model framework, formal mentoring can be conceived as a job resource expressing the organization's support for new members, which may prevent their being at risk for burnout. This research aims at understanding the protective role of formal mentoring on burnout, through the effect of increasing learning personal resources. Specifically, we hypothesized that formal mentoring enhances newcomers' learning about job and social domains related to the new work context, thus leading to lower burnout. In order to test the hypotheses, a multiple regression analysis using the bootstrapping method was used. Based on a questionnaire administered to 117 correctional officer newcomers who had a formal mentor assigned, our results confirm that formal mentoring exerts a positive influence on newcomers' adjustment, and that this in turn exerts a protective influence against burnout onset by reducing cynicism and interpersonal stress and also enhancing the sense of personal accomplishment. Confirming previous literature's suggestions, supportive mentoring and effective socialization seem to represent job and personal resources that are protective against burnout. This study provides empirical support for this relation in the prison context.

  11. Systems, methods and apparatus for pattern matching in procedure development and verification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  12. Participatory Research or Participation Put-on: Reflections on the Research Phase of an Indonesian Experiment in Non-Formal Education.

    ERIC Educational Resources Information Center

    Colletta, Nat J.

    In the fall of 1974, I was invited to serve as a consultant to the Indonesian effort to develop a National Strategy for Non-Formal Education. The brunt of my effort concerned action research for developing and testing an empirical "Community Learning System" designed to link local learning needs with the management-resource-learning…

  13. “Forward-Thinking” in U.S. Biobanking

    PubMed Central

    Edwards, Teresa P.; Lassiter, Dragana; Davis, Arlene M.; Henderson, Gail E.

    2017-01-01

    Aims: Do biobanks enact policies and plans that allow them to anticipate and respond to potential challenges? If a biobank has one such policy or plan, is it likely to have more? Using survey data from 456 U.S. biobanks, we assess four possible indicators of such “forward-thinking.” Methods: We present response frequencies and cross-tabulations regarding policies for return of results and ownership of specimens, and for having a formal business plan and a plan for what happens to specimens if the biobank closes. We analyze the relationships among these indicators, using chi-square for tests of statistical significance. Results: Policies—Sixty-two percent of biobanks have a policy about returning individual research results; 70% have a policy designating ownership of specimens and/or technology. Having these two policies is significantly related (p < 0.001). Plans—34% of biobanks have a formal business plan; 26% have a written plan for what will happen to the specimens if the biobank closes. Having these two plans is significantly related (p < 0.001). Relationships among indicators—only 7% of biobanks are forward-thinking across all four indicators; 12% are forward-thinking across none. Discussion: The two policies we examined tend to occur together, as do the two plans. These policies and plans seem to tap different aspects of accountability and responsiveness. Specifically, the policies reflect issues most commonly raised in the ethical and legal literature on biobanking, while the plans are indicators of sustainability, a separate area of concern in biobanking. PMID:28118036

  14. Study of the statistical physics bases on superstatistics from the β-fluctuated to the T-fluctuated form

    NASA Astrophysics Data System (ADS)

    Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.

    2018-04-01

    In this paper, we study the T-fluctuated form of superstatistics. In this form, some thermodynamic quantities, such as the Helmholtz energy, the entropy and the internal energy, are expressed in terms of the T-fluctuated form for a canonical ensemble. In addition, the partition functions in the formalism for 2-level and 3-level distributions are derived. Then we make use of the T-fluctuated superstatistics for a quantum harmonic oscillator problem, and the thermal properties of the system for the Bose-Einstein, Maxwell-Boltzmann and Fermi-Dirac statistics are calculated. The effect of the deformation parameter on these properties is examined. All the results recover the well-known results when the deformation parameter is removed.

  15. Bureau of the Census Center for International Research

    NASA Technical Reports Server (NTRS)

    Pinto, Nina Pane

    1994-01-01

    This paper describes the organization and activities of the Center for International Research at the Bureau of the Census. There is a formal publication exchange program with other governments' statistical programs. This has resulted in the Center's collection being one of the world's largest in the area of international census and demographic information. Foreign statistical publications are held in three libraries, one dedicated to the former Soviet Union and one to the People's Republic of China. In addition to the libraries, there are two computerized data bases. The International data base is a source of demographic and socio-economic statistics for all countries of the world. The second data base is the HIV/AIDS Surveillance Data Base, which contains information related to the publication and dissemination of the results of seroprevalence surveys.

  16. Formal Functional Test Designs: Bridging the Gap Between Test Requirements and Test Specifications

    NASA Technical Reports Server (NTRS)

    Hops, Jonathan

    1993-01-01

    This presentation describes the testing life cycle, the purpose of the test design phase, and test design methods and gives an example application. Also included is a description of Test Representation Language (TRL), a summary of the language, and an example of an application of TRL. A sample test requirement and sample test design are included.

  17. P values are only an index to evidence: 20th- vs. 21st-century statistical science.

    PubMed

    Burnham, K P; Anderson, D R

    2014-03-01

    Early statistical methods focused on pre-data probability statements (i.e., data as random variables) such as P values; these are not really inferences, nor are P values evidential. Statistical science clung to these principles throughout much of the 20th century as a wide variety of methods were developed for special cases. Looking back, it is clear that the underlying paradigm (i.e., testing and P values) was weak. As Kuhn (1970) suggests, new paradigms have taken the place of earlier ones: this is a goal of good science. New methods have been developed and older methods extended, and these allow proper measures of strength of evidence and multimodel inference. It is time to move forward with sound theory and practice for the difficult practical problems that lie ahead. Given data, the useful foundation shifts to post-data probability statements such as model probabilities (Akaike weights) or related quantities such as odds ratios and likelihood intervals. These new methods allow formal inference from multiple models in the a priori set. These quantities are properly evidential. The past century was aimed at finding the "best" model and making inferences from it. The goal in the 21st century is to base inference on all the models weighted by their model probabilities (model averaging). Estimates of precision can include model selection uncertainty, leading to variances conditional on the model set. The 21st century will be about the quantification of information, proper measures of evidence, and multimodel inference. Nelder (1999:261) concludes, "The most important task before us in developing statistical science is to demolish the P-value culture, which has taken root to a frightening extent in many areas of both pure and applied science and technology".
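
    The model probabilities (Akaike weights) advocated above follow from a one-line formula, w_i = exp(-d_i/2) / sum_j exp(-d_j/2) with d_i = AIC_i - min AIC; a worked example with invented AIC values:

      # Akaike weights from a set of candidate-model AIC values (invented).
      import numpy as np

      aic = np.array([102.3, 104.1, 107.8, 110.0])
      delta = aic - aic.min()
      weights = np.exp(-delta / 2) / np.exp(-delta / 2).sum()
      print(weights.round(3))  # model probabilities used for model averaging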

  18. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    PubMed

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design and from a randomised block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
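
    As a companion to the randomised block analysis the paper demonstrates in its Appendix 2, here is a hedged sketch with synthetic data (statsmodels stands in for whatever software the author used): treatment effects are estimated while blocking on replicate-in-time.

      # Randomised block ANOVA: treatment + block, synthetic response values.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      effect = {"control": 10.0, "low": 12.0, "high": 15.0}          # hypothetical means
      shift = {"rep1": 0.0, "rep2": 1.0, "rep3": -1.0, "rep4": 0.5}  # hypothetical block shifts
      rows = [(t, b, effect[t] + shift[b] + rng.normal(scale=1.0))
              for t in effect for b in shift]
      df = pd.DataFrame(rows, columns=["treatment", "block", "response"])

      fit = smf.ols("response ~ C(treatment) + C(block)", data=df).fit()
      print(sm.stats.anova_lm(fit, typ=2))  # F test for treatment, adjusted for block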

  19. A statistical approach to investigating enhancement of polonium-210 in the Eastern Irish Sea arising from discharges from a former phosphate processing plant.

    PubMed

    Dewar, Alastair; Camplin, William; Barry, Jon; Kennedy, Paul

    2014-12-01

    Since the cessation of phosphoric acid production (in 1992) and the subsequent closure and decommissioning (2004) of the Rhodia Consumer Specialties Limited plant in Whitehaven, the concentration levels of polonium-210 ((210)Po) in local marine materials have declined towards a level more typical of natural background. However, enhanced concentrations of (210)Po and lead-210 ((210)Pb), due to this historic industrial activity (plant discharges and ingrowth of (210)Po from (210)Pb), have been observed in fish and shellfish samples collected from this area over the last 20 years. The results of this monitoring, and assessments of the dose from these radionuclides to high-rate aquatic food consumers, are published annually in the Radioactivity in Food and the Environment (RIFE) report series. The RIFE assessment uses a simple approach to determine whether and by how much activity is enhanced above the normal background. As a potential tool to improve the assessment of enhanced concentrations of (210)Po in routine dose assessments, a formal statistical test, in which the null hypothesis is that the Whitehaven area is contaminated with (210)Po, was applied to sample data. This modified "green" statistical test has been used in assessments of chemicals by the OSPAR commission. It involves comparison of the reported environmental concentrations of (210)Po in a given aquatic species against its corresponding Background Assessment Concentration (BAC), which is based upon environmental samples collected from regions assumed not to be enhanced by industrial sources of (210)Po, over the period for which regular monitoring data are available (1990-2010). Unlike RIFE, these BAC values take account of the variability of the natural background level. As an example, for 2010 data, crab, lobster, mussels and winkles passed the modified "green" test (i.e. the null hypothesis is rejected) and as such are deemed not to be enhanced. Since the cessation of phosphoric acid production in 1992, the modified "green" test pass rate is ~53% for crustaceans and ~64% for molluscs. Results of dose calculations are made (i) using the RIFE approach and (ii) with the application of the modified "green" test, where samples passing the modified "green" test are assumed to have background levels and hence zero enhancement of (210)Po. Applying the modified "green" test reduces the dose on average by 44% over the period of this study (1990-2010). Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
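
    A hedged sketch of a BAC comparison in the spirit of the modified "green" test: a species is deemed not enhanced if a one-sided test rejects the null hypothesis that its mean (210)Po concentration sits at or above the BAC. The concentrations and BAC below are invented, and the real OSPAR procedure differs in detail.

      # One-sided one-sample t test of mean concentration against a BAC (sketch).
      import numpy as np
      from scipy import stats

      conc = np.array([11.2, 9.8, 10.5, 12.0, 9.1, 10.9, 11.5, 10.2])  # made-up Bq/kg
      bac = 12.5                                                       # made-up BAC

      t, p_two_sided = stats.ttest_1samp(conc, popmean=bac)
      p_one_sided = p_two_sided / 2 if t < 0 else 1 - p_two_sided / 2
      print(f"t = {t:.2f}, one-sided p = {p_one_sided:.4f}")  # small p: pass (background)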

  20. 49 CFR 384.228 - Examiner training and record checks.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the CDL testing program, and with all of the knowledge and skills necessary to serve as a CDL test... knowledge and skills test examiners to successfully complete a formal CDL test examiner training course and examination before certifying them to administer CDL knowledge and skills tests. (c) The training course for...

  1. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
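
    A minimal sketch of the kind of inference a dynamic Bayesian network supports in this setting: forward filtering of a hidden module state (ok/buggy) across discrete test time slices, given noisy pass/fail observations. The transition and observation probabilities are invented.

      # Forward filtering over a two-slice DBN (an HMM) with invented parameters.
      import numpy as np

      T = np.array([[0.95, 0.05],    # P(state_t | state_{t-1}); rows: ok, buggy
                    [0.10, 0.90]])
      E = np.array([[0.90, 0.10],    # P(obs | state); columns: pass, fail
                    [0.30, 0.70]])

      def forward(observations, prior=np.array([0.8, 0.2])):
          belief = prior.copy()
          for obs in observations:          # obs: 0 = test passed, 1 = test failed
              belief = T.T @ belief         # predict across the time slice
              belief *= E[:, obs]           # weight by the observed test outcome
              belief /= belief.sum()        # normalize
          return belief

      print(forward([0, 0, 1, 1, 1]))      # P(ok), P(buggy) after five test runs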

  2. Municipal water consumption forecast accuracy

    NASA Astrophysics Data System (ADS)

    Fullerton, Thomas M.; Molina, Angel L.

    2010-06-01

    Municipal water consumption planning is an active area of research because of infrastructure construction and maintenance costs, supply constraints, and water quality assurance. In spite of that, relatively few water forecast accuracy assessments have been completed to date, although some internal documentation may exist as part of the proprietary "grey literature." This study utilizes a data set of previously published municipal consumption forecasts to partially fill that gap in the empirical water economics literature. Previously published municipal water econometric forecasts for three public utilities are examined for predictive accuracy against two random walk benchmarks commonly used in regional analyses. Descriptive metrics used to quantify forecast accuracy include root-mean-square error and Theil inequality statistics. Formal statistical assessments are completed using four-pronged error differential regression F tests. Similar to studies of other metropolitan econometric forecasts in areas with similar demographic and labor market characteristics, model predictive performances for the municipal water aggregates in this effort are mixed for each of the municipalities included in the sample. Given the competitiveness of the benchmarks, analysts should employ care when utilizing econometric forecasts of municipal water consumption for planning purposes, comparing them to recent historical observations and trends to ensure reliability. Comparative results using data from other markets, including regions facing differing labor and demographic conditions, would also be helpful.
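
    The two descriptive metrics named above are easy to state concretely; a sketch computing RMSE and the Theil U1 inequality coefficient for a forecast against a naive random-walk benchmark (the consumption series is invented):

      # RMSE and Theil U1 for a forecast vs. a random-walk benchmark.
      import numpy as np

      actual   = np.array([102.0, 104.5, 101.3, 108.2, 110.9, 109.4])
      forecast = np.array([101.0, 105.2, 103.0, 107.0, 112.1, 108.0])
      naive = actual[:-1]                  # random-walk benchmark: last period's value

      def rmse(f, a):
          return np.sqrt(np.mean((f - a) ** 2))

      def theil_u1(f, a):
          # U1 variant: RMSE scaled by the root-mean-square sizes of f and a
          return rmse(f, a) / (np.sqrt(np.mean(f ** 2)) + np.sqrt(np.mean(a ** 2)))

      print("model RMSE:", rmse(forecast, actual).round(3))
      print("random-walk RMSE:", rmse(naive, actual[1:]).round(3))
      print("model Theil U1:", theil_u1(forecast, actual).round(4))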

  3. Thermodynamic analogies in economics and finance: instability of markets

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2003-11-01

    Interest in thermodynamic analogies in economics is older than the idea of von Neumann to look for market entropy in liquidity, advice that was not taken in any thermodynamic analogy presented so far in the literature. In this paper, we go further and use a standard strategy from trading theory to pinpoint why thermodynamic analogies necessarily fail to describe financial markets, in spite of the presence of liquidity as the underlying basis for market entropy. Market liquidity of frequently traded assets does play the role of the 'heat bath', as anticipated by von Neumann, but we are able to identify the no-arbitrage condition geometrically as an assumption of translational and rotational invariance rather than (as finance theorists would claim) an equilibrium condition. We then use the empirical market distribution to introduce an asset's entropy and discuss the underlying reason why real financial markets cannot behave thermodynamically: financial markets are unstable, they do not approach statistical equilibrium, nor are there any available topological invariants on which to base a purely formal statistical mechanics. After discussing financial markets, we finally generalize our result by proposing that the idea of Adam Smith's Invisible Hand is a falsifiable proposition: we suggest how to test nonfinancial markets empirically for the stabilizing action of The Invisible Hand.
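
    The step of introducing an asset's entropy from the empirical market distribution can be caricatured numerically: estimate the return distribution from data and compute its Shannon entropy. The sketch below uses synthetic heavy-tailed returns and a simple histogram estimator; both choices are our assumptions, not the authors' construction.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic daily log-returns; real returns are heavier-tailed than
    # Gaussian, so a Student-t draw is used here purely for illustration.
    returns = 0.01 * rng.standard_t(df=3, size=5000)

    def empirical_entropy(x, bins=50):
        """Differential-entropy estimate (nats) from a histogram of the data."""
        counts, edges = np.histogram(x, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]                       # empty bins contribute 0 * log 0 = 0
        width = edges[1] - edges[0]
        return -np.sum(p * np.log(p / width))

    print(f"entropy of the empirical return distribution: "
          f"{empirical_entropy(returns):.3f} nats")
    ```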

  4. Estimating statistical isotropy violation in CMB due to non-circular beam and complex scan in minutes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pant, Nidhi; Das, Santanu; Mitra, Sanjit

    Mild, unavoidable deviations from circular-symmetry of instrumental beams along with scan strategy can give rise to measurable Statistical Isotropy (SI) violation in Cosmic Microwave Background (CMB) experiments. If not accounted for properly, this spurious signal can complicate the extraction of other SI violation signals (if any) in the data. However, estimation of this effect through exact numerical simulation is computationally intensive and time consuming. A generalized analytical formalism not only provides a quick way of estimating this signal, but also gives a detailed understanding connecting the leading beam anisotropy components to a measurable BipoSH characterisation of SI violation. In this paper, we provide an approximate generic analytical method for estimating the SI violation generated due to a non-circular (NC) beam and arbitrary scan strategy, in terms of the Bipolar Spherical Harmonic (BipoSH) spectra. Our analytical method can predict almost all the features introduced by a NC beam in a complex scan and thus reduces the need for extensive numerical simulation worth tens of thousands of CPU hours into minutes-long calculations. As an illustrative example, we use WMAP beams and scanning strategy to demonstrate the usability and efficiency of our method. We test all our analytical results against those from exact numerical simulations.

  5. An Assessment of Teaching and Learning Practices: A Questionnaire Study for Dental Educators of Karnataka

    PubMed Central

    Meenakshi, S.; Raghunath, N.; Shreeshyla, H. S.

    2017-01-01

    Aims and Objectives: Faculty members of dental institutions are being asked to assume new academic duties for which they have received no formal training. To succeed in new teaching tasks, faculty development through assessment of teaching skills is essential. Materials and Methods: A self-assessment questionnaire consisting of 18 closed-ended questions was sent to faculty members of dental colleges of Karnataka. A total of 210 faculty members volunteered to participate in the study; the response rate was 69.8%. Data gathered were statistically analyzed using SPSS software version 16, the Chi-square test, and descriptive statistics. Results: In the present study, 27.3% of participants were unaware of andragogy, 33.3% were unaware of teacher development programs, 44.6% did not obtain student feedback after teaching, 52.6% were unaware of peer review of teaching skills, and 50% were unaware of interprofessional education initiatives. Conclusion: By incorporating teaching and learning skills, dental faculty could acquire competencies and academic credentials to become valuable contributors to the institution. This study emphasizes areas for improvement in the dental school learning environment, based on activation of prior knowledge, elaboration of new learning, learning in context, transfer of learning, and organization of knowledge toward learning. PMID:29285474

  6. ELISPOTs Produced by CD8 and CD4 Cells Follow Log Normal Size Distribution Permitting Objective Counting

    PubMed Central

    Karulin, Alexey Y.; Karacsony, Kinga; Zhang, Wenji; Targoni, Oleg S.; Moldovan, Ioana; Dittrich, Marcus; Sundararaman, Srividya; Lehmann, Paul V.

    2015-01-01

    Each positive well in ELISPOT assays contains spots of variable sizes that can range from tens of micrometers up to a millimeter in diameter. Therefore, when it comes to counting these spots, the decision on setting the lower and the upper spot size thresholds to discriminate between non-specific background noise, spots produced by individual T cells, and spots formed by T cell clusters is critical. If the spot sizes follow a known statistical distribution, precise predictions on the minimal and maximal spot sizes belonging to a given T cell population can be made. We studied the size distributional properties of IFN-γ, IL-2, IL-4, IL-5 and IL-17 spots elicited in ELISPOT assays with PBMC from 172 healthy donors, upon stimulation with 32 individual viral peptides representing defined HLA Class I-restricted epitopes for CD8 cells, and with protein antigens of CMV and EBV activating CD4 cells. A total of 334 CD8 and 80 CD4 positive T cell responses were analyzed. In 99.7% of the test cases, the spot size distributions followed a log-normal function. These data formally demonstrate that it is possible to establish objective, statistically validated parameters for counting T cell ELISPOTs. PMID:25612115
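
    One practical consequence of the log-normal finding is that spot-size counting gates can be set from fitted percentiles rather than by eye. A minimal sketch with scipy follows; the synthetic spot diameters and the 0.5%/99.5% cutoffs are illustrative assumptions, not the paper's protocol.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Synthetic spot diameters in micrometers, drawn log-normally in line
    # with the paper's finding; not actual ELISPOT data.
    diameters = rng.lognormal(mean=np.log(80.0), sigma=0.5, size=400)

    # Fit a log-normal with location fixed at 0 (sizes are strictly positive).
    shape, loc, scale = stats.lognorm.fit(diameters, floc=0)

    # Keep the central 99% of the fitted distribution as the counting window.
    lo, hi = stats.lognorm.ppf([0.005, 0.995], shape, loc=loc, scale=scale)
    counted = diameters[(diameters >= lo) & (diameters <= hi)]

    print(f"fitted sigma = {shape:.3f}, median size = {scale:.1f} um")
    print(f"counting window: {lo:.1f}-{hi:.1f} um; {counted.size} spots kept")
    ```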

  7. It's All Relative: A Validation of Radiation Quality Comparison Metrics

    NASA Technical Reports Server (NTRS)

    Chappell, Lori J.; Milder, Caitlin M.; Elgart, S. Robin; Semones, Edward J.

    2017-01-01

    The difference between high-LET and low-LET radiation is quantified by a measure called relative biological effectiveness (RBE). RBE is defined as the ratio of the dose of a reference radiation to that of a test radiation required to achieve the same effect level, and thus is described as an iso-effect, or dose-to-dose, ratio. A single dose point is not sufficient to calculate an RBE value; therefore, studies with only one dose point usually calculate an effect-to-effect ratio. While not formally used in radiation protection, these iso-dose values may still be informative. Shuryak et al (2017) investigated the use of an iso-dose metric termed the "radiation effects ratio" (RER) and used both RBE and RER to estimate high-LET risks. To apply RBE or RER to risk prediction, the selected metric must be uniquely defined. That is, the calculated value must be consistent within a model given a constant set of constraints and assumptions, regardless of how effects are defined using statistical transformations from raw endpoint data. We first test the RBE and the RER to determine whether they are uniquely defined under transformations applied to raw data. Then, we test whether both metrics can predict heavy ion response data after simulated effect size scaling between human populations or when converting animal to human endpoints.
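
    To make the iso-effect definition concrete, the sketch below computes an RBE from two assumed linear-quadratic dose-response curves by solving alpha*D + beta*D^2 = E for each radiation at a common effect level. The LQ model and the parameter values are our illustrative assumptions, not necessarily the response models examined in the paper.

    ```python
    import math

    def lq_dose_for_effect(effect, alpha, beta):
        """Solve alpha*D + beta*D**2 = effect for the positive dose D (Gy)."""
        if beta == 0.0:
            return effect / alpha
        return (-alpha + math.sqrt(alpha**2 + 4.0 * beta * effect)) / (2.0 * beta)

    # Assumed linear-quadratic parameters (Gy^-1, Gy^-2); purely illustrative.
    alpha_ref,  beta_ref  = 0.20, 0.02   # reference (low-LET) radiation
    alpha_test, beta_test = 0.80, 0.00   # test (high-LET) radiation

    effect = 1.0                         # common iso-effect level, e.g. -ln(SF)
    d_ref  = lq_dose_for_effect(effect, alpha_ref, beta_ref)
    d_test = lq_dose_for_effect(effect, alpha_test, beta_test)
    print(f"RBE = D_ref/D_test = {d_ref:.2f}/{d_test:.2f} = {d_ref / d_test:.2f}")
    ```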

  8. Let's Quantify Our Welding Instruction

    ERIC Educational Resources Information Center

    Shinn, Glen C.

    1973-01-01

    By using the nick-break test, and the guided bend test, a student of welding can individually assess his performance either in a formal class setting, or at the home, farm, or job. Employment of these tests will tend to limit personal bias in evaluation. (KP)

  9. Uniform peanut performance test 2017

    USDA-ARS?s Scientific Manuscript database

    The Uniform Peanut Performance Tests (UPPT) are designed to evaluate the commercial potential of advanced breeding peanut lines not formally released. The tests are performed in ten locations across the peanut production belt. In this study, 2 controls and 14 entries were evaluated at 8 locations....

  10. Academic productivity among fellowship associated adult total joint reconstruction surgeons.

    PubMed

    Khan, Adam Z; Kelley, Benjamin V; Patel, Ankur D; McAllister, David R; Leong, Natalie L

    2017-12-01

    The Hirsch index (h-index) is a measure that evaluates both research volume and quality, taking into consideration both the publications and the citations of a single author. No prior work has evaluated the academic productivity and contributions to the literature of adult total joint replacement surgeons. This study uses the h-index to benchmark the academic impact and identify characteristics associated with the productivity of faculty members at joint replacement fellowships. Adult reconstruction fellowship programs were obtained via the American Association of Hip and Knee Surgeons website. Via the San Francisco match and program-specific websites, program characteristics (Accreditation Council for Graduate Medical Education approval, academic affiliation, region, number of fellows, fellow research requirement), associated faculty members, and faculty-specific characteristics (gender, academic title, formal fellowship training, years in practice) were obtained. The h-index and total faculty publications served as primary outcome measures. Multivariable linear regression determined statistical significance. Sixty-six adult total joint reconstruction fellowship programs were identified: 30% were Accreditation Council for Graduate Medical Education approved and 73% had an academic affiliation. At these institutions, 375 adult reconstruction surgeons were identified; 98.1% were men and 85.3% had formal arthroplasty fellowship training. The average number of publications per faculty member was 50.1 (standard deviation 76.8; range 0-588); the mean h-index was 12.8 (standard deviation 13.8; range 0-67). Number of fellows, faculty academic title, years in practice, and formal fellowship training had a significant (P < .05) positive correlation with both h-index and total publications. The statistical overview presented in this work can help total joint surgeons quantitatively benchmark their academic performance against that of their peers.
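
    For readers unfamiliar with the outcome measure, the h-index is straightforward to compute from an author's citation counts: it is the largest h such that at least h papers have at least h citations each. A minimal sketch with hypothetical counts:

    ```python
    def h_index(citations):
        """Largest h such that at least h papers have >= h citations each."""
        h = 0
        for rank, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical citation counts for one author's papers (not study data).
    print(h_index([25, 8, 5, 3, 3, 1, 0]))  # -> 3
    ```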

  11. Relative mass distributions of neutron-rich thermally fissile nuclei within a statistical model

    NASA Astrophysics Data System (ADS)

    Kumar, Bharat; Kannan, M. T. Senthil; Balasubramaniam, M.; Agrawal, B. K.; Patra, S. K.

    2017-09-01

    We study the binary mass distribution for the recently predicted thermally fissile neutron-rich uranium and thorium nuclei using a statistical model. The level density parameters needed for the study are evaluated from the excitation energies of the temperature-dependent relativistic mean field formalism. The excitation energy and the level density parameter for a given temperature are employed in the convolution integral method to obtain the probability of a particular fragmentation. As representative cases, we present the results for the binary yields of ²⁵⁰U and ²⁵⁴Th. The relative yields are presented for three different temperatures: T = 1, 2, and 3 MeV.
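
    The convolution-integral step can be sketched numerically. With Fermi-gas level densities rho(E) ~ exp(2*sqrt(a*E)), the weight of a binary split is proportional to the convolution of the two fragments' level densities over the shared excitation energy. The parameter values and energies below are placeholders, not the paper's relativistic-mean-field inputs.

    ```python
    import numpy as np

    def level_density(E, a):
        """Unnormalized Fermi-gas level density: rho(E) ~ exp(2*sqrt(a*E))."""
        return np.exp(2.0 * np.sqrt(np.maximum(a * E, 0.0)))

    def split_weight(E_total, a1, a2, n=2000):
        """Convolution integral over the energy shared by the two fragments."""
        E1 = np.linspace(0.0, E_total, n)
        return np.trapz(level_density(E1, a1) * level_density(E_total - E1, a2), E1)

    # Placeholder level-density parameters (MeV^-1) and excitation energy (MeV).
    E_total = 40.0
    w_a = split_weight(E_total, a1=12.0, a2=12.0)   # near-symmetric split
    w_b = split_weight(E_total, a1=16.0, a2=8.0)    # asymmetric split

    total = w_a + w_b
    print(f"relative yield, split A: {w_a / total:.3f}")
    print(f"relative yield, split B: {w_b / total:.3f}")
    ```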

  12. [Is there life beyond SPSS? Discover R].

    PubMed

    Elosua Oliden, Paula

    2009-11-01

    R is a GNU statistical and programming environment with very high graphical capabilities. It is very powerful for research purposes, but it is also an exceptional tool for teaching. R comprises more than 1400 packages that allow it to be used both for simple statistics and for applying the most complex and most recent formal models. Graphical interfaces such as the Rcommander package permit working in user-friendly environments similar to the graphical environment of SPSS. This last characteristic allows non-statisticians to overcome the obstacle of accessibility, and it makes R an excellent tool for teaching. Is there anything better? Open, free, affordable, accessible and always on the cutting edge.

  13. An investigation of the effects of interventions on problem-solving strategies and abilities

    NASA Astrophysics Data System (ADS)

    Cox, Charles Terrence, Jr.

    Problem-solving has been described as the "heart" of the chemistry classroom, and students' development of problem-solving skills is essential for their success in chemistry. Despite the importance of problem-solving, there has been little research within the chemistry domain, largely because of the lack of tools to collect data for large populations. Problem-solving was assessed using a software package known as IMMEX (Interactive Multimedia Exercises), which has an HTML tracking feature that allows problem-solving data to be collected in the background as students work the problems. The primary goal of this research was to develop methods (known as interventions) that could promote improvements in students' problem-solving and, most notably, aid in their transition from the novice to the competent level. Three intervention techniques were incorporated within the chemistry curricula: collaborative grouping (face-to-face and distance), concept mapping, and peer-led team learning. The face-to-face collaborative grouping intervention was designed to probe the factors affecting the quality of the group interaction. Students' logical reasoning abilities were measured using the Group Assessment of Logical Thinking (GALT) test, which classifies students as formal, transitional, or concrete. These classifications essentially provide a basis for identifying scientific aptitude, and they were used as the basis for forming collaborative groups of two students. The six possible pairings (formal-formal, formal-transitional, etc.) were formed to determine how group composition influences the gains in student abilities observed from collaborative grouping interventions. Students were given three assignments (an individual pre-collaborative, an individual post-collaborative, and a collaborative assignment), each requiring them to work an IMMEX problem set. Performance gains of about 10% were observed for each group, with two exceptions: the transitional students who were paired with concrete students had a 15% gain, and the concrete students paired with other concrete students had only a marginal gain. In fact, there was no statistical difference between the pre-collaborative and post-collaborative student abilities for concrete-concrete groups. The distance collaborative intervention was completed using a new interface for the IMMEX software designed to mimic face-to-face collaboration. A stereochemistry problem set, which had a solved rate of 28% prior to collaboration, was chosen for incorporation into this distance collaboration study. (Abstract shortened by UMI.)

  14. Studies in Non-Equilibrium Statistical Mechanics.

    DTIC Science & Technology

    1982-09-01

    ...in the formalism, and this is used to simulate the effects of rotational states and collisions. At each stochastic step the energy changes in the... uses of this method. 10. A Scaling Theoretical Analysis of Vibrational Relaxation Experiments: Rotational Effects and Long-Range Collisions ...include rotational effects through the rotational energy gaps and the rotational distributions. The variables in this theory are a fundamental set...

  15. Labor Force Participation in Formal Work-Related Education in 2000-01. Statistical Analysis Report. NCES 2005-048

    ERIC Educational Resources Information Center

    Hudson, Lisa; Bhandari, Rajika; Peter, Katharin; Bills, David B.

    2005-01-01

    Of the many purposes education serves in society, one of the most important is to prepare people for work. In today's economy, education is important not just to help adults enter the labor market, but also to ensure that adults remain marketable throughout their working lives. This report examines how adults in the labor force use formal…

  16. Not so Fast My Friend: The Rush to R and the Need for Rigorous Evaluation of Data Analysis and Software in Education

    ERIC Educational Resources Information Center

    Harwell, Michael

    2014-01-01

    Commercial data analysis software has been a fixture of quantitative analyses in education for more than three decades. Despite its apparent widespread use there is no formal evidence cataloging what software is used in educational research and educational statistics classes, by whom and for what purpose, and whether some programs should be…

  17. Data-driven non-Markovian closure models

    NASA Astrophysics Data System (ADS)

    Kondrashov, Dmitri; Chekroun, Mickaël D.; Ghil, Michael

    2015-03-01

    This paper has two interrelated foci: (i) obtaining stable and efficient data-driven closure models by using a multivariate time series of partial observations from a large-dimensional system; and (ii) comparing these closure models with the optimal closures predicted by the Mori-Zwanzig (MZ) formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a generalization and a time-continuous limit of existing multilevel, regression-based approaches to closure in a data-driven setting; these approaches include empirical model reduction (EMR), as well as more recent multi-layer modeling. It is shown that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the MZ formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are derived on the structure of the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a broad class of MSM applications, a class that includes non-polynomial predictors and nonlinearities that do not necessarily preserve quadratic energy invariants. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. It is shown that the resulting closure model with energy-conserving nonlinearities efficiently captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The challenges here include the rarity of strange attractors in the model's parameter space and the existence of multiple attractor basins with fractal boundaries. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up.
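
    The multilevel regression recursion at the heart of EMR-MSM can be caricatured in a few lines: regress the observed tendency on the state, then at each further level regress the previous residual's tendency on the state augmented with all residuals found so far, until the last residual is close to white noise. The sketch below is a bare-bones, discrete-time, scalar reading of that recursion on synthetic data; it omits the stochastic integration, the correlation-based stopping criterion, and the energy-conserving structure discussed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic scalar state x and its observed tendency dx (illustrative data).
    n = 500
    x = 0.1 * np.cumsum(rng.standard_normal(n))
    dx = np.diff(x)

    def fit_levels(x, dx, n_levels=3):
        """EMR-like recursion: level 0 regresses dx on x; each further level
        regresses the previous residual's tendency on the augmented state."""
        preds = x[:-1].reshape(-1, 1)
        target = dx
        residual = target
        for _ in range(n_levels):
            coef, *_ = np.linalg.lstsq(preds, target, rcond=None)
            residual = target - preds @ coef
            # Augment predictors with this residual; next target is its tendency.
            preds = np.hstack([preds[:-1], residual[:-1].reshape(-1, 1)])
            target = np.diff(residual)
        return residual

    print(f"final-level residual std: {fit_levels(x, dx).std():.4f}")
    ```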

  18. The effect of non-invasive positive pressure ventilation (NIPPV) on cognitive function in amyotrophic lateral sclerosis (ALS): a prospective study

    PubMed Central

    Newsom-Davis, I; Lyall, R; Leigh, P; Moxham, J; Goldstein, L

    2001-01-01

    OBJECTIVES—Neuropsychological investigations have shown a degree of cognitive dysfunction in a proportion of non-demented patients with ALS. Respiratory muscle weakness in ALS can lead to nocturnal hypoventilation, resulting in sleep disturbance and daytime somnolence. Sleep deprivation of this type may cause impairments in cognitive function, but this has not been formally evaluated in ALS.
METHODS—Cognitive functioning was evaluated in nine patients with ALS with sleep disturbance caused by nocturnal hypoventilation (NIPPV group), and in a comparison group of 10 similar patients without ventilation problems (control group). The NIPPV group then started non-invasive positive pressure ventilation (NIPPV) at night. After about 6 weeks, change in cognitive function was evaluated.
RESULTS—Statistically significant improvement in scores on two of the seven cognitive tests was demonstrated in the NIPPV group postventilation, and a trend towards significant improvement was found for two further tests. Scores in the control group did not improve significantly for these four tests, although an improvement was found on one other test.
CONCLUSIONS—Nocturnal hypoventilation and sleep disturbance may cause cognitive dysfunction in ALS. These deficits may be partially improved by NIPPV over a 6 week period. This has important implications for investigations of both cognitive dysfunction in non-demented patients with ALS, and the effect of ventilation on quality of life.

 PMID:11561031

  19. Products of composite operators in the exact renormalization group formalism

    NASA Astrophysics Data System (ADS)

    Pagani, C.; Sonoda, H.

    2018-02-01

    We discuss a general method of constructing the products of composite operators using the exact renormalization group formalism. Considering mainly the Wilson action at a generic fixed point of the renormalization group, we give an argument for the validity of short-distance expansions of operator products. We show how to compute the expansion coefficients by solving differential equations, and test our method with some simple examples.
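
    For readers outside the exact-renormalization-group literature, the short-distance expansion whose validity the paper argues for has, schematically, the standard operator-product-expansion form below; the precise ERG statement involves the Wilson action and cutoff-dependent coefficient functions, which this generic display suppresses.

    ```latex
    % Schematic short-distance (operator product) expansion near a fixed point:
    % the product of two composite operators is expanded in local operators with
    % c-number coefficients that are singular as x -> y.
    \[
      \mathcal{O}_i(x)\,\mathcal{O}_j(y) \;\sim\; \sum_k C_{ij}{}^{k}(x-y)\,\mathcal{O}_k(y),
      \qquad x \to y .
    \]
    ```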

  20. Utilisation of formal and informal care and services at home among persons with dementia: a cross-sectional study.

    PubMed

    Bökberg, Christina; Ahlström, Gerd; Karlsson, Staffan

    2017-09-04

    The progression of dementia implies increasing needs for both informal and formal care and services, but also a risk of institutionalisation. To better adjust care and services in the phase preceding institutionalisation, it is important to find out whether utilisation of formal and informal care and services is determined by increased needs and by who meets those needs. The aim was to compare persons with dementia (65+) with different levels of cognitive impairment regarding utilisation of formal and informal care and services at home. The participants consisted of 177 persons with dementia ≥65 years old and at risk of nursing home admission, divided into groups according to their cognitive function. Structured interviews were conducted based on questionnaires about the type and amount of formal and informal care utilised, as well as questions regarding cognitive impairment, dependency in activities of daily living (ADLs) and neuropsychiatric symptoms. Descriptive and comparative statistics were used to analyse the data. The findings revealed that the group with severe dementia used significantly more help with ADLs and supervision, in terms of time (number of hours and days) provided by the informal caregiver, than the group with moderate dementia. Utilisation of formal care and services was highest in the group with the most severe cognitive impairments (Standardized Mini-Mental State Examination score of <9). The group with severe dementia were more dependent in ADLs, had more neuropsychiatric symptoms (hallucinations and motor disturbances), were younger, and more often cohabited with the informal caregiver than the group with moderate dementia. This study shows that in the phase preceding institutionalisation, the ADL and supervision needs arising from the progression of dementia appear to be met first and foremost by the informal caregivers. © 2017 Nordic College of Caring Science.
